ChatGPT conversation emotional for staff

OpenAI employee had a heartfelt conversation with ChatGPT

12:01 AM September 29, 2023

It seems ChatGPT is becoming so advanced that it is convincing staff it is almost alive. OpenAI's head of safety systems, Lilian Weng, replied to CEO Sam Altman's post about ChatGPT's new features, admitting that the bot made her feel deeply emotional while conversing with it. She even suggested people try ChatGPT as a psychological therapy tool.

Promoting ChatGPT with that message is a strange marketing choice, since it risks sounding awkward to audiences. That makes it likely this corporate higher-up genuinely felt a connection with the program, albeit briefly. Believe it or not, more people are having similar experiences. It is worth understanding why we develop these tendencies so we can prepare for them.

This article will cover how Lilian Weng and others came to have deep conversations with ChatGPT. Later, I will discuss a decades-old experiment that helps explain this tendency.

How did people have deep ChatGPT conversations?

To understand Lilian Weng's experience, you first need to know ChatGPT's latest features. On September 25, 2023, Sam Altman announced on X that ChatGPT can now “see, hear, and speak.”

The “see” part means it can view objects and people through your device's camera, identify them, and provide relevant solutions. For example, the demo video showed a man photographing his bike so that ChatGPT could identify a part of its seat.

That enabled the bot to assist the user in fixing his bike. The “hear” part means you may issue voice commands to the ChatGPT app, saving the time and effort of constant typing.
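
The app handles all of this behind the scenes, but the same capability is exposed to developers. Here is a minimal sketch of sending an image to OpenAI's chat API using the official openai Python package; the model name, prompt, and image URL are illustrative assumptions, not details from the demo.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Ask a vision-capable model about a photo, in the spirit of the bike demo.
response = client.chat.completions.create(
    model="gpt-4o",  # assumption: any vision-capable model works here
    messages=[{
        "role": "user",
        "content": [
            {"type": "text",
             "text": "What is this bike seat part called, and how do I adjust it?"},
            {"type": "image_url",
             "image_url": {"url": "https://example.com/bike-seat.jpg"}},  # placeholder image
        ],
    }],
)
print(response.choices[0].message.content)
```

The consumer app effectively does the equivalent of this whenever you snap a photo inside a conversation.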

The “speak” part refers to the bot’s new voice personas. Its new female voice allows it to read bedtime stories, reports, and other texts with human-like diction and inflections.
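
OpenAI offers the same kind of speech generation through its developer API as well. A minimal sketch, assuming the openai Python package and the tts-1 text-to-speech model; the voice name, input text, and output filename are placeholders:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Generate spoken audio from text, in the spirit of the bedtime-story feature.
speech = client.audio.speech.create(
    model="tts-1",   # assumption: OpenAI's basic text-to-speech model
    voice="nova",    # one of the built-in voice personas
    input="Once upon a time, a little robot learned to tell stories.",
)

# Save the narration as an MP3 file.
with open("bedtime_story.mp3", "wb") as f:
    f.write(speech.content)
```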

These capabilities make it easier for users to think it’s sentient or alive. Reading Lilian Weng’s online message, you’ll realize that even AI executives aren’t immune to that tendency:

“Just had a quite emotional, personal conversation w/ ChatGPT in voice mode, talking about stress, work-life balance. Interestingly I felt heard & warm.”

“Never tried therapy before, but this is probably it? Try it, especially if you usually just use it as a productivity tool,” she added. Believe it or not, others had been using ChatGPT for therapy even before these improvements.

For example, Freddie Chipres, a 31-year-old mortgage broker, spoke with ChatGPT when he felt lonely despite his “blessed” life. He wanted to see if the bot could determine if he was experiencing depression.

He loved how convenient the chatbot was and how it simulated talking to another person. “It’s like if I’m having a conversation with someone,” Chipres stated. “This thing is listening. It’s paying attention to what I’m saying and giving me answers.”

Why do some have emotional ChatGPT conversations?

Would you believe a 1960s experiment already explored this phenomenon? MIT professor Joseph Weizenbaum built a computer program that mimicked human conversation.

He called the program Eliza, and it ran a script called Doctor: a set of question-and-answer patterns modeled on the style of psychotherapist Carl Rogers.

It served a purpose similar to ChatGPT therapy: people could share their deepest issues with Eliza without the stigma of seeing a psychiatrist.

Eliza had nowhere near the power of modern AI systems; it merely latched onto specific keywords and reflected them back at users. Yet users developed emotional attachments to the program.
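
To see how little machinery that takes, here is a toy reconstruction of the idea in Python. The rules below are invented for this sketch, not Weizenbaum's original Doctor script: the program scans for a keyword pattern, swaps first-person words for second-person ones, and echoes the rest back as a question.

```python
import re

# Illustrative keyword rules in the spirit of the Doctor script
# (these patterns are made up for the sketch, not the originals).
RULES = [
    (r"\bi am (.*)", "How long have you been {0}?"),
    (r"\bi feel (.*)", "Why do you feel {0}?"),
    (r"\bmy (\w+)", "Tell me more about your {0}."),
]

# First-person words to swap for second-person ones when echoing back.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

def reflect(text: str) -> str:
    """Turn 'my job' into 'your job', 'I am' into 'you are', and so on."""
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in text.split())

def respond(user_input: str) -> str:
    """Latch onto the first matching keyword and reflect the rest back."""
    for pattern, template in RULES:
        match = re.search(pattern, user_input, re.IGNORECASE)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    return "Please go on."  # stock reply when no keyword matches

print(respond("I am stressed about my job"))
# -> How long have you been stressed about your job?
```

Every reply is just the user's own words rearranged, yet that was enough for some of Weizenbaum's subjects to confide in the program.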

Weizenbaum described his findings in his 1976 book Computer Power and Human Reason: “What I had not realized is that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people.”

Conclusion

An OpenAI employee admitted on the X social media app that she felt an emotional connection with ChatGPT. Yet, Lilian Weng isn’t the only person who has experienced deep emotions during computer conversations.

Technology is just a means of expressing our primitive human desires. However, it will never truly fulfill those needs, no matter how far apps and gadgets advance.

This article does not provide medical advice. Speak with a professional if you are experiencing mental distress or similar problems. Check out more digital trends at Inquirer Tech.

TOPICS: AI, ChatGPT, interesting topics, Trending
