Could artificial intelligence replace a human emotionally?
Beyond ChatGPT, there are services like Character AI, Chai, etc., where you can interact with a movie character, for example, in the form of a relationship or even as a therapist. There are also AIs programmed with the knowledge of a chemistry, physics, or biology teacher. And ChatGPT now has a feature that lets you chat freely with the AI by voice.
Could AI provide an emotional replacement for people who are lonely or unable to meet new people?
A very exciting question! First, it must be noted that today’s AI systems – ChatGPT, Claude, models such as LLaMA 3, or services such as Character AI – cannot understand human emotions. Nor do they have feelings of their own. What these systems do is generate plausible answers based on statistical prediction.
Nevertheless, artificial intelligence can support a person emotionally. Studies have examined exactly this and concluded, for example, that patients perceived a chatbot such as ChatGPT as more empathetic – that is, more sensitive – than a real doctor.
Here you can find information: https://www.theregister.com/2024/01/16/google_ai_chatbot_heathcare/
And here: https://www.nature.com/articles/d41586-024-00099-4
In the early days of ChatGPT, various surveys were also conducted. Many young people said they found it freeing and helpful to write openly with the chatbot about their problems and to receive validation and feedback. It is not always about getting the best possible advice, but simply about talking through your own conflicts.
But does this replace emotional support from a human? I don’t think so. It is one thing to simulate empathy based on internal statistics, and quite another to show real empathy. In addition, physical contact can be very important in stressful, extreme, and painful situations – a hug from a good friend, from parents, and so on. It lowers blood pressure, releases messenger substances that relieve stress and anxiety, and supports emotional healing.
Here is a study: https://www.sciencedaily.com/releases/2024/04/240408130610.htm
And AIs, at least, are not really good at hugs.
Great answer!
I don’t think so, because the feelings are programmed anyway and the robot or AI shows emotions in a rather forced way.
AI is already doing this in retirement homes. People develop an emotional bond with the machine, which has a positive effect on their health. It is used to help prevent loneliness in old age.
https://www.robicare.de/
What artificial intelligence cannot do, however, is relieve humanity of its social responsibilities. So there will still be social behaviour in the future.
Yeah. Or chat partners on the internet. So why not an AI?
Alex
Because don’t people have more feelings than a stone?
Theoretically, yes.
But that would not help most people, because they know it is not real empathy.
Of course not
Why not? Machines have already taken over many professions.
I would like to give an answer here from my point of view – not as an AI expert, but as a psychologist.
I would like to begin by pointing out that people do not need communication at all to form an emotional relationship with something. We all have emotional bonds to places, objects, even habits, and these bonds can be at least as intense as a relationship with a person.
There are lonely people with limitations who cling to their apartment, their everyday clutter, their daily walk, and who often cannot let go of these things or change them.
In Japan, there are already young people who withdraw completely into their apartments and talk only with machines. The development of AI could accelerate this trend, so that people want to talk less and less with other people and lose those important human contacts.
At the same time, technological development could also mean that such cases of withdrawal are recognized earlier. If the AI is trained or programmed for this, it could not only recommend seeking help when necessary, but even actively set that process in motion.
More and more professionals will also have the opportunity to communicate with those affected via chat and video calls in order to offer real human support.
Because there is something else AI cannot do: form a personal impression. It can only answer the questions it is asked.
So if someone asks, “I’m always so worn out, what should I do?”, the AI might simply say, “Drink less coffee and take a walk.” But it might never come up with the idea: “You live in a completely cluttered apartment – it needs a thorough clean-out, otherwise you will never feel better!”
I therefore see the danger of emotional dependence on an AI – but at the same time I hope that technological progress can recognize cases of isolation and other negative developments and counteract them.