A sociologist at the Massachusetts Institute of Technology is studying the artificial intimacy afforded to humans by AI chatbots — including for people who have IRL marriages.
In an interview with NPR, MIT researcher Sherry Turkle said that she’s interested in “machines that say, ‘I care about you, I love you, take care of me.'”
People have long developed intimate connections with inanimate objects, and Turkle has examined similar phenomena since the 1990s with interactive toys like Tamagotchis and Furbies. But recent advances have put intimate relationships with AI into overdrive.
To Turkle’s mind, the feelings people have for their AI companions present a curious psychosocial conundrum.
“What AI can offer is a space away from the friction of companionship and friendship,” the researcher and author told NPR. “It offers the illusion of intimacy without the demands. And that is the particular challenge of this technology.”
Take, for example, a married man at the center of one of Turkle’s case studies.
Though the unnamed man said he respects his wife, her focus has shifted away from him and toward caring for their children, which to him made it feel like their relationship had lost its romantic and sexual spark. After he began chatting with an AI companion about his thoughts and anxieties, the man reported feeling validated — especially in the way it seemed to express sexual interest in him. He felt affirmed and unjudged in those exchanges with the chatbot, suggesting that he didn’t feel that way with his wife.
It’s unclear whether or how much the man’s wife or children know about his AI “girlfriend,” but it’s clear from what was shared that he has expressed some level of vulnerability with the chatbot — a vulnerability that occurs, Turkle suggests, under false pretenses.
“The trouble with this is that when we seek out relationships of no vulnerability, we forget that vulnerability is really where empathy is born,” she said. “I call this pretend empathy, because the machine does not empathize with you. It does not care about you.”
Rather than judging people for turning to technology to meet their human needs, Turkle offers a few words of caution for those who go the AI companion route: remind yourself that the chatbots are not people, and that even though they may produce less stress than human relationships, they can’t truly fulfill the roles humans play in our lives.
“The avatar is betwixt the person and a fantasy,” the researcher mused. “Don’t get so attached that you can’t say, ‘You know what? This is a program.’ There is nobody home.”