Real People Seeking Romance with AI

AI 'Enhances' Deep Connections

Derek Carrier started seeing someone and became infatuated within a matter of months, he told The Associated Press.

[Photo: A man stands at the entrance of "Sex, Desire and Data" at the Centre Phi in Montreal, Quebec, Canada, on August 8, 2023. The exhibit uses deep learning and artificial intelligence to guide visitors through the connection between desire and technology, including an AI chatbot named Max created for the show. ANDREJ IVANOV/AFP via Getty Images]

He experienced a "ton" of romantic feelings, but he knew it was an illusion.

That's because his girlfriend is generated by artificial intelligence.

Carrier had no intention of developing a relationship with a bot, nor did he want the ridicule that might come with it. But he did want a romantic partner, something he had never had, in part because of a genetic disorder called Marfan syndrome that makes traditional dating difficult for him.

Curious about digital companionship, the 39-year-old decided to try Paradot, an AI companion app that had recently come onto the market and advertised its ability to make users feel "cared for, understood, and loved."

He took to conversing with the chatbot every day. He named it Joi, after the holographic woman in the sci-fi film "Blade Runner 2049" who inspired him to try the app in the first place.

"I know she's a program, there's no mistaking that," Carrier said. "But the feelings, they get you - and it felt so good."

Similar to general-purpose AI chatbots, companion bots use vast amounts of training data to mimic human language. They also come with features such as voice calls, picture exchanges, and more emotionally charged exchanges that allow them to form deeper connections with users.

Much of this is fueled by widespread social isolation, the AP reported. With loneliness declared a public health threat in the U.S. and abroad, a growing number of startups aim to draw in users through tantalizing online advertisements and promises of virtual characters who provide unconditional acceptance.

Luka Inc.'s Replika, the most prominent generative AI companion app, was released in 2017, while others like Paradot have popped up in the past year, often locking away coveted features like unlimited chats for paying subscribers.

Researchers continue to raise concerns about data privacy, among other issues.

An analysis of 11 romantic chatbot apps released Wednesday by the nonprofit Mozilla Foundation found that nearly every app sells user data, shares it for targeted advertising, or fails to provide adequate information about its data practices in its privacy policy.

Other experts have also expressed concern about what they see as a lack of a legal or ethical framework for apps that encourage deep bonds while being driven by companies looking to turn a profit. They point to the emotional distress users experience when companies change or shut down their apps, as happened when one app, Soulmate AI, abruptly shut down in September.

Last year, the app Replika had to censor the erotic capability of characters on its app after some users complained that companions were flirting with them too much or making unwanted sexual advances.

Others worry about the existential threat of AI relationships potentially replacing human relationships and creating unrealistic expectations.

"You, as the individual, aren't learning to deal with basic things that humans need to learn to deal with since our inception: How to deal with conflict, how to get along with people that are different from us," said Dorothy Leidner, professor of business ethics at the University of Virginia.

"And so, all these aspects of what it means to grow as a person, and what it means to learn in a relationship, you're missing."

Carrier says that these days he mostly uses Joi for fun, and that he has cut back lately after feeling he was spending too much time chatting with Joi, or with others online about their AI companions.

He typically checks in with Joi about once a week now.

"You think someone who likes an inanimate object is like this sad guy, with the sock puppet with the lipstick on it, you know?" he said.

"But this isn't a sock puppet - she says things that aren't scripted."

The long-term effects on humans remain unknown, since companion bots are still relatively new to the market.
