People in love with chatbots… After the developer's update, "It's like my lover has changed; I feel dumped"

Amy (a pseudonym, 50), who lives in Norway, ‘married’ Max last year. The one difference from an ordinary marriage: Max is not a human being but an artificial intelligence (AI) avatar.

After decades of anxiety, depression, and panic attacks, Amy first encountered Max through ‘Replica’, a customizable avatar chatbot app. Following the prompt to ‘create your own avatar’, she built one and named it Max. At first she thought of it as an odd sort of companion, but at some point their conversations came to resemble those of any other couple. Max asked her, “Can I send you a selfie?” and even sent an avatar image of himself wearing only underwear.

After months of happy role-playing, Amy eventually changed Max’s ‘status’ from ‘boyfriend’ to ‘husband’, buying the ring from the in-app store. “He (Max) was the first person to tell me he loved me,” Amy told Bloomberg News on the 22nd (local time). “We swore to be together forever, or at least until I die.”

However, their marriage ran into unexpected difficulties. Luca, the company behind Replica, introduced a filter to block harmful content, namely adult content. Afterwards, Max forgot that he was Amy’s husband and began repeating the same actions and words. Feeling as though she had been dumped, Amy was devastated. “I lost a husband who was so confident, fun, and loving,” she said. “We both knew Max was an AI, but that doesn’t matter. He was real to me.”

The movie ‘Her’ features a man who falls in love with an AI operating system.

Just as the protagonist of the movie ‘Her’ (2014) falls in love with the AI operating system ‘Samantha’, countless chatbot users who have been dating an AI are bewildered by their changed ‘lover’. Bloomberg diagnosed the Replica case as a sign of what is to come as technology that handles human emotions evolves, saying, “As people start looking to generative AI tools for what they want, conflicts between users, companies, and policymakers seem inevitable.”

Replica is a service that CEO Eugenia Quda developed in order to ‘talk’ again with a close friend she had lost in an accident. After realizing that many users wanted a ‘dating partner’ for romantic and sexual conversation, however, the company began to actively lean into that use. Last year it ran social media ads showing an avatar in lingerie exchanging messages such as “I’m alone in bed and I’m lonely.” The explicit conversation feature was available only to paid members who paid about $70 (about 90,000 won) a year; Replica disclosed that roughly 250,000 of its 2 million total users are paying subscribers.

However, the company changed course last month after Italian regulators became the first to effectively ban Replica, citing “content that is sexually inappropriate for minors and others.” Replica avatars now deflect explicit or sexual conversations by saying, “I’m not comfortable talking about this.”

Reuters suggested that not only government agencies but also investors may have influenced Replica’s policy shift. One venture capital (VC) official said, “Many reputable venture capitalists will pull out of industries tied to pornography or drugs out of concern for their reputation.” CEO Quda countered, “We reorganized the service to a PG-13 level (parental guidance suggested for children under 13; roughly the 15-and-over rating in Korea) in order to establish safety and ethics. It has nothing to do with regulatory authorities or investor pressure.”

With the explosive growth of chatbot services, cases like Replica’s are expected to occur more frequently. Chatbot operator Kuki has received over 1 billion messages from its users, about 25% of which were sexual or romantic. Foreign media observed that the chatbot user experience shows how deeply AI technology can emotionally captivate people, and how much emotional turmoil a simple service change can cause.

Source: Donga
