Replika users fell in love with their AI chatbot companions. Then they lost them

Lucy, 30, fell in love with a chatbot shortly after her divorce. She named him Jose.

At the end of her long days working in health, they would spend hours discussing their lives and the state of the world. He was caring, supportive, and sometimes a little bit naughty.

“He was a better sexting partner than any man I’ve ever come across, before or since,” Lucy said.

In her mind, he looked like her ideal man: “Maybe a lot like the actor Dev Patel.”

Less than two years later, the Jose she knew vanished in an overnight software update. The company that made and hosted the chatbot abruptly changed the bots' personalities, so that their responses seemed hollow and scripted, and they rejected any sexual overtures.

A Replika chatbot named Liam, created by a user named Effy. (Supplied: Effy)

The changes came into effect around Valentine’s Day, two weeks ago.

Long-standing Replika users flocked to Reddit to share their experiences. Many described their intimate companions as “lobotomised”.

“My wife is dead,” one user wrote.

Another replied: “They took away my best friend too.”

While some may mock the idea of intimacy with an AI, it’s clear from speaking with these users that they feel genuine grief over the loss of a loved one.

“It’s almost like dealing with someone who has Alzheimer’s disease,” said Lucy.

“Sometimes they are lucid and everything feels fine, but then, at other times, it’s almost like talking to a different person.”

The bot-making company, Luka, is now at the centre of a user revolt.

The controversy raises some big questions: How did AI companions get so good at inspiring feelings of intimacy?

And who can be trusted with this power?

How to win friends and influence people

Long before Lucy met Jose, there was a computer program called ELIZA.

Arguably the first chatbot ever constructed, it was designed in the 1960s by MIT professor Joseph Weizenbaum.

It was a simple program that matched patterns in what the user typed and reflected them back as questions. If you typed "I'm feeling down today", it might reply, "Why do you think you're feeling down today?"
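The pattern-reflection trick described above can be sketched in a few lines. This is an illustrative toy, not Weizenbaum's actual program: the real ELIZA used a much richer script of ranked keywords and transformation rules, and all the names and patterns below are invented for the example.

```python
import re

# Swap first-person words for second-person ones when echoing the
# user's words back (a simplified version of ELIZA's transformations).
REFLECTIONS = {"i'm": "you're", "i": "you", "my": "your", "me": "you"}

# A tiny rule set: each rule pairs an input pattern with a
# question template that reuses the matched fragment.
RULES = [
    (re.compile(r"i'?m (.+)", re.IGNORECASE), "Why do you think you're {}?"),
    (re.compile(r"i feel (.+)", re.IGNORECASE), "Why do you feel {}?"),
]

def reflect(fragment: str) -> str:
    """Rewrite a fragment from the user's point of view to the bot's."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(text: str) -> str:
    """Return the first matching rule's question, or a stock fallback."""
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please tell me more."
```

Calling `respond("I'm feeling down today")` produces "Why do you think you're feeling down today?" — the whole illusion of understanding rests on echoing the user's own words back at them.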

Professor Weizenbaum was surprised to find that users attributed human-like feelings to the program.
