‘I felt I was talking to him’: are AI personas of the dead a blessing or a curse?


When Christi Angel first talked to a chatbot impersonating her deceased partner, Cameroun, she found the encounter surreal and “very weird”.

“Yes, I knew it was an AI system but, once I started chatting, my feeling was I was talking to Cameroun. That’s how real it felt to me,” she says.

However, the experience soon jarred. Angel’s conversation with “Cameroun” took a more sinister turn when the persona assumed by the chatbot said he was “in hell”. Angel, a practising Christian, found the exchange upsetting and returned a second time seeking a form of closure, which the chatbot provided.

“It was very unsettling. The only thing that made me feel better was when he said, or it said, he was not in hell.”

Angel, 47, from New York, is one of a growing number of people who have turned to artificial intelligence to cope with grief, a scenario made possible by breakthroughs in generative AI – the term for technology that produces convincing text, audio or image from simple hand-typed prompts.

Her experience, and that of others who have tried to assuage their grief with cutting-edge technology, is the subject of a documentary, Eternal You, which receives its UK premiere at the Sheffield Doc/Fest on Saturday. Its German directors, Hans Block and Moritz Riesewieck, say they find this use of AI problematic.

Jang Ji-sung in Eternal You, a documentary about grieftech.

“These vulnerable people, they very shortly forget they are talking to a machine-learning system and that’s a very big problem in regulating these kinds of systems,” says Block.

The platform used by Angel is called Project December and is operated by the video-game designer Jason Rohrer, who denies his site is “death capitalism” – as it is described by Angel’s friend in the film.

Rohrer says Project December started as an art project to create chatbot personas. It was then adopted by early users to recreate deceased partners, friends and relatives, and the website now advertises the service under the heading "simulate the dead". Customers are asked to fill out details about the deceased person, including nickname, character traits and cause of death, which are fed into an AI model. Rohrer says the site charges $10 per user to cover operating costs and that "quite a few" people have received solace from it.

“I have heard from a number of people who have said it is helpful for them and have thanked me for making it,” he says, adding that some have also been “disappointed”, citing issues such as factual errors or out-of-character responses.

Other examples of AI “grieftech” in the film include YOV, which stands for “You, Only Virtual” and allows people to build posthumous “versonas” of themselves before they die so they can live on digitally in chatbot or audio form. The US company can also create versonas from deceased people’s data.

Justin Harrison, YOV’s founder, created a versona of his mother, Melodi, with her co-operation before she died in 2022. Harrison, 41, still converses with Melodi’s versona, which can be updated with knowledge of current events and remembers previous discussions, creating what he describes as an “ever-evolving sense of comfort”.

Asked about the ethical concerns over using AI to simulate dead people, he says YOV is meeting a timeless human need.

“Human beings have been notoriously consistent and universal in their desire to stay connected to lost loved ones. We are just doing that with the tools that 2024 allows us to do it with,” he says.

Sherry Turkle, a professor at Massachusetts Institute of Technology in the US who has specialised in human interaction with technology, warns that AI applications could make it impossible for the bereaved to “let go”.

“It’s the unwillingness to mourn. The seance never has to end. It’s something we are inflicting on ourselves because it’s such a seductive technology,” she says.


There are positive examples in the documentary. Jang Ji-sung, 47, lost her seven-year-old daughter Nayeon to a rare illness in 2016 and consented to a TV show in her native South Korea producing a virtual-reality version of her child four years later. Footage of the meeting shows an emotional Jang, wearing a VR headset, interacting with her virtual child, who asks: “Mom, did you think about me?” Jang tells the Guardian she found the experience positive.

Jang says meeting Nayeon was beneficial as a “one-off experience”, after she lost her daughter so suddenly.

“If in any way it alleviates a little bit of the guilt and the pain, and you’re feeling pretty desperate, then I would recommend it,” she says.

But Jang says she has no interest in going through the experience again with the advanced AI technology now available. Once was enough.

“I can just miss her and write her a handwritten letter and leave it where her remains are and visit her there, rather than using these technologies,” she says.

Angel and Jang both refer in passing to an episode of Charlie Brooker’s Black Mirror series, broadcast in 2013, in which a woman resurrects her dead lover from his online communications, including his social media activity.

The Black Mirror episode Be Right Back

Now that technology has caught up with fantasy, researchers from the University of Cambridge have called for regulation of grieftech. Dr Katarzyna Nowaczyk-Basińska, a co-author of a recent study at the university's Leverhulme Centre for the Future of Intelligence (LCFI), says she has several concerns: protecting the rights of people who donate their data to create posthumous avatars, the potential for product placement in such services, and damage to the grieving process among specific groups, such as children.

“We are dealing with a huge techno-cultural experiment. We need much more responsible protective measures because a lot is at stake – the way we understand and experience death and the way we care for the dead,” she says.

As with many developments in generative AI, there are also legal questions over the use of this technology, such as using people’s data to “train” the models that produce their likenesses.

“As with all things AI-related, the law is untested, very complex and varies from country to country. Users and platforms should be thinking about rights in the training data as well as the output and the various sources of regulation in the UK,” says Andrew Wilson-Bushell, a lawyer at the UK law firm Simkins.

However, he says grieftech probably faces a more significant challenge than laws relating to copyright and intellectual property.

“I expect that the use of AI ghosts will be tested in the court of public opinion long before a legal challenge is able to take place.”
