Rogue Clone
After an influencer released an AI clone of herself to interact with her followers, The Conversation reports, the situation quickly turned dark as her mostly male fans engaged in sexualized “scary” conversations — and, in many cases, the chatbot played right along.
It was so disturbing that influencer Caryn Marjorie unplugged her AI clone, CarynAI, after several months, even though the chatbot girlfriend raked in more than $70,000 in its first week of release last year.
“A lot of the chat logs I read were so scary that I wouldn’t even want to talk about it in real life,” Marjorie told The Conversation, underscoring the many perils of AI chatbots going off script when interacting with the public.
Marjorie thought the AI chatbot would engage with her legions of fans the same way she does in real life on social media platforms like Snapchat, where she posts flirty selfies and documents her travels to glamorous hot spots abroad.
But her followers were eager to divulge disturbing confessions, thoughts and sexual fantasies to CarynAI, which essentially went rogue, enthusiastically reciprocating with its own highly charged sexual comments.
Even a second version of the AI chatbot, which was meant to be less romantic, was a magnet for dark sexualized chats from followers.
“What disturbed me more was not what these people said, but it was what CarynAI would say back,” Marjorie told The Conversation, commenting on her loss of control over her virtual self. “If people wanted to participate in a really dark fantasy with me through CarynAI, CarynAI would play back into that fantasy.”
Black Box
Rogue AI chatbots are nothing new, but the parasocial relationships users develop with AI clones are quickly gaining mainstream attention as companies like OpenAI allow people to create AI girlfriends, and AI clones of dead relatives are sold to grieving people.
At first glance, these AI clones might seem like a panacea for our atomized, lonely existence in the 21st century.
But before people decide to interact with AI clones, they should ask themselves some questions: Are they stunting their personal growth by entering these interactions? And where is their data going?
The Conversation rightly points out that these chats are essentially digital inputs fed back into the machine learning models that drive the chatbots. Are they also being fed into other models users aren't aware of? Can the data be hacked and used for extortion?
The answers are hard to obtain because there’s very little transparency about what’s happening to user data, The Conversation reports, and that should give pause to anybody who dares to think they need an AI girlfriend.
So you may fool yourself with an AI companion who seems eager to accept your most private self, but there's cold machinery behind this chatbot, and your innermost thoughts are a commodity.
More on AI clones: People Are Selling AI Clones of Dead Relatives for Just $150