A dad just can’t seem to figure out why his six-year-old daughter isn’t impressed by the AI toy he gave her for Christmas.
Alex Volkov, a self-styled “AI evangelist” and founder of a translation service powered by the tech, tweeted about his parenting difficulties over the holidays.
For better or worse, it’s given us a glimpse into the mind of an ardent techno-optimist realizing that others, especially those of uncorrupted minds, may not share his enthusiasm for large language models.
“Would you like to play a game or maybe hear a fun fact?” asks the annoyingly chirpy-sounding AI in a video shared by Volkov.
“No,” his daughter replies.
Incidents like these seem to genuinely baffle Volkov. He writes that he cannot understand why his daughter disabled the dinosaur plushie’s built-in AI voice — opting, instead, to play with it like a regular toy and dress it in clothes she made.
“She played with this Dino, chatted with it, and then… learned to turn it off, and doesn’t want it to talk anymore,” Volkov wrote. “She still loves playing with it,” he added, “but every time I ask her if she’d like to chat with it, she says no.”
“I gently asked why, and wasn’t able to really understand where’s the resistance,” he mused.
The Dino doll, sold by Magical Toys for $200 apiece, is billed as an alternative to letting your kid get their brain fried by iPad screens — and, well, fair enough. Using an app, parents can view their kid’s chat history and instruct the toy’s AI on what topics to talk about.
It’s unclear what AI model powers the toy. But whatever it is, it’s not impressing Volkov’s kid, no matter how many “experiments” he runs.
Processing the events aloud, Volkov says he made sure to tell her that Dino “wasn’t like other toys” — that it “has AI in it.” He also insists that his child, a wee lass of six years, does in fact understand what AI is.
It took him a while to get the message. Volkov turned the AI back on several times, but on each occasion, his daughter would speak with it only momentarily before turning it off again once she got bored.
Later, Volkov saw that his daughter was pretending the toy was a baby, so he asked the AI to act like it was one — by crying.
This, too, backfired. “It sounded weird, which made her laugh really hard,” Volkov wrote. “It was basically making crying sounds like talking.”
“Is this uncanny valley?” he pondered.
One user, who said she was a psychiatrist working with children, offered an explanation for Volkov’s daughter’s lack of enthusiasm for the AI.
“I think because it takes away control from the child,” she wrote. “Play is how children work through emotions, impulses and conflicts [as] well as try out new behaviors. I would think [it] would be super irritating to have the toy shape and control your play — like a totally dominating playmate!”
That the toy deprives the child of using their imagination is a criticism echoed by many users who weighed in — some of them harshly. “This is the pinnacle of AI bros being the worst,” another netizen opined. “They cannot understand the value of imagination and creativity.”
Whatever the implications for a child’s development, there are serious questions to be asked about trusting a hallucination-prone AI model to talk with your kid.
More on AI: OpenAI’s Most Advanced AI Release Stumped by New York Times Word Game