Google’s AI Overview Believes Baby Elephants Can Sit in the Palm of a Human Hand

Are baby elephants, which weigh hundreds of pounds at birth, tiny enough to fit squarely within the bounds of a human palm? Common sense would say no — but Google’s AI Overview, or the AI-generated summary of web content that now often pops up at the top of Google results pages, seems to think the answer is yes.

A Google search for “baby elephant” returns an incongruous AI summary that, despite accurately noting that newborn elephant calves “weigh between 200 and 364” pounds and “stand about 3 [feet] tall,” inexplicably includes an obviously fake image of a teacup-sized elephant nestled into the palm of a human hand. It’s smaller than Demi Moore’s dog — and yes, it’s clearly AI-generated.

Things only get worse from there. A simple click on the image reveals that it was created by an Etsy user named DazzlingVisions, who’s selling the synthetic photo — along with many other similar images depicting a variety of tiny versions of definitely-much-larger animals — as a “digital print” for a dirt-cheap $1.69. The critter is described in an associated caption as an “adorable/impossible baby elephant.”

The Overview slip-up is a fascinating instance of Google’s AI search feature blending sound information with glaring fabulism, resulting in a confusing information stew and once again calling into question the discernment of the search giant’s AI Overview — how it sources text and imagery, and what it considers reliable.

The Etsy seller themself, after all, noted that the elephant was “impossible.” How did it make its way into Google’s AI summary?

The bizarre inclusion is the latest hiccup for Google’s AI search function, which has drawn widespread ridicule and sparked numerous controversies since its release in 2023. Growing public controversy around the feature’s oft-questionable results reached a fever pitch in 2024, when the tool was found telling people they should eat rocks for their health, or consider mixing Elmer’s glue into pizza sauce to ensure optimal cheese stickiness.

Meanwhile, among plenty of other gaffes, users also found that the tool falsely referred to Barack Obama as the “first Muslim president” — a troubling harkening back to the debunked Birther conspiracy. Our reporting has found the AI Overview offering terrible, deeply mangled geography information, as well as some very gross — not to mention ill-advised! — guidance for parents hoping to bathroom-train their kids.

In each of these varied instances, the issue seemed to come down to one very important quality: nuance.

Take the example of the AI suggesting that users might consider chowing down on rocks. Its source for this very bad piece of advice was The Onion — the satirical website that publishes news-like comedy articles, which the AI seemingly took at face value. As for the glue-on-pizza advice? That brain blast came from a decade-old Reddit comment posted by a user who went by the moniker “Fucksmith.” Not exactly a reliable source.

And now, of course, we have AI Overview’s embrace of “adorable/impossible baby elephant,” which speaks once again to the AI’s apparent lack of media literacy — and, by the same token, renews concerns around the search tool’s usefulness. Over a year in, is Google’s AI Overview making searchers’ lives easier? Or is it muddying information waters, and causing more problems than it’s worth?

More on Google’s Search AI: Google Is Stuffing Annoying Ads Into Its Terrible AI Search Feature




