AI cloning of celebrity voices outpacing the law, experts warn


It’s the new badge of celebrity status that nobody wants. Jennifer Aniston, Oprah Winfrey and Kylie Jenner have all had their voices cloned by fraudsters. Online blaggers used artificial intelligence to fake the Tiggerish tones of Martin Lewis, the TV financial adviser. And this weekend David Attenborough described himself as “profoundly disturbed” to have discovered that his cloned voice had been used to deliver partisan US news bulletins.


Now experts have warned that voice-cloning is outpacing the law as technologists hone previously clunky voice generators into models capable of emulating the subtlest pauses and breathing of human intonation.

Dr Dominic Lees, an expert in AI in film and television who is advising a UK parliamentary committee, told the Guardian on Monday: “Our privacy and copyright laws aren’t up to date with what this new technology presents, so there’s very little that David Attenborough can do.”

Lees is advising the House of Commons culture, media and sport select committee in an inquiry that will look at the ethical use of AI in film-making. He also convenes the Synthetic Media Research Network, whose members include the firm making an AI version of the late chat show host Michael Parkinson, which will result in an eight-part unscripted series, Virtually Parkinson, featuring new guests. That voice-cloning project is being done with the consent of Parkinson’s family and estate.

“The government definitely needs to look at [voice cloning], because it’s a major issue for fraud,” Lees said. “It needs the stick of government regulation in order to deter [misuse] … we can’t allow it to be a free-for-all.”

AI voice-cloning scams were up 30% in the UK over the past year, according to research published this month by NatWest. Another lender, Starling Bank, found that 28% of people had been targeted by an AI voice-cloning scam at least once in the past year.

Voice cloning is also reportedly being used by fraudsters to perpetrate a version of the “hi mum” text scam, in which scammers pose as a child who needs their parent to send funds urgently. On already fuzzy telephone lines, detecting that a pleading child is in fact a scammer’s clone can be hard. Consumers are advised to check by hanging up and calling back on a trusted number.

People whose voices are cloned without their consent find it more than a nuisance. Attenborough told the BBC on Sunday: “Having spent a lifetime trying to speak what I believe to be the truth, I am profoundly disturbed to find that these days my identity is being stolen by others and greatly object to them using it to say what they wish.”

When a new voice option on OpenAI’s latest AI model, GPT-4o, featured tones that were very close to those of the actor Scarlett Johansson, she said she was shocked and angered because the voice “sounded so eerily similar to mine that my closest friends and news outlets could not tell the difference”.



The rise of cloned voices raises the question of what they miss about real human tones. Lees said: “The big problem is that AI doesn’t understand emotion and how that changes how a word or a phrase might have emotional impact, and how you vary the voice to represent that.”

The voiceover industry, which supplies voices for adverts, animations and instructional training material, is having to respond quickly to these technological advances. Joe Lewis, the head of audio at the Voiceover Gallery in London, which has provided real human voices for adverts for Specsavers and National Express, said it had already cloned the voices of some of its artists.

He said AI seemed to work best with English male voices, perhaps because that reflected the bias in the type of recordings that had been used to train the algorithm, but he cautioned that in general “there’s something about the way it is generated that makes you less attentive”.

“When the AI [voice] breathes, it is a very repetitive breath,” he said. “The breaths are in the right place, but they don’t feel natural … [But] can it get to the point when it’s really perfect? I don’t see why not, but to get to the full emotional spectrum is a long way off.”


