OpenAI’s ChatGPT just got a whole lot chattier. The company is rolling out its Advanced Voice Mode to a select group of ChatGPT Plus users, allowing for real-time conversations that sense and respond to emotion.
Think less robotic AI voice and more uncanny valley human-like chat. But there’s a catch—it’s not quite the revolutionary voice feature OpenAI first teased back in May.
You might recall the demo that had everyone freaking out over a voice that sounded suspiciously like Scarlett Johansson’s. Well, that particular voice, known as Sky, isn’t part of the alpha rollout.
OpenAI claims it didn’t actually use Johansson’s voice, but the actress was spooked enough to lawyer up. Now, Advanced Voice Mode will be limited to four preset voices created with paid voice actors.
So what makes this voice mode so “advanced”?
For starters, GPT-4o, the AI model behind the voice, is multimodal. That means it can process voice inputs and generate human-like responses without needing separate models for transcription and voice synthesis.
The result is supposedly way more natural and less laggy. OpenAI’s even tested it with over 100 external testers speaking 45 different languages.
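To see why a single multimodal model matters, here’s a rough sketch of the two architectures. All function names and timings below are made up for illustration — this is not OpenAI’s actual API, just a toy model of why chaining separate transcription, text, and synthesis models adds latency (and loses vocal tone at the transcription step).

```python
import time

# Hypothetical stand-ins for the three separate models a traditional
# voice pipeline chains together. Names and delays are illustrative only.
def transcribe(audio: bytes) -> str:
    time.sleep(0.05)  # simulated speech-to-text latency
    return "what's the weather like?"

def generate_reply(text: str) -> str:
    time.sleep(0.05)  # simulated text-model latency
    return "Looks sunny where you are!"

def synthesize(text: str) -> bytes:
    time.sleep(0.05)  # simulated text-to-speech latency
    return text.encode()

def pipeline_respond(audio: bytes) -> bytes:
    # Three hops: audio -> text -> text -> audio. Each hop adds delay,
    # and emotional cues in the speaker's voice are discarded as soon
    # as the audio is flattened into a transcript.
    return synthesize(generate_reply(transcribe(audio)))

def multimodal_respond(audio: bytes) -> bytes:
    # One model that consumes and emits audio directly (a single hop),
    # which is roughly what "multimodal" buys GPT-4o.
    time.sleep(0.05)
    return b"Looks sunny where you are!"

if __name__ == "__main__":
    for fn in (pipeline_respond, multimodal_respond):
        start = time.perf_counter()
        fn(b"\x00fake-audio")
        print(f"{fn.__name__}: {time.perf_counter() - start:.2f}s")
```

Run it and the chained version takes roughly three times as long as the single-hop version — a crude but fair picture of the lag (and lost nuance) the multimodal approach is meant to eliminate.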
But here’s the thing: this voice tech is a potential minefield of safety and ethical issues. We’ve already seen AI voice cloning tech used to impersonate politicians.
To avoid that kind of thing, OpenAI says it’s introducing filters to block copyrighted audio and impersonations. The company is also promising a report on its safety efforts in August.
As for the rest of us, it seems we’ll have to wait until fall for a chance to try out Advanced Voice Mode. OpenAI’s taking a cautious approach, rolling it out gradually to Plus users.
Guess that means we’ll just have to stick with the less advanced, but still kind of cool voice mode currently available in ChatGPT for now.
How-To (Maybe): Get Ready for Hyperrealistic AI Chats
If you’re one of the lucky Plus users getting access to Advanced Voice Mode, OpenAI will alert you in the ChatGPT app and follow up with an email explaining how to use it. Keep an eye out for both.
That said, some users are already getting their hands on it and sharing their findings in the wild.
As for the rest of us, here’s a little prep work you can do for when Advanced Voice Mode lands:
- Brush up on your voice prompts: Think about what kinds of conversations you want to have with ChatGPT. The more specific and natural your prompts, the better the AI’s responses will be.
- Get comfy with voice assistants: If you haven’t already, try out some voice commands with your favorite assistant (Siri, Alexa, Google – take your pick). See what works well and what doesn’t.
- Think about the ethics: AI voice tech is powerful stuff. Start considering where you’re comfortable with it, and where things might get creepy.
We’ll be keeping an eye on how this rolls out, as well as on OpenAI’s promised safety measures.
In the meantime, let us know if you’re one of the lucky few who gets to try out Advanced Voice Mode! What’s your first conversation going to be?
Have any thoughts on this? Drop us a line below in the comments, or carry the discussion to our Twitter or Facebook.