The company made a splash with its AI announcement for Apple Intelligence during a keynote Monday at its annual WWDC event. CEO Tim Cook called Apple Intelligence “the new personal intelligence system that makes your most personal products even more useful and delightful.”
Apple Intelligence can be used across iPhones, iPads, and Macs. On your phone, for example, it can organize notifications by priority to make sure you don’t miss something important. New writing tools available through Apple Intelligence can also help rewrite, proofread, and summarize text for you across apps like Mail, Notes, and Safari.
Apple Intelligence can also lend a hand when it comes to image generation, another hot frontier for the AI race. You can create personalized images to add to your conversations in one of three styles — sketch, illustration, or animation.
One example from the demo showed that Photos learns to recognize the people who appear regularly in your pictures. It can then create an image of, say, your friend blowing out candles, which you can send when you wish them a happy birthday.
This works in apps like Messages, Notes, Freeform, Keynote, and Pages.
Apple Intelligence can also help with tasks that require knowledge of your “personal context,” said Craig Federighi, Apple’s senior vice president of software engineering.
“Apple Intelligence is grounded in your personal information and context, with the ability to retrieve and analyze the most relevant data from across your apps as well as to reference the content on your screen like an email or calendar event you’re looking at.”
As an example, Federighi imagined a meeting was rescheduled and he was wondering if it’d prevent him from getting to his daughter’s play on time.
Apple Intelligence can “understand who my daughter is, the play details she sent several days ago, the time and location for my meeting, and predicted traffic between my office and the theater,” he said in the demo.
Apple Intelligence will also open up a new world of possibilities for Siri, said Kelsey Peterson, Apple’s director of machine learning and AI.
It’ll allow you to speak more conversationally with Siri; if you stumble on your words or accidentally misspeak before correcting yourself, Siri will still understand what you mean.
Siri will also maintain conversational context, so you can follow up in a conversation without having to spell everything out for Siri again in each question or command you give.
If you don’t want to talk to Siri, you’ll be able to double-tap at the bottom of your screen and type your questions or commands instead.
In addition, you can ask Siri questions about settings or features on your iPhone, even if you don’t know their specific name, and she’ll show you the answer or relevant result in the iPhone User Guide.
Apple Intelligence will also give Siri on-screen awareness to understand and act upon what’s on your screen.
If a friend texts you his new address, for example, you can tell Siri from the Messages app, “Add this address to his contact card.” Siri will take the address from the message on-screen, as well as the name of your friend, and carry out the task.
In the demo, Siri was also able to handle a request to “show me my photos of Stacey in New York wearing her pink coat” and surface the photos.