Apple’s annual Worldwide Developer Conference keynote took place in California on Monday (10 June), marking a major milestone for the company’s AI offerings.
The announcement-packed conference saw the tech giant unveil a range of updates, including new apps and a new operating system for its Apple Vision Pro device. However, the company’s main focus remained firmly on AI.
Apple CEO Tim Cook announced a series of GenAI products and services during the conference keynote, including a long-rumoured partnership with OpenAI and the company’s most significant AI offering yet, Apple Intelligence.
Apple said: “A cornerstone of Apple Intelligence is on-device processing, which delivers personal intelligence without collecting users’ data.”
The conference announcements put Apple firmly back in the AI race after concerns that the company had fallen behind.
Here are Apple’s biggest AI updates from WWDC 2024.
Apple announces Apple Intelligence
In a big push for the company’s AI capabilities, Apple announced a range of new tools under the umbrella of Apple Intelligence.
The tech giant said the tools, which will run on-device and come built into new versions of iPhones, iPads and Macs, will help users write and get things done with ease. Cook said Apple Intelligence will go “beyond AI” into “personal intelligence”.
“AI has to understand you and be grounded in your personal context like your routine, your relationships, your communications and more. It’s beyond AI. It’s personal intelligence,” said Cook.
Apple Intelligence will recognise notifications that are important to an individual user’s personal context, suggest proofreading improvements across all apps, and generate images based on the user’s photo library.
Craig Federighi, senior vice president of software engineering at Apple, said the privacy of Apple Intelligence was a top priority for the company. In a blog post following the conference, Apple described its intelligence as setting a “brand-new standard for privacy in AI.”
Federighi said Apple Intelligence was a collection of large language and “diffusion models” that can work across apps to identify data. He added that while most of these models will run on the device itself, some will need to run in the cloud.
Federighi said users will remain in charge of the data stored in the cloud and how it can be accessed.
“We want to extend the privacy and security of your iPhone into the cloud,” Federighi added.
Siri will use OpenAI’s ChatGPT
Apple also announced its long-rumoured partnership with OpenAI, stating that Siri will integrate ChatGPT technology into responses in a new version of the company’s voice assistant.
Siri will ask a user whether they want to share a question with ChatGPT, and will then share ChatGPT’s response. The technology will also be implemented across all of Apple’s writing tools.
According to the company, the technology will be free to use and will not require users to create an account. Apple also claimed users’ requests and data will not be logged.
Apple announces big new Siri update with AI
Apple also announced a range of AI updates to Siri which will be powered by the company’s Apple Intelligence.
Siri will now be able to follow along as users correct their statements in real time, be controlled via text and be more deeply integrated into the operating system.
The company said Siri can use its AI to answer “thousands of questions” and take action across a whole device. For example, a user can ask Siri to find a photo of a family member in the Photos app and email it to someone from the Mail app.
Siri will also take advantage of the “personal context” that Apple Intelligence can provide. Through machine learning, the voice assistant will be able to give personalised updates, such as surfacing something a user sent to a contact over the last few days.