Apple Intelligence is out today
The iPhone 16 will finally be able to use the AI it was supposedly “built for.” | Photo: Allison Johnson / The Verge
Apple’s AI features are finally starting to appear. Apple Intelligence is launching today on the iPhone, iPad, and Mac, offering features like generative AI-powered writing tools, notification summaries, and a cleanup tool to take distractions out of photos. It’s Apple’s first official step into the AI era, but it’ll be far from its last.
Apple Intelligence has been available in developer and public beta builds of Apple’s operating systems for the past few months, but today marks the first time it’ll be available in the full public OS releases. Even so, the features will still be marked as “beta,” and Apple Intelligence will very much remain a work in progress. Siri gets a new look, but its most consequential new features — like the ability to take action in apps — probably won’t arrive until well into 2025.
In the meantime, Apple has released a very “AI starter kit” set of features. “Writing Tools” will help you summarize notes, change the tone of your messages to make them friendlier or more professional, and turn a wall of text into a list or table. You’ll see AI summaries in notifications and emails, along with a new focus mode that aims to filter out unimportant alerts. The updated Siri is signified by a glowing border around the screen, and it now allows for text input by double-tapping the bottom of the screen. It’s helpful stuff, but we’ve seen a lot of this before, and it’ll hardly represent a seismic shift in how you use your iPhone.
Apple says that more Apple Intelligence features will arrive in December. ChatGPT will be available in Siri; Writing Tools will let you describe the changes you want Apple’s AI to make; and Apple’s AI camera feature — Visual Intelligence — will be able to tell you about objects around you. In the following months, Apple says that it’ll launch Priority Notifications and major upgrades for Siri, including awareness of what’s on your screen and the ability to take action within apps.
The AI future we’ve been promised may still be a long way off, but in the meantime, you can download iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1 yourself to start testing out the basics. Apple Intelligence will be available first in US English, with other languages to follow in the coming year. You’ll also need recent Apple hardware to use it: the features are largely gated to M-series iPads and Macs and the very latest iPhones.
Availability will expand in December to Australia, Canada, Ireland, New Zealand, South Africa, and the UK, with additional languages coming in April.