If you’ve upgraded to a newer iPhone model recently, you’ve probably noticed that Apple Intelligence is showing up in some of your most-used apps, like Messages, Mail, and Notes. Apple Intelligence (yes, also abbreviated to AI) arrived in Apple’s ecosystem in October 2024, and it’s here to stay as Apple competes with Google, OpenAI, Anthropic, and others to build the best AI tools.
What’s Apple Intelligence?
Cupertino marketing executives have branded Apple Intelligence “AI for the rest of us.” The platform is designed to leverage the things that generative AI already does well, like text and image generation, to improve upon existing features. Like other platforms, including ChatGPT and Google Gemini, Apple Intelligence was trained on large information models. These systems use deep learning to form connections, whether it’s text, images, video, or music.
The text offering, powered by a large language model (LLM), presents itself as Writing Tools. The feature is available across various Apple apps, including Mail, Messages, Pages, and Notifications. It can be used to provide summaries of long text, proofread, and even write messages for you, using content and tone prompts.
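For third-party developers, Writing Tools largely comes along for free on standard text controls, with a knob to opt in or out. The snippet below is a minimal sketch, assuming iOS 18’s `writingToolsBehavior` property on `UITextView`; exact names and cases should be checked against the current SDK.

```swift
import UIKit

// Minimal sketch: controlling how much of Writing Tools a text view exposes.
// Assumes the iOS 18 writingToolsBehavior API; verify names against the SDK.
final class NoteEditorViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        view.addSubview(textView)

        if #available(iOS 18.0, *) {
            // .complete keeps the full rewrite/proofread/summarize experience;
            // .limited restricts it to inline suggestions; .none opts out entirely.
            textView.writingToolsBehavior = .complete
        }
    }
}
```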
Image generation has been integrated as well, in similar fashion, albeit a bit less seamlessly. Users can prompt Apple Intelligence to generate custom emoji (Genmoji) in an Apple house style. Image Playground, meanwhile, is a standalone image generation app that uses prompts to create visual content that can be used in Messages, Keynote, or shared via social media.
Apple Intelligence also marks a long-awaited face-lift for Siri. The smart assistant was early to the game but has largely been neglected for the past several years. Siri is now integrated much more deeply into Apple’s operating systems; for instance, instead of the familiar icon, users see a glowing light around the edge of their iPhone screen when it’s doing its thing.
More importantly, the new Siri works across apps. That means, for example, that you can ask Siri to edit a photo and then insert it directly into a text message. It’s a frictionless experience the assistant had previously lacked. Onscreen awareness means Siri uses the context of the content you’re currently engaged with to provide an appropriate answer.
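Under the hood, this kind of cross-app behavior depends on apps describing their actions to the system so Siri and Apple Intelligence can invoke them. Here is a rough sketch using Apple’s App Intents framework; the `CropPhotoIntent` type and its parameter are hypothetical, for illustration only.

```swift
import AppIntents

// Hypothetical example: exposing a photo-editing action to Siri via App Intents.
// The intent and its parameter are illustrative; a real app would do actual
// image work inside perform().
struct CropPhotoIntent: AppIntent {
    static var title: LocalizedStringResource = "Crop Photo"

    // Parameters become slots Siri can fill from conversation or onscreen context.
    @Parameter(title: "Photo Name")
    var photoName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real implementation would locate the photo and apply the crop here.
        return .result(dialog: "Cropped \(photoName) to a square.")
    }
}
```

Once an app exposes actions this way, Siri can, in principle, chain them with actions from other apps, which is the mechanism behind examples like editing a photo and dropping it straight into a message.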
Leading up to WWDC 2025, many expected Apple to introduce an even more souped-up version of Siri, but we’re going to have to wait a bit longer.
“As we’ve shared, we’re continuing our work to deliver the features that make Siri even more personal,” said Apple SVP of Software Engineering Craig Federighi at WWDC 2025. “This work needed more time to reach our high-quality bar, and we look forward to sharing more about it in the coming year.”
This yet-to-be-released, more personalized version of Siri is supposed to be able to understand “personal context,” like your relationships, communications routine, and more. But according to a Bloomberg report, the in-development version of this new Siri is too error-ridden to ship, hence its delay.
At WWDC 2025, Apple also unveiled a new AI feature called Visual Intelligence, which helps you perform an image search for things you see as you browse, along with a Live Translation feature that can translate conversations in real time in the Messages, FaceTime, and Phone apps.
Visual Intelligence and Live Translation are expected to become available later in 2025, when iOS 26 launches to the public.
When was Apple Intelligence unveiled?
After months of speculation, Apple Intelligence took center stage at WWDC 2024. The platform was announced in the wake of a torrent of generative AI news from companies like Google and OpenAI, which had raised concerns that the famously tight-lipped tech giant had missed the boat on the latest tech craze.
Contrary to such speculation, however, Apple had a team in place working on what proved to be a very Apple approach to artificial intelligence. There was still pizzazz amid the demos (Apple always likes to put on a show), but Apple Intelligence is ultimately a very pragmatic take on the category.
Apple Intelligence isn’t a standalone feature. Rather, it’s about integrating into existing offerings. While it’s a branding exercise in a very real sense, the large language model (LLM)-driven technology operates behind the scenes. As far as the consumer is concerned, the technology mostly presents itself in the form of new features for existing apps.
We learned more during Apple’s iPhone 16 event in September 2024. There, Apple touted a number of AI-powered features coming to its devices, from translation on the Apple Watch Series 10 to visual search on iPhones and a range of tweaks to Siri’s capabilities. The first wave of Apple Intelligence arrived at the end of October 2024 as part of iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1.
The features launched first in U.S. English. Apple later added Australian, Canadian, New Zealand, South African, and U.K. English localizations, with support for Chinese, English (India), English (Singapore), French, German, Italian, Japanese, Korean, Portuguese, Spanish, and Vietnamese following in 2025.
Who gets Apple Intelligence?

The first wave of Apple Intelligence arrived in October 2024 via the iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1 updates. Those updates included integrated writing tools, image cleanup, article summaries, and a typing input for the redesigned Siri experience. A second wave of features became available as part of iOS 18.2, iPadOS 18.2, and macOS Sequoia 15.2. That list includes Genmoji, Image Playground, Visual Intelligence, Image Wand, and ChatGPT integration.
These offerings are free to use, so long as you have one of the following pieces of hardware:
- All iPhone 16 models
- iPhone 15 Pro Max (A17 Pro)
- iPhone 15 Pro (A17 Pro)
- iPad Pro (M1 and later)
- iPad Air (M1 and later)
- iPad mini (A17 Pro or later)
- MacBook Air (M1 and later)
- MacBook Pro (M1 and later)
- iMac (M1 and later)
- Mac mini (M1 and later)
- Mac Studio (M1 Max and later)
- Mac Pro (M2 Ultra)
Notably, only the Pro versions of the iPhone 15 made the cut, owing to shortcomings of the standard model’s chipset. The entire iPhone 16 line, by contrast, is able to run Apple Intelligence.
How does Apple’s AI work without an internet connection?

When you ask GPT or Gemini a question, your query is sent to external servers to generate a response, which requires an internet connection. Apple, by contrast, has taken a small-model, bespoke approach to training.
The biggest benefit of this approach is that many of these tasks become far less resource intensive and can be performed on-device. That’s because, rather than relying on the kind of kitchen-sink approach that fuels platforms like GPT and Gemini, the company has compiled datasets in-house for specific tasks like, say, composing an email.
That doesn’t apply to everything, however. More complex queries will make use of the new Private Cloud Compute offering. The company now operates remote servers running on Apple Silicon, which it claims allows it to offer the same level of privacy as its consumer devices. Whether an action is being performed locally or via the cloud is invisible to the user, unless their device is offline, at which point remote queries will throw up an error.
Apple Intelligence with third-party apps

A lot of noise was made about Apple’s pending partnership with OpenAI ahead of the launch of Apple Intelligence. Ultimately, however, it turned out that the deal was less about powering Apple Intelligence and more about offering an alternative platform for the things it’s not really built for. It’s a tacit acknowledgment that building a small-model system has its limitations.
Apple Intelligence is free. So, too, is access to ChatGPT. However, those with paid accounts to the latter will have access to premium features that free users don’t, including unlimited queries.
ChatGPT integration, which debuted in iOS 18.2, iPadOS 18.2, and macOS Sequoia 15.2, has two primary roles: supplementing Siri’s knowledge base and adding to the existing Writing Tools options.
With the service enabled, certain questions will prompt the new Siri to ask the user to approve its accessing ChatGPT. Recipes and travel planning are examples of questions that may surface the option. Users can also directly prompt Siri to “ask ChatGPT.”
Compose is the other major ChatGPT feature available through Apple Intelligence. Users can access it in any app that supports the new Writing Tools feature. Compose adds the ability to write content based on a prompt, joining existing writing tools like Style and Summary.
We know for sure that Apple plans to partner with additional generative AI services. The company has all but said that Google Gemini is next on that list.
Can developers build on Apple’s AI models?
At WWDC 2025, Apple announced what it calls the Foundation Models framework, which lets developers tap into its AI models even while offline.
This makes it more practical for developers to build AI features into their third-party apps that leverage Apple’s existing systems.
“For example, if you’re getting ready for an exam, an app like Kahoot can create a personalized quiz from your notes to make studying more engaging,” Federighi said at WWDC. “And because it happens using on-device models, this happens without cloud API costs […] We couldn’t be more excited about how developers can build on Apple Intelligence to bring you new experiences that are smart, available when you’re offline, and that protect your privacy.”
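To give a sense of what that looks like in code, here is a minimal sketch of calling the on-device model through the Foundation Models framework in Swift. It assumes the `FoundationModels` module and `LanguageModelSession` API Apple previewed at WWDC 2025; exact names and availability may shift before the final release.

```swift
import FoundationModels

// Minimal sketch of the Foundation Models framework previewed at WWDC 2025.
// Assumes the LanguageModelSession API; exact names may change before release.
@available(iOS 26.0, macOS 26.0, *)
func generateQuiz(from notes: String) async throws -> String {
    // A session wraps Apple's on-device language model; no network request is made.
    let session = LanguageModelSession(
        instructions: "You create short study quizzes from the user's notes."
    )

    // Ask the model for quiz questions based on the supplied notes.
    let response = try await session.respond(
        to: "Write three quiz questions based on these notes:\n\(notes)"
    )
    return response.content
}
```

Because the model runs locally, an app like the Kahoot example above could generate quizzes without per-request cloud costs, which is exactly the trade-off Federighi describes.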