Apple has confirmed it will use Google's Gemini to power the new version of Siri, marking the most radical change to the voice assistant in its history. The deal, reportedly valued at approximately $1 billion per year, is set to transform how you interact with your iPhone.
The integration was originally planned for iOS 26.4, with a developer beta scheduled for late February. However, Apple has decided to delay some of the more ambitious features to later versions of the operating system.
What changes in Siri with Gemini
An assistant that truly understands context
The new Siri will use a 1.2-trillion-parameter model (internally known as Apple Foundation Models v10) running on Apple's Private Cloud Compute infrastructure. This means Siri will finally be able to:
- Understand what's on your screen and act on it
- Remember previous conversations and follow up
- Perform actions within apps (not just open apps)
- Process complex, multi-step requests
Privacy: how Apple protects your data
Apple insists that data processed by Gemini on Private Cloud Compute is never stored and is never accessible to Google. Processing takes place on Apple's own servers, built on custom chips, and data is deleted immediately after each request.
Which features arrive first and which are delayed
Available in iOS 26.4 (March 2026)
- Screen-aware Siri: can see and understand what you're looking at
- Basic in-app actions via App Intents API
- More natural, conversational responses
- Better understanding of ambiguous requests
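For developers, "in-app actions" means exposing app functionality to Siri through the App Intents framework. As a rough illustration only, here is a minimal sketch of such an action for a hypothetical note-taking app (`AddNoteIntent` and `NoteStore` are invented names, not part of any real app or of Apple's APIs beyond the framework itself):

```swift
import AppIntents

// Hypothetical example: a note-taking app exposing an
// "add note" action that Siri can perform directly,
// without the user having to open the app first.
struct AddNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Note"

    // Siri fills this parameter from the spoken request,
    // e.g. "add a note that says buy milk".
    @Parameter(title: "Note Text")
    var text: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // App-specific storage logic would go here;
        // NoteStore is a stand-in for the app's own model layer.
        NoteStore.shared.add(text)
        return .result(dialog: "Added your note.")
    }
}
```

Apps that already adopt App Intents for Shortcuts should, in principle, be reachable by the new Siri with little extra work, which is likely why this is among the first features to ship.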
Delayed to iOS 26.5 or later (May 2026+)
- Advanced personal data access (searching old messages, photos, etc.)
- Extended chatbot-style conversation features
- Deep smart home device integration
Why Apple chose Gemini over Claude or OpenAI
According to reports from AppleInsider, Apple evaluated both Anthropic (Claude) and OpenAI before signing with Google. The decision came down to pricing: Anthropic asked for significantly more for access to its models. Google, with its massive infrastructure, was able to offer a more competitive deal.
Additionally, the existing relationship between Apple and Google (the default search engine deal for Safari is worth over $20 billion annually) facilitated negotiations.
Impact on other Apple products
The delay in advanced Siri features has a domino effect on at least four products Apple reportedly had planned:
- Smart home hub: the long-rumored device needs an advanced Siri
- Apple smart doorbell: depends on improved voice recognition
- Apple AR glasses: require sophisticated voice control
- New Apple TV: integration with Siri as the home hub center
These products could be delayed to late 2026 or even 2027 if Siri doesn't reach the necessary capabilities in time.
What this means for you as a user
If you have an iPhone compatible with iOS 26.4 (iPhone 15 and later), you'll receive the basic Siri improvements in March. The truly revolutionary features will arrive progressively throughout the rest of 2026.
The good news is that Siri will finally compete with Google Assistant and ChatGPT in AI capabilities. The bad news is that Apple, true to form, will do it at its own pace.