Apple & Google Confirm Multi-Billion Dollar AI Partnership; Major Siri Overhaul Delayed to 2026
Apple and Google have announced a multi-year collaboration to integrate Google's Gemini AI models into Apple's services, including a more personalized version of the Siri virtual assistant. Separately, Apple is planning a significant overhaul of Siri, including a redesigned interface, a standalone application, and expanded third-party AI integrations, expected to debut in 2026. The developments follow internal organizational changes at Apple and reported delays in the company's in-house AI efforts.
The Apple-Google Partnership
Agreement and Financial Terms
"The deal is structured as a cloud computing contract, with an estimated annual payment of approximately one billion dollars."
Apple and Google confirmed that Google's Gemini AI models will serve as the foundation for future Apple Intelligence features. The companies stated that Apple Foundation Models will be based on Google's Gemini models and cloud technology. Google described the agreement as a multi-year contract.
According to the Financial Times, the deal is characterized as a multi-billion dollar agreement. A source familiar with the arrangement indicated the deal is structured as a cloud computing contract, with an estimated annual payment of approximately one billion dollars. This structure could result in Apple remitting several billion dollars to Google over the contract's duration.
Technical Implementation
Apple stated that the Gemini models will run on its Private Cloud Compute servers to maintain user privacy. Apple Intelligence will continue to operate both on Apple devices and on the Private Cloud Compute infrastructure, with local models remaining part of the system. Apple said its evaluation concluded that Google's AI technology provides a suitable foundation for its models while upholding its privacy standards.
OpenAI's Position
A recent report stated that OpenAI made a deliberate choice not to partner with Apple for Siri's AI in the autumn of the previous year. A person close to OpenAI said the company intends to prioritize development of its own AI device, hardware being designed with former Apple design chief Jony Ive. It is not confirmed whether Apple formally offered a contract or whether OpenAI explicitly declined a potential offer.
Organizational and Leadership Changes
Reorganization Under Craig Federighi
Apple reorganized its artificial intelligence strategy, placing it under the direct oversight of software chief Craig Federighi. This restructuring aims to accelerate the overhaul of Siri by integrating external AI models, following previous internal delays. Federighi reportedly expressed dissatisfaction with the pace of Apple's AI progress in the fall. By December, AI leadership was consolidated under him, a transition that began earlier in the year when Siri's oversight moved to his software division.
Leadership Background and Style
Federighi was previously characterized as an "AI skeptic" until he engaged with OpenAI's ChatGPT in late 2022. Reports indicate he is meticulous about expenses, including office supplies, and has shown hesitation in funding high-risk projects with uncertain returns. He reportedly preferred software with fixed behaviors over algorithms that dynamically alter functionality. After his experience with ChatGPT, Federighi reportedly developed an appreciation for AI technology and subsequently directed his teams to explore its integration into Apple products.
Currently, Mike Rockwell, who leads Siri development, reports to Federighi. Reports have also highlighted a dynamic between Federighi and John Giannandrea regarding AI strategy.
Internal Tensions and Strategy
Tensions over AI strategy arose around 2019 when Mike Rockwell, then leading Vision Pro development, proposed an AI-driven interface and criticized Federighi's approach as conservative. Federighi initially viewed AI as unpredictable, favoring deterministic software, and reportedly rejected proposals for AI to dynamically reorganize the iPhone home screen to avoid user confusion.
Federighi reportedly concluded that Apple's internal models did not perform adequately on devices, while the foundation models team felt challenges related to model optimization were within the software organization's scope. Apple intends to continue developing its own on-device AI models and plans to adapt external partner models to run more efficiently on Apple hardware, aiming to reduce long-term dependence. The company is reportedly considering acquiring smaller AI firms specializing in model compression and optimization.
Apple Intelligence Rollout and Market Performance
Rollout Challenges
Apple encountered difficulties with the Apple Intelligence rollout in 2024. The iPhone 16, initially marketed as "Built for Apple Intelligence," launched without these features. While some functionalities were introduced over subsequent months, the more advanced Siri did not materialize as anticipated. Apple executives acknowledged a need to re-evaluate the strategy, which led to internal personnel adjustments.
Device Compatibility Expansion
Initially, Apple Intelligence required an A17 Pro chip or later, limiting availability at launch to the iPhone 15 Pro and iPhone 15 Pro Max. Because the rollout has been delayed, a broader range of devices will support Apple Intelligence by the time the new Siri ships: all iPhone 16 and iPhone 17 models, in addition to the iPhone 15 Pro series, bringing the number of supported iPhone models to 11. As a result, Apple Intelligence will be accessible as a free software update to a substantial portion of the iPhone customer base.
Market Performance
Consumer demand for iPhones remained robust during this period. IDC’s Q3 2025 report indicated strong demand for the iPhone 17 lineup, with pre-orders surpassing those of the previous generation. Counterpoint Research identified Apple as the global smartphone market leader in 2025, reporting 10% year-over-year growth. At the same time, Apple's marketing emphasized Apple Intelligence less prominently for the iPhone 17 than it had for the iPhone 16.
Siri Overhaul and iOS 27 Features
Standalone Siri Application
Apple is developing a standalone Siri application for iPhone, iPad, and Mac. The app will feature an interface similar to other chatbot applications, displaying past conversations in a grid or list format. Users will be able to engage with Siri via text or voice, favorite chats, search within conversations, initiate new chats, and save interactions. The conversation layout is expected to resemble iMessage with chat bubbles. New conversations will begin with suggested prompts to guide user interaction.
Enhanced Capabilities
The updated Siri will offer functionalities including:
- Searching the web with visually rich results.
- Generating images and content.
- Summarizing information.
- Analyzing uploaded documents and files.
- Using personal data to complete tasks.
- Ingesting information from emails, messages, and other files.
- Analyzing open windows and on-screen content for contextual actions.
- Controlling device features and settings.
- Searching for on-device content, replacing current Spotlight functionality.
Siri will also integrate into core Apple applications like Mail, Messages, Apple TV, Xcode, and Photos, enabling specific actions such as image searching, photo editing, coding assistance, media suggestions, and email composition. Siri Suggestions will expand with greater access to user data for more relevant prompts.
Multi-Request Processing
Apple is testing a new feature for Siri that would enable it to process multiple requests within a single query. For instance, users could combine checking the weather, creating a calendar appointment, and sending a message into one command. Siri is also expected to incorporate persistent context, allowing it to remember information from earlier interactions over a longer period.
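The multi-request flow described above can be sketched, purely as an illustration, as a naive splitter plus dispatcher. Every name here is an assumption invented for this sketch; nothing below reflects Apple's actual implementation or APIs.

```python
# Illustrative sketch only: split one compound command into sub-requests
# and route each to a keyword-matched handler. Handler names, the splitting
# heuristic, and the output format are all assumptions for demonstration.

def split_requests(query: str) -> list[str]:
    """Naively split a compound query on common conjunctions."""
    parts = []
    for chunk in query.replace(", then ", " and ").split(" and "):
        chunk = chunk.strip().rstrip(".")
        if chunk:
            parts.append(chunk)
    return parts

# Hypothetical handlers keyed by a trigger word found in the sub-request.
HANDLERS = {
    "weather": lambda r: f"[weather] {r}",
    "calendar": lambda r: f"[calendar] {r}",
    "message": lambda r: f"[message] {r}",
}

def dispatch(request: str) -> str:
    """Route a sub-request to the first handler whose keyword matches."""
    for keyword, handler in HANDLERS.items():
        if keyword in request.lower():
            return handler(request)
    return f"[fallback] {request}"

def handle_query(query: str) -> list[str]:
    """Process every sub-request contained in a single compound query."""
    return [dispatch(r) for r in split_requests(query)]
```

For example, `handle_query("check the weather and create a calendar appointment and send a message to Sam")` yields one handled result per sub-request. A production system would use an intent model rather than keyword matching; the sketch only shows the one-query, many-actions shape the reports describe.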
User Interface Updates
The chatbot version of Siri will feature a new visual interface. Activating Siri will display a new animation prompting users for input. Apple is reportedly testing an integration of Siri within the Dynamic Island, where a glowing Siri icon and "searching" label would appear during processing, expanding into a translucent panel for results. A pull-down gesture would initiate a conversation interface.
Additional UI changes may include an "Ask Siri" button in app menus to send content directly to Siri with requests, and a "Write with Siri" option within the iOS keyboard to access writing tools.
Camera Integration
Apple plans to add a new Siri Camera Mode alongside the standard photo and video options in the Camera app in iOS 27. The Visual Intelligence feature, currently accessible via the Camera Control button or Control Center, will be integrated into the Camera app to improve discoverability. The Siri mode is described as an enhanced version of Visual Intelligence, with a redesigned shutter button styled after the Apple Intelligence logo. Additional iOS 27 capabilities will include scanning a nutrition label on food packaging to log dietary information and adding contact details by scanning contact information.
Third-Party AI Integration
Extensions System
"Extensions allow agents from installed apps to work with Siri, the Siri app and other features on your devices."
Apple plans to introduce a feature called "Extensions" in iOS 27, iPadOS 27, and macOS 27, allowing users to select which third-party AI services power Siri and other Apple Intelligence features. Users will be able to choose from third-party AI models via App Store apps. The selected models will integrate with Siri, Writing Tools, Image Playground, and other Apple Intelligence features.
Companies such as Google and Anthropic could add support for their respective AI models (Gemini and Claude) through this system. Bloomberg reports that models from Google and Anthropic are being tested. Internal, pre-release versions of iOS 27 reportedly contain text in the Settings app stating the quote above. The App Store will feature a specific "Extensions" section, described as a marketplace for third-party AI integrations.
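The selection mechanism described above resembles a pluggable provider registry: several backends register under a name, and the user's chosen one handles requests. The sketch below is a generic illustration of that pattern under stated assumptions; the class, provider names, and interface are invented for this example and are not Apple's Extensions API.

```python
# Illustrative sketch of a pluggable model-provider registry, loosely
# mirroring the reported "Extensions" selection mechanism. All names and
# the interface are assumptions for demonstration purposes.

from typing import Callable

class ExtensionRegistry:
    """Maps provider names to completion callables; one provider is active."""

    def __init__(self) -> None:
        self._providers: dict[str, Callable[[str], str]] = {}
        self._active: str | None = None

    def register(self, name: str, complete: Callable[[str], str]) -> None:
        """Add a provider; the first one registered becomes the default."""
        self._providers[name] = complete
        if self._active is None:
            self._active = name

    def select(self, name: str) -> None:
        """Switch the active provider, as a user might in Settings."""
        if name not in self._providers:
            raise KeyError(f"no such extension: {name}")
        self._active = name

    def complete(self, prompt: str) -> str:
        """Forward a prompt to whichever provider is currently active."""
        if self._active is None:
            raise RuntimeError("no extension registered")
        return self._providers[self._active](prompt)

# Stand-in providers; real ones would call the respective model services.
registry = ExtensionRegistry()
registry.register("gemini", lambda p: f"gemini says: {p}")
registry.register("claude", lambda p: f"claude says: {p}")
registry.select("claude")
```

The design point this illustrates is that system features (Siri, Writing Tools, and so on) would talk only to the registry interface, so swapping the backing model requires no change to the features themselves.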
Integration with Third-Party Chatbots
Siri will connect with multiple third-party AI chatbots, including Google's Gemini, Anthropic's Claude, and OpenAI's ChatGPT, via the Extensions system. Users will be able to specify which chatbot to use for additional information and services.
Memory and Privacy
Apple is discussing the extent of conversational memory for the Siri chatbot. Memory retention may be limited to enhance user privacy.
Timeline and Availability
Announcement Schedule
Apple is expected to announce a new version of Siri in the second half of February, according to Bloomberg's Mark Gurman. This updated Siri will showcase capabilities resulting from Apple's partnership with Google, featuring demonstrations of Gemini-powered functionalities.
A more comprehensive reveal of the new Siri, internally codenamed Campos, is anticipated at Apple's annual developer conference (WWDC 2026), which begins on June 8. Apple is expected to unveil its upcoming software updates, including potential "AI advancements" related to Siri, at this event.
Software Release Timeline
The new Siri is expected to be integrated into iOS 26.4, with beta testing scheduled for February, leading to a public release in March or early April. The latest Siri and associated Gemini-powered Apple Intelligence features are projected to arrive with iOS 27, iPadOS 27, and macOS 27, with beta releases expected in the summer and a broad release to users in the fall (September).