The news of Apple’s pending deal with Google says a lot. While neither Apple nor Google will likely ever publicly acknowledge it outside of necessary financial disclosures, Apple fans should take note.
It’s a very good thing for users. It’s also very concerning. We should be equally pleased and worried.
The Deal
The deal, as it has been reported (Apple has not officially acknowledged it), will see Apple paying Google about $1B a year to use a customized version of its Gemini AI model for the new Siri, which should be released to users in the spring.
The model is big and advanced, with a reported 1.2 trillion parameters, and will run on Apple’s own Private Cloud Compute servers so neither Google nor anyone else gets to scoop up your data. The partnership seemingly was struck after Apple evaluated Google’s AI, along with others from Anthropic and OpenAI, against its own internally developed LLM technology.
What it means for users
As a user, this is all pretty good news. The latest versions of Gemini are among the top LLMs in the industry—benchmarks vary, and Apple’s version might not be the same as Gemini 2.5 Pro, but it’s clear that Apple isn’t going with a second-rate model here.
Of course, talking to the new Siri won’t be just like talking to Google’s Gemini. For one, the voices will sound different, but they’ll also have different priorities and tuning, and Siri will have access to the private data stored on your phone. You could think of it as two completely different cars that have the same engine but different options and chassis.

Apple will use Gemini’s LLM as a foundation for the new Siri, but the two assistants will still be very different.
The fact that Apple was willing to break out the checkbook to use a core technology from another company for one of its most important (and oft-maligned) features speaks volumes about a change of mindset in Cupertino. When Apple needs new core technology, it usually builds it or buys a company that already has it (often both).
That Apple is willing to step away from its homegrown mentality to deliver a new Siri that doesn’t disappoint is worthy of applause.
Not Invented Here
But Apple fans should also be wary. I’m generally critical of Apple’s “Not Invented Here” ethos, where it seemingly needs to own or build everything, whether it’s good for its users or not. There are lots of examples of that working out well (it took a long time, but Apple’s cellular modems and N1 networking chips deliver an experience at least as good as the Qualcomm and Broadcom parts did), but there are instances where Apple’s stubborn reliance on in-house tech didn’t make sense.
For example, when OpenGL outlived its usefulness as a graphics API, Apple could have moved to the open Vulkan standard that replaced it, helping shape its future. Instead, it developed its own graphics API, Metal, and I’m not convinced that it was better for developers or users. I don’t think Apple needed its own lossless audio format.
Perhaps most notably, Google pays Apple some $20 billion a year to be the default search engine in Safari. And yes, there are other search engine options, but we all know almost nobody strays from the default, which is why it’s worth so much to Google.

The Apple Maps rollout was bumpy at best.
That hasn’t been good for users. Google has been steadily degrading its search results experience while using the data from all those searches to consolidate its control over search and web advertising. If there’s anything Apple should have invested in years ago, it’s building its own privacy-minded, ad-free web search.
And we all know about the Apple Maps fiasco. The company’s attempt to stop relying on a third-party mapping service resulted in a terrible product, ironically because that product was built with a mishmash of data Apple didn’t own or control. It took years for Apple to build a Maps experience using all its own data, but now that it has, the experience is top-tier.
Apple needs its own LLM
So it’s clear that some core technologies Apple needs to build for itself and have total control over, while others it can and probably should find outside solutions for. A foundation AI large language model is definitely in the former category.
As the years roll on, AI models are going to be part of so much more than chatbots. They’re already all over Apple’s products, from cameras recognizing your gestures to image editing to notification summaries and more. But the big foundational LLM that interacts with users and does everything from controlling our devices to gathering information about the world? That’s the most important AI model in the stack.
Apple having its own top-tier LLM is as important as Apple controlling any other major piece of its technology stack. It’s arguably going to be more important than Apple having its own web browser.
It’s great news that Apple recognized that its own internally developed LLM isn’t good enough right now, and that it’s willing to go to Google to solve the problem. But in the end, Apple desperately needs to catch up to or surpass the technology it is buying. And it’s not clear that it can, as competitors’ LLMs continue to improve and Apple seemingly loses AI talent every week.
As a user, you should be glad about the Apple-Google-Gemini-Siri AI deal. As long as it doesn’t last.