In the iOS 26.4 update that's coming this spring, Apple will introduce a new version of Siri that's going to overhaul how we interact with the personal assistant and what it's able to do.


The iOS 26.4 version of ‌Siri‌ won't work like ChatGPT or Claude, but it will rely on large language models (LLMs) and has been rebuilt from the ground up.

Upgraded Architecture


The next-generation version of ‌Siri‌ will use advanced large language models, similar to those used by ChatGPT, Claude, and Gemini. Apple isn't implementing full chatbot interactions, but any upgrade is both better than what's available now and long overdue.

Right now, ‌Siri‌ uses machine learning, but it lacks the reasoning capabilities that LLMs provide. ‌Siri‌ relies on multiple task-specific models to complete a request, going from one step to the next. ‌Siri‌ has to determine the intent of a request, pull out relevant information (a time, an event, a name, etc.), and then use APIs or apps to complete the request. It's not an all-in-one system.

In iOS 26.4, ‌Siri‌ will have an LLM core that everything else is built around. Instead of just translating voice to text and looking for keywords to act on, ‌Siri‌ will actually understand the specifics of what a user is asking, and use reasoning to get it done.

LLM Improvements


‌Siri‌ today is usually fine for simple tasks like setting a timer or alarm, sending a text message, toggling a smart home device on or off, answering a simple question, or controlling a device function. Anything more complicated is beyond it: it can't complete multi-step tasks, it can't interpret wording that doesn't match the structure it expects, it has no personal context, and it doesn't support follow-up questions.

An LLM should solve most of those problems because ‌Siri‌ will have something akin to a brain. LLMs can understand the nuance of a request, suss out what it is someone actually wants, and take the steps to deliver that information or complete the requested action.

We already know some of what LLM ‌Siri‌ will be able to do because Apple described the Apple Intelligence features it wants to implement when iOS 18 debuted.

Promised Siri Apple Intelligence Features


Apple described three specific ways that ‌Siri‌ will improve: personal context, the ability to see what's on the screen to know what the user is talking about, and the capability to do more in and between apps.

‌Siri‌ will understand pronouns, references to content on the screen and in apps, and it will have a short-term memory for follow-up requests.

Personal Context


With personal context, ‌Siri‌ will be able to keep track of emails, messages, files, photos, and more, learning about you to help you complete tasks and find what you've been sent.

  • Show me the files Eric sent me last week.

  • Find the email where Eric mentioned ice skating.

  • Find the books that Eric recommended to me.

  • Where's the recipe that Eric sent me?

  • What's my passport number?


Onscreen Awareness


Onscreen awareness will let ‌Siri‌ see what's on your screen and complete actions involving whatever you're looking at. If someone texts you an address, for example, you can tell ‌Siri‌ to add it to their contact card. Or if you're looking at a photo and want to send it to someone, you can ask ‌Siri‌ to do it for you.

Deeper App Integration


Deeper app integration means that ‌Siri‌ will be able to do more in and across apps, performing actions and completing tasks that are just not possible with the personal assistant right now. We don't have a full picture of what ‌Siri‌ will be capable of, but Apple has provided a few examples of what to expect.

  • Move files from one app to another.

  • Edit a photo and then send it to someone.

  • Get directions home and share the ETA with Eric.

  • Send the email I drafted to Eric.


Bigger Than Promised Update


In an all-hands meeting in August 2025, Apple software engineering chief Craig Federighi explained the ‌Siri‌ debacle to employees. Apple had attempted to merge two separate systems, which didn't work out.

There was one system for handling current commands and another based on large language models, and the hybrid approach was not working due to the confines of the current ‌Siri‌ architecture. The only way forward was to upgrade to the second-generation architecture built around a large language model.

In the August meeting, Federighi said Apple had successfully revamped ‌Siri‌, and that Apple would be able to introduce a bigger upgrade than it promised in iOS 18.

"The work we've done on this end-to-end revamp of ‌Siri‌ has given us the results we needed," Federighi told employees. "This has put us in a position to not just deliver what we announced, but to deliver a much bigger upgrade than that we envisioned."

Adopting Google Gemini


Part of Apple's problem was that it was relying on AI models that it built in-house, and that were not able to match the capabilities of competitors. Apple started considering using a third-party model for ‌Siri‌ and other future AI features shortly after delaying ‌Siri‌, and in January, Apple announced a multi-year partnership with Google.

For the foreseeable future, Apple's AI features, including the more personalized version of ‌Siri‌, will use a custom model Apple built in collaboration with Google's Gemini team. Apple plans to continue work on its own in-house models, but for now, it will rely on Gemini for many public-facing features.

‌Siri‌ in iOS 26.4 will be more similar to Google Gemini than ‌Siri‌ today, though without full chatbot capabilities. Apple plans to continue to run some features on-device and use Private Cloud Compute to maintain privacy. Apple will keep personal data on-device, anonymize requests, and continue to allow AI features to be disabled.

What's Not Coming in iOS 26.4


‌Siri‌ is not going to work as a chatbot, so the updated version will not feature long-term memory or back-and-forth conversations. Apple also plans to stick with the same voice-based interface, with limited typing functionality.

Apple's Embarrassing Siri Delay


In what became an infamous move, Apple went all-in showing off a smarter, Apple Intelligence-powered version of ‌Siri‌ when it introduced iOS 18 at the 2024 Worldwide Developers Conference. Apple said these features would come in an update to iOS 18, but right around when launch was expected, Apple admitted that ‌Siri‌ wasn't ready and would be delayed until spring 2026.



Apple executives went on a press tour to explain the ‌Siri‌ shortcomings after WWDC 2025, promising bigger and better things for iOS 26, and explaining what went wrong. The ‌Apple Intelligence‌ ‌Siri‌ features we saw at WWDC 2024 were actually implemented and weren't faked, but ‌Siri‌ wasn't working as well as expected behind the scenes and Apple was dealing with quality issues.

Since Apple advertised the new ‌Siri‌ features with the iPhone 16, some people who bought the iPhone because of the new functionality were upset about the delay and sued. Apple was able to quietly settle the case in December 2025, so most of the ‌Siri‌ snafu has been resolved.

Internal Restructuring


The misstep with ‌Siri‌'s debut and the failure of the hybrid architecture led Apple to restructure its entire AI team. Apple AI chief John Giannandrea was removed from the ‌Siri‌ leadership team, with Vision Pro chief Mike Rockwell taking over instead.

Apple CEO Tim Cook was no longer confident in Giannandrea's ability to oversee product development, and Giannandrea is set to retire in spring 2026. Rockwell reports to Federighi, and Federighi told employees that the new leadership has "supercharged" ‌Siri‌ development. Federighi has apparently played an instrumental role in changing Apple's approach to AI, and he is making the decisions that will allow the company to catch up to rivals.

Apple has struggled with retaining AI employees amid the ‌Siri‌ issue and recruitment strategies from companies like Meta. Meta poached several key AI engineers from Apple, offering pay packages as high as $200 million. At Apple's August all-hands meeting, Cook and Federighi aimed to reassure employees that AI is critically important to the company. "There is no project people are taking more seriously," Federighi said of ‌Siri‌.

Cook said that Apple will "make the investment" to be a leader in AI.

iOS 26.4 Siri Launch Date


Apple has promised that the new version of ‌Siri‌ is coming in spring 2026, which is when we're expecting iOS 26.4. Testing on iOS 26.4 should begin in late February or early March, with a launch to follow around April.

LLM Siri Compatibility


The new version of ‌Siri‌ will presumably run on all devices that support ‌Apple Intelligence‌, though Apple hasn't explicitly provided details. Some new ‌Siri‌ capabilities may come to older devices as well.

iOS 27 Chatbot Upgrade


Apple plans to upgrade ‌Siri‌ even further in the iOS 27 update, turning ‌Siri‌ into a chatbot. ‌Siri‌ will work like Claude or ChatGPT, able to understand and engage in back-and-forth conversations.

Details about the ‌Siri‌ interface and how a chatbot version of ‌Siri‌ will work are still in short supply, but iOS 26.4 will be a stop on the path to a version of ‌Siri‌ able to actually function like products from Anthropic and OpenAI.
This article, "Why Apple's iOS 26.4 Siri Upgrade Will Be Bigger Than Originally Promised" first appeared on MacRumors.com
