Gurman: Siri Is Becoming an AI Chatbot in iOS, macOS 27
Mark Gurman, back from his holiday break, reporting for Bloomberg:
Apple Inc. plans to revamp Siri later this year by turning the digital assistant into the company’s first artificial intelligence chatbot, thrusting the iPhone maker into a generative AI race dominated by OpenAI and Google.
The chatbot — code-named Campos — will be embedded deeply into the iPhone, iPad, and Mac operating systems and replace the current Siri interface, according to people familiar with the plan. Users will be able to summon the new service the same way they open Siri now, by speaking the “Siri” command or holding down the side button on their iPhone or iPad.
The new approach will go well beyond the abilities of the current Siri — or even a long-promised update that’s coming earlier in 2026. Today’s Siri lacks a chat-like feel and the back-and-forth conversational abilities of OpenAI’s ChatGPT or Google’s Gemini.
This is next year’s release, which will come in addition to the “more personalized Siri” first demonstrated at the 2024 Worldwide Developers Conference. The “more personalized Siri” includes the new AI-powered web search engine (dubbed “World Knowledge Answers”), App Intents support, and personal context. As Gurman says, we still expect that “long-promised update” to arrive in a few months with support for these features. In iOS 27, meanwhile, Gurman reports that Siri will transform into an AI chatbot to compete mainly with ChatGPT.
This is a notable shift in strategy for Apple. John Giannandrea, Apple’s previous machine learning chief, discounted AI chatbots, believing they weren’t worth spending time on. This is more or less the strategy Apple stuck with for the last three years, culminating in the now-infamous “more personalized Siri,” which has turned into an embarrassment for the company. With Giannandrea out, Craig Federighi, Apple’s software engineering chief, is taking over, along with Mike Rockwell, the former head of the Apple Vision Pro project who now works on Siri. These two clearly share a vision for Apple’s future AI endeavors: leave the expensive pre-training to Google and focus purely on building user-facing chatbot software. I don’t think that’s inherently bad, despite my belief that chatbots aren’t the end game. A chatbot built into iOS and macOS is, at this point, common sense. It wasn’t in 2024. This is simply the way the tech industry is headed for the foreseeable future.
The 2026 revamp of Siri lays the groundwork for this chatbot because it merges the two Siri architectures: the older, offline version of Siri and the new, large language model-powered one. In iOS 27, Gurman reports, Apple will use a more powerful, Gemini 3-class LLM to power Siri, and combined with new post-training and fine-tuning, Siri will take on the personality of a chatbot. Gurman writes:
Like ChatGPT and Google Gemini, Apple’s chatbot will allow users to search the web for information, create content, generate images, summarize information, and analyze uploaded files. It also will draw on personal data to complete tasks, being able to more easily locate specific files, songs, calendar events, and text messages…
Campos may let Apple jettison its Spotlight function as well…
The iOS 26.4 update of Siri, the one before the true chatbot, will rely on a Google-developed system internally known as Apple Foundation Models version 10. That software will operate at 1.2 trillion parameters, a measure of AI complexity.
Campos, however, will significantly surpass those capabilities. The chatbot will run a higher-end version of the custom Google model, comparable to Gemini 3, that’s known internally as Apple Foundation Models version 11.
I find it unlikely that Apple will “jettison” Spotlight; more likely, it will combine Siri and Spotlight. Currently, people can type a URL into Spotlight and it’ll navigate to that URL in Safari, and I highly doubt that functionality will disappear. Similarly, I’m doubtful Apple would let an LLM take over file search, knowing the non-deterministic nature of AI chatbots. What’s more likely is that if a user types a specific question into Spotlight, it’ll be rerouted to Siri, which will then take over the request in chatbot form. Gurman doesn’t attribute this detail to any sources (there’s no “the people said”), but what I think he’s getting at is that the new Siri, much like the current Siri, won’t be a separate experience like ChatGPT. Perhaps that’ll be one of its selling points. I’d imagine a chatbot that knows everything about a user and is deeply integrated into iOS and macOS would have a distinct advantage over ChatGPT and Gemini, so long as it’s equally powerful (which Gurman insinuates it will be).
In a potential policy shift for Apple, the two partners are discussing hosting the chatbot directly on Google servers running powerful chips known as TPUs, or tensor processing units. The more immediate Siri update, in contrast, will operate on Apple’s own Private Cloud Compute servers, which rely on high-end Mac chips for processing.
Apple wouldn’t squander the last remaining competitive advantage it has in the AI space: its silicon. Apple silicon, especially the new M5 processor, is fantastic at inference. I contend that Apple is not sufficiently equipped to pre-train its own models, in no small part thanks to Giannandrea’s inaction, but Private Cloud Compute is innovative, private, and jibes well with Apple’s stated mission of becoming carbon neutral by 2030. If Apple were to hand inference off to Google, it would have to disclose that to users in its privacy policy, undermining its privacy-centric approach to AI, and it would kick the can down the road on carbon neutrality. I find both outcomes unlikely. Private Cloud Compute is so far ahead of the industry that it took Google over a year to begin rolling out a similar technology.