Gurman: LLM-Powered Siri Slated for April 2026 Release
Mark Gurman, reporting for Bloomberg:
Apple Inc. is racing to develop a more conversational version of its Siri digital assistant, aiming to catch up with OpenAI’s ChatGPT and other voice services, according to people with knowledge of the matter.
The new Siri, details of which haven’t been reported, uses more advanced large language models, or LLMs, to allow for back-and-forth conversations, said the people, who asked not to be identified because the effort hasn’t been announced. The system also can handle more sophisticated requests in a quicker fashion, they said…
The new voice assistant, which will eventually be added to Apple Intelligence, is dubbed “LLM Siri” by those working on it. LLMs — a building block of generative AI — gorge on massive amounts of data in order to identify patterns and answer questions.
Apple has been testing the upgraded software on iPhones, iPads, and Macs as a separate app, but the technology will ultimately replace the Siri interface that users rely on today. The company is planning to announce the overhaul as soon as 2025 as part of the upcoming iOS 19 and macOS 16 software updates, which are internally named Luck and Cheer, the people said.
To summarize the report: Siri will gain what ChatGPT had in fall 2023 — a conversational, LLM-powered voice experience. People, including me, initially compared it to ChatGPT’s launch in November 2022, but that isn’t an apples-to-apples comparison, since ChatGPT didn’t ship a voice mode until a year later. Either way, Apple is effectively two and a half years late, and by the time this conversational Siri ships, presumably as part of next year’s Apple Intelligence updates, ChatGPT 5 will probably be old news. ChatGPT’s voice mode, right now, can search the internet and deliver responses in near real-time, and I’ve been using it for all my general knowledge questions. It’s even easy to access with a shortcut — how I do it — or a Lock Screen or Control Center control.
Meanwhile, the beta version of Siri that relies on ChatGPT is also competitive, although it’s harder to use because most of the time, Siri tries to answer by itself. Queries must be prefaced with “Ask ChatGPT” — and at that point, it’d be faster to tap one button and launch ChatGPT’s own app — and the ChatGPT feature isn’t conversational. The other day, I asked, “Where is DeepSeek from?” and Siri answered the question by itself. I then followed up with, “Who is it made by?” and Siri went to ChatGPT for an answer but came back with, “I don’t know what you’re referring to by ‘it.’ Could you provide the name of the product or service you’re wondering about?” Clearly, the iOS 18.2 version of Siri is way too confident in its own answers and also doesn’t know how to prompt ChatGPT effectively. The best voice assistant on the iPhone is the ChatGPT voice mode via a shortcut or Lock Screen control.
Personally, I think Apple should just stop building conversational LLMs of its own. It’s never going to be good at them, as evidenced by the fact that Siri’s ChatGPT integration is so haphazard that it can’t even ask basic questions. A few weeks ago, when Vice President Kamala Harris was scheduled to be on “Saturday Night Live,” I asked Siri when the show would begin. Siri responded by telling me when “SNL” first began airing: October 11, 1975. I had to rephrase my question as “Ask ChatGPT when ‘SNL’ is on tonight,” and only then did it use ChatGPT to give me a real-time answer, including sources at the bottom. Other times, Siri was good at handing off queries to ChatGPT, but it really should be much more liberal about doing so — I should never have to prefix “Ask ChatGPT” to any of my questions. The point is, if Apple really wanted to build a conversational version of Siri, it could use its (free) partner, ChatGPT, or even work with OpenAI to build a custom version of GPT-4o just for Siri. OpenAI is eager to make money, and Apple could easily build a competitive version of Siri by the end of the year with the tools it’s shipping in the iOS beta right now.
I’ll say it now, and if it ages poorly, so be it: Apple’s LLMs will never be half as good as even the worst offerings from Google or OpenAI. What I’ve learned from using Apple Intelligence over the past few months is that Apple is not a talented machine learning company. It’s barely adequate. Apple Intelligence notification summaries are genuinely terrible at reading tone and understanding the nuances of human communication — they make for funny social media posts, but they’re just not that useful. I now have them turned off for most apps since I don’t trust them to summarize news alerts or weather notifications — they’re really only useful for email and text messages. And about that: I read most of my email in Mimestream, which can’t take advantage of Apple Intelligence even if it wanted to, because there aren’t any open application programming interfaces for developers to bring Apple Intelligence to their apps. Visual Intelligence is lackluster, Writing Tools are less advanced than ChatGPT and aren’t available in many apps on the Mac, and don’t even get me started on Genmoji, which is almost too kneecapped to do anything useful.
Apple Intelligence, for now, is a failure. That could change come spring 2025, when Apple is rumored to complete the rollout, but who knows how much ChatGPT will improve in the next six months. It isn’t just that April 2026 is too late for an LLM-powered Siri — it’s that it won’t be any good. Apple doesn’t have a proven track record in artificial intelligence, and it’s struggling to build one.