Benjamin Mayo, reporting for 9to5Mac:

The first developer beta of iOS 18.2 is out now. The update brings the second wave of Apple Intelligence features for developers to try.

iOS 18.2 includes Apple’s image generation features like Genmoji and Image Playground, ChatGPT integration in Siri and Writing Tools, and more powerful Writing Tools with the addition of the ‘Describe your change’ text field. iPhone 16 owners can access Visual Intelligence via the Camera Control. The update also expands Apple Intelligence availability to more English-speaking locales, beyond just US English.

My thoughts on Apple Intelligence overall haven’t changed since June; my disdain for Image Playground and Genmoji persists. Writing Tools, as I wrote in July when the first round of Apple Intelligence features entered beta, are disappointing to me as a writer by trade, and I don’t use them for much of anything, especially since they’re unavailable in most third-party apps. (The latter qualm should be addressed, though, thanks to a new Writing Tools application programming interface, or API, that developers can integrate into their apps. I hope BBEdit, MarsEdit, Craft, and the other Mac apps I write in adopt it quickly.) I fiddled with Describe Your Change in Notes and TextEdit and found it useless — I write in my own style, and Apple Intelligence isn’t very good at emulating it. Meanwhile, the vanilla Writing Tools Proofread feature only makes small corrections — mainly regarding comma placement, much of which I disagree with — and even those are a rarity.
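For the curious, adopting the new API looks to be fairly low-effort for standard UIKit text views: a minimal sketch, assuming the iOS 18 SDK, where the `writingToolsBehavior` trait controls how deeply the system Writing Tools integrate with a text view. (The class name here is hypothetical; the property and its cases are Apple’s.)

```swift
import UIKit

// A minimal sketch of opting a text view into system Writing Tools.
// `writingToolsBehavior` is part of UITextInputTraits in the iOS 18 SDK.
final class NoteViewController: UIViewController {
    let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        // .complete allows Writing Tools to rewrite text in place;
        // .limited presents results in an overlay panel instead;
        // .none opts the view out entirely.
        textView.writingToolsBehavior = .complete
        view.addSubview(textView)
    }
}
```

Apps with custom text engines — BBEdit being the obvious example — presumably have more work to do, since they can’t inherit this behavior from a stock text view.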

System-wide ChatGPT integration is interesting, however. I’m unsure how much Writing Tools relies on it yet, but it’s heavily used in Siri. Even prefacing a query with “ask ChatGPT” will prompt OpenAI’s system. It’s not as good as ChatGPT’s voice mode, but it’s there, and most importantly, it’s free. Still, I signed into my paid account, though it’s unclear how many more messages signing in grants over the free tier. Once I signed in, I was greeted by a delightful toggle in Settings → Apple Intelligence → ChatGPT: Confirm ChatGPT Requests. I initially missed it because of how nondescript it appears, but I was quickly corrected on Threads, leading me to flip it off and silence the incessant “Would you like me to ask ChatGPT for that?” prompts whenever Siri can’t answer a question.

I’ve found Siri much better at delegating queries to ChatGPT — when the integration is turned on; it’s disabled by default — than I expected, which I like. I have Siri set not to speak aloud when I manually press and hold the Side Button, so it doesn’t narrate ChatGPT answers, but I’ve found it much better than the constant “Here’s what I found on the web for…” nonsense from the Siri of yore. Siri now rarely performs web searches; most of the time, it instead displays a featured snippet or passes the torch to ChatGPT for more complex questions. This is still not the contextually aware, truly Apple-Intelligent version of Siri, which will reportedly launch sometime in early 2025¹, but I’ve found the current one much more reliable for a large swath of questions. I’m unsure if it’ll replicate the photographer-friend scenario I wrote about a few weeks ago, but time answers all.

I wasn’t expecting to find ChatGPT anywhere else, but it was quietly added to Visual Intelligence, a feature exclusive to iPhone 16 models with Camera Control. (I quibbled in my review about how it wasn’t available at launch; it’s still unavailable to the general public and probably will be for a while.) Long-pressing Camera Control — versus single- or double-pressing it to open a camera app of choice — opens the new Visual Intelligence interface, which isn’t an app but a new system component. It doesn’t appear in the App Switcher, unlike Code Scanner or Magnifier, for instance. There are three buttons at the bottom of the screen, each pointing to a different service: the shutter, Ask, and Search. The shutter button seems to do nothing important other than take a photo, akin to Magnifier — once a photo is taken, the other two buttons become more prominent. (Text in the frame is also selectable, à la Live Text.) Ask seems to be a one-to-one port of GPT-4o’s multimodality: It analyzes the frame and generates a paragraph about it. After that, a follow-up conversation can be had with the chatbot, just like in ChatGPT. It’s shockingly convenient to have that built into iOS.

Search is perhaps the most interesting, as it’s a combination of Google Lens and Apple’s on-device lookup feature first introduced in iOS 15, albeit in a marginally nicer wrapper. It essentially negates Google’s own Google Lens component of its bespoke iOS app, so I wonder what strings Apple had to pull internally to get Google to agree. (Evidently, it’s using some kind of API, just like ChatGPT, because it doesn’t just launch a web view to Google Lens.) Either way, as Mark Gurman of Bloomberg writes on the social media website X, this feature has singlehandedly killed both the Rabbit R1 and Humane Ai Pin: it’s a $700 — err, $500 — value. I think it’s really neat, and I’m going to use it a ton, especially since it has ChatGPT integration.

As I said back in June, I generally favor Apple Intelligence, and this version of iOS and macOS feels more intelligent to the nth degree. Siri is better, Visual Intelligence is awesome, and I’m sure Genmoji is going to be a hit, much to my chagrin. The only catch is Image Playground, which (a) looks heinous and (b) is quite sensitive to prompts. Take this benign example: I asked it to generate an image of “an eagle with an American flag draped around it” — because I’m American — and it refused. At first, I was truly perplexed, but then it hit me that it probably won’t generate images related to nationalities or flags so as to steer clear of political messaging. (The last thing Apple wants is for some person on X to get Image Playground to generate an image of someone shooting up the flag of Israel or whatever.) Whatever the case, some clever internet Samaritans have already gotten it to generate former President Donald Trump and an eggplant in a person’s mouth.


  1. My prediction still stands: iOS 18.1 will ship by next week, iOS 18.2 by mid-January, and iOS 18.3 Beta 1 sometime around then with a full release coming by March. That release would complete the Apple Intelligence rollout — finally. ↩︎