Apple on Monday released iOS 18.1 Beta 1 alongside the “standard” iOS 18 beta track, marking the first beta of Apple Intelligence. Monday’s beta does not include three of perhaps the biggest features coming to iOS, presumably next spring: the new App Intents-powered Siri with on-screen and in-app processing, Image Playground and Genmoji, and the ChatGPT integration. I’d reckon ChatGPT ships in iOS 18.1 before Thanksgiving, going into beta sometime in August, but the new Siri and image generation capabilities clearly need work and will probably arrive in a later release that enters beta in January.

Regardless, the iOS 18.1 beta, in its current state, has most of the Apple Intelligence features demonstrated during the Worldwide Developers Conference: the new Siri design, Writing Tools, the “Reduce Interruptions” Focus mode, call summaries and recording in Notes, article summaries in Safari, and semantic search in Photos, amongst much, much more. It is clearly half-baked and buggy, though, and it isn’t even clear whether all the models Apple has produced are available yet — Apple Intelligence takes up only 2.86 gigabytes of storage on iOS and 5.06 GB on macOS. It’s a developer beta, and it certainly isn’t ready for prime time; I don’t think I have a use for any of it yet.

One such beta limitation, and perhaps the biggest disappointment, is Writing Tools. Apple said at WWDC that it would only work in system-native text fields, but that is rather constricting, especially on the Mac, where most writing-specific AppKit apps — like MarsEdit, Tot, and Craft — use custom fields. Somewhat unsurprisingly, the best experience is in select Apple-made apps, like Notes and TextEdit, where a bar appears at the top of the screen showing changes the system makes when using the Proofread feature, similar to a diff program like Kaleidoscope. This diff view seems to work only in certain apps, since I wasn’t able to replicate it anywhere else, including in some of Apple’s own apps like Mail and Pages. In those apps on the Mac, the rewritten text is just shown in a pop-out window with options to copy or replace. On iOS, the system automatically underlines text it has modified, and the suggestions can be accepted or dismissed. I assume this inconsistent availability is a bug and hope it’s fixed in the future.

Suggestions on iOS and in certain Mac apps also explain why the system elected to rewrite the text that way. The explanations are typically only a sentence long but are specific to the context in which they appear. In other words, they aren’t canned responses but are tailored to the specific change, and they’re a small-yet-unique example of Apple generating text, not just modifying it.

Another element of text generation is found in Messages and Mail, when someone asks a question in an iMessage or email thread. There, similar to Gmail, Apple Intelligence provides generated responses tailored to the question — they aren’t canned, either. I’ve found these a bit too formal and verbose for my liking — you wouldn’t say “I think that’s a good idea” to a close friend — and there isn’t a way to switch the tone, but I’ve already used them with friends and family who understand I’m testing an AI feature. (They were amused.) For instance, when “OK” would do just fine, Apple recommended “Sure, that’ll work” instead. It’s not that the suggestions are wrong; it’s just not how any person would talk. Apple does add commas for grammatical accuracy, but it does not append periods to text messages — though it does for emails — and it even learns from someone’s texting style, including capitalization and some word choices, which, again, is an example of fine-tuning the model for specific tasks.

Back to Writing Tools: The “toolbox” is found by selecting and Control-clicking any text in a supported app and clicking Show Writing Tools from the context menu. On iOS, just select text and choose Writing Tools from the pop-up menu. In apps where it does function, it proofreads excellently, and its summarizations are remarkable — much better than ChatGPT’s. It doesn’t generate text, obviously, but it edits it well. It does take a while to chug through large amounts of prose, though, and there isn’t a loading indicator to tell users when it is computing, something I assume will be fixed in a later build. For instance, my hands-on first impressions of Apple’s newest operating systems came in at 16,139 words, and Apple Intelligence on my M3 Max MacBook Pro took about two minutes to proofread the piece in TextEdit. Once it did, it automatically saved its changes to the document, which is weird, but they could all be reverted with one click.

On iOS, Writing Tools and the app it’s working in must be in focus; it is impossible to leave the app while the sheet is open. But on the Mac, where people are more likely to deal with long text and also aren’t constrained by battery life and relatively low-power systems-on-a-chip, other apps can be open while Writing Tools is modifying text or while a summary is in progress in Notes or some other application. (Writing Tools is the most computationally demanding feature for now, anyway.) Looking at iStat Menus, an app that displays real-time system utilization information, compute and graphics usage remained steady, but memory usage spiked, presumably because the models were loaded into memory for the duration of the task. Activity Monitor attributes that memory to the app itself, so when I was using Writing Tools in TextEdit, Activity Monitor said “TextEdit” was using 3 GB of RAM.
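This per-process attribution is easy to reproduce from a terminal. As a minimal sketch, `ps` reports a process’s resident set size (RSS), which roughly corresponds to Activity Monitor’s per-app memory figure; the current shell’s PID (`$$`) stands in here for TextEdit’s, which you would look up with `pgrep -x TextEdit` on a Mac:

```shell
# Print the resident memory (RSS) of a process, in kilobytes.
# $$ is the current shell's PID, used as a stand-in for TextEdit's;
# on a Mac, substitute "$(pgrep -x TextEdit)" while Writing Tools runs.
rss_kb=$(ps -o rss= -p $$ | tr -d ' ')
echo "RSS: ${rss_kb} KB"
```

Sampling this before and after invoking Writing Tools would show the same multi-gigabyte jump iStat Menus and Activity Monitor report while the models are resident.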

Apple Intelligence seems to rely on the Neural Engine for most tasks it does on-device, and only when it needs to does it offload the work to the cloud via Private Cloud Compute. I threw thousands of words at it and watched for any data being sent to the cloud, but it seemingly wasn’t. It might have been had I tried on iOS, where the Neural Engine is less powerful, or it might be that Private Cloud Compute isn’t available for testing yet. If Private Cloud Compute were used, I don’t think the model would be loaded into my Mac’s memory, and I’d also probably observe some kind of network activity. Either way, the cloud is only used when it is absolutely necessary.

Mail’s summaries work impeccably well. I had an email come in about an order being delayed, and instead of showing me the first line of the email under the subject as any other email app would, it summarized it: “Delivery time updated today, waive fee if order delivered after 4:34.” It also knew an order delay was important, so it placed that at the top of the inbox, labeled with a “Priority” heading. It doesn’t work with all emails yet, just like how Safari’s article summarization is picky about what websites it’ll touch, but it performs the best with auto-written status updates. (I wouldn’t want it to summarize a newsletter, for example.)

Safari’s summaries have been brought to all webpages, though they’re no longer automatically generated within Safari; they have to be manually created by entering Safari Reader and choosing the Summary option at the top, which I find inconvenient as someone who only rarely uses Reader. The important thing, though, is that they work on every website and appear consistently — there isn’t a way for a site owner to disable them, even if Applebot-Extended, Apple’s web scraper, is disallowed. (The summaries are created client-side.) The blurbs are generated very quickly and are astonishingly accurate, though I find them best suited for shorter articles rather than long ones with lots of intricate information, or how-tos, whose step-by-step guides the artificial intelligence doesn’t even seem to want to sum up.

Notification summaries also work amazingly well. They prioritize key bits of information and aren’t long, which clearly indicates a kind of fine-tuning other large language models lack. Other AI tools usually begin their summaries with “This text message reads…” or something similar, but Apple Intelligence gets right to the point: “Delivery arrival, on July 29, at 4:40 p.m.” That’s all anyone needs, and it’s much better than showing the first few lines of a text message a robot sent. It’s much less inclined to summarize human communication, which is a good thing, because there is a much stronger likelihood it would fail to understand the nuances of person-to-person conversation. Also, people like reading text messages from other people, but status updates can be filed away and deleted.

The Reduce Interruptions Focus mode acts like any other Focus in the sense that it allows people to choose specific contacts and apps that should always be allowed through, while the rest are subject to the Intelligent Breakthrough feature, which discerns which specific messages are critical enough to warrant a disruption. As weird as this comparison sounds, it almost reminds me of the Adaptive Audio feature on AirPods Pro, which lives between Transparency Mode and noise cancellation, permitting some sounds, like human speech, while silencing loud external noises. Reduce Interruptions does the same, peering into the contents of notifications rather than just looking at where they came from. When priority notifications do come in, they’re supplemented with a badge that says “Maybe Important,” and some notifications are even summarized. This is my new favorite Focus mode because it alleviates the stress of a blanket ban on everyone but a few select contacts and apps. If there’s a contact I don’t necessarily communicate with often but who needs something urgently, they should be able to come through.

In my few hours working with it enabled, I haven’t gotten a single bad notification. Text messages came through fine, I got some updates on an order summarized by Apple Intelligence so I wasn’t distracted by them, and my unimportant apps didn’t bother me. I’ve never had a “work” Focus mode because I would just end up letting everything in out of paranoia, but now, thanks to Apple Intelligence, I’ll be using this one when I need to get away from constant pings. I was worried about how it would function under real-world scenarios, but spending a few hours with it proved my worries unnecessary. It’s a fantastic feature and perfectly ties in with Apple Intelligence’s summary chops.

Some other tidbits I’ve noticed:

  • To activate Type to Siri on the Mac, press Globe-S anywhere in the operating system or double-press either Command key. It also works on iOS by double-tapping the bottom of the screen, though an early beta bug requires restarting the device after the update is installed for it to work. Siri, for now, works the same but understands me a lot better.

  • When recording a call, Apple says to “respect the preferences of the person you’re calling” and plays an audio notification announcing that the call is being recorded. Recordings are started in the Phone app, and the transcripts are then saved to Notes.

  • Any text in any app that supports Writing Tools — regardless of whether that text is editable — can be summarized and proofread by Apple Intelligence just by Control-clicking. Unfortunately, there is no keyboard shortcut to access Writing Tools; it is only accessible via the menu bar or the text selection menu.


The biggest source of confusion online has been the waiting list, which is present even in this first Apple Intelligence beta. Apple Intelligence is opt-in when first downloaded, as it will be for everyone when it ships later this year. To find it, go to Settings → Siri and Apple Intelligence, which now has a new icon. The top item is an option to join a waiting list to use Apple Intelligence, and the system says it will notify the user when it becomes available to them. It took me about five minutes to be let in, but I surmise that’s because there is no actual list — at least while Apple Intelligence is in beta — and that it’s just a demonstration to test the functionality of the waitlist.

Either way, the waitlist exists to handle demand, presumably for Private Cloud Compute. If I had to bet, I’d say Apple will eliminate the list eventually, as soon as it knows how many people are interested and can begin to build out its server infrastructure, but for now, I think it makes sense to have it in place, especially to gatekeep the ChatGPT features to prevent OpenAI’s Azure servers from being hammered, which Microsoft wouldn’t be very happy with.1 Once someone is let in, their Apple account is whitelisted, so they don’t have to sign up on every device they wish to use Apple Intelligence on.

Notably, the models don’t begin downloading until a user is permitted to use Apple Intelligence; once they’re off the waitlist, the models download in the background. Running the models takes quite a bit of computing power, though they use only about 5.5 watts on macOS, according to Max Weinbach, an analyst for Creative Strategies, a market research firm. In my testing, I noticed a peak of about 16 watts in iStat Menus while proofreading a long text, though I would also assume the models are more conservative on iOS and iPadOS.

As I said a few months ago, I’m very excited about this next chapter in Apple’s software history. There is a lot more work to be done in preparing the company’s infrastructure for the influx of new users, ironing out bugs, and expanding availability to more users, apps, and product categories. And, of course, launching the new large action model-like Siri with App Intents and the Semantic Index will be a big step toward ambient computing, where the computers do the computing and we do the creating.


An update was made on July 29, 2024, at 9:27 p.m.: I’ve since discovered iMessage has the same automatically generated replies as Mail. This article has been updated to add that information.

A correction was made on July 29, 2024, at 11:01 p.m.: Type to Siri works just fine on iOS — it just requires a restart of the device. After that, double-tapping at the bottom of the display works as it should. I regret the error.

A correction was made on July 30, 2024, at 3:06 a.m.: The diff-like experience in Writing Tools is limited, but not only to TextEdit. It’s also available in Notes.


  1. In all honesty, I wish Apple had never added the ChatGPT integration in the first place. I don’t think it’s necessary; it opens the company up to antitrust concerns, and I don’t miss any text generation features. Yes, I think text generation is important, but I also believe its best place is web search, such as with SearchGPT. Chatbots aren’t here to stay, whereas Apple Intelligence clearly is. Even OpenAI knows that, which is why it’s less focused on generating stories about cows on the moon and more on clinching crucial content deals with publishers to enhance SearchGPT, its Google competitor. ↩︎