Thoughts on Apple’s ‘It’s Glowtime’ Event
An hour-and-a-half of vaporware — and the odd delight
Apple’s “It’s Glowtime” event on Monday, which the company held from its Cupertino, California, headquarters, was a head-scratcher of a showcase.
For weeks, I had been anticipating that Monday would be an iterative rehashing of the Worldwide Developers Conference. Tens of millions of people watch the iPhone event because it is the unveiling of the next generation of Apple’s one true product, the device that skyrocketed Cupertino to fame 17 years ago. On iPhone day, the world stops. U.S. politics, even in an election year, practically comes to a standstill. Wall Street peers through its television screens straight to Apple Park. A monumental antitrust trial accusing Google of maintaining its second illegal monopoly of the year is buried under the hundreds of Apple-related headlines on Techmeme. When Apple announces the next iPhone, everyone is watching. Thus, when Apple has something big to say, it always says it on iPhone day.
Ten years ago, on September 9, 2014, Apple unveiled the Apple Watch, its foray into the smartwatch market, alongside the iPhone 6 and 6 Plus, the best-selling smartphones in the world. Yet it was the Apple Watch that took center stage that Tuesday, an intentional marketing choice to give the Apple Watch a head start — a kick out the door. Apple has two hours to show the world everything it wants to, and it takes advantage of its allotment well. Each year, it tells a story during the iPhone event. One year, it was a story of courage: Apple was removing the headphone jack. The next, it was true innovation: an all-screen iPhone. In 2020, it was 5G. In 2022, it was the Dynamic Island. This year, it was Apple Intelligence, Apple’s yet-to-be-released suite of artificial intelligence features. The tagline hearkens back to the Macintosh from 1984: “AI for the rest of us.” Just that slogan alone says everything one needs to know about Apple Intelligence and how Apple thinks of it.
Before Monday, only two iPhones supported Apple Intelligence: iPhone 15 Pro and iPhone 15 Pro Max. That is not enough for Apple Intelligence to go mainstream and appeal to the masses; it must be available on a low-end iPhone. For that reason, Monday’s event was expected to be the true unveiling of Apple’s AI system. The geeks, nerds, and investors around the globe already know about Apple Intelligence, but the customers don’t. They’ve seen flashy advertisements on television for Google Gemini during the Olympic Games and Microsoft Copilot during the Super Bowl, but they haven’t seen Apple’s features. They haven’t seen AI for the rest of us. And why should they? Apple wasn’t going to recommend people buy a nearly year-old phone for a feature suite still in beta. Thus, the new iPhone 16 and iPhone 16 Pro: two models built for Apple Intelligence from the ground up. Faster neural engines, 8 gigabytes of memory, and most importantly, advertising appeal. New colors, a new flashy Camera Control, and a redesign of the low-end model. These factors drive sales.
It’s best to think of Monday’s event not as a typical iPhone event, because, really, the event was never about the smartphones themselves; it was about Apple Intelligence — the new phones simply serve as a catalyst for the flashy advertisements Apple is surely about to air on Thursday Night Football games across the United States. Along the way, it announced new AirPods, because why not — they’re so successful — and a minor Apple Watch redesign to commemorate the 10th anniversary of Apple’s biggest product since the iPhone. By themselves, the new iPhones are just new iPhones: boring, predictable, S-year phones. They have the usual camera upgrades, one new glamorous feature — the Camera Control — and new processors. They’re unremarkable from every angle, yet they are potentially the most important iPhones Apple launches this decade for a software suite that won’t even arrive in consumers’ hands until October. People who watched Apple’s event on Monday are buying a promise, a promise of vaporware eventually turning into a real product. Whether Apple can keep that promise is debatable.
AirPods
Tim Cook, Apple’s chief executive, left nothing about the event’s announcements to guesswork: within the first minute, he revealed the event would be about AirPods, the Apple Watch, and the iPhone — a perfect trifecta of Apple’s most valuable personal technology products. The original AirPods received an update just as the rumors foretold, bringing the H2 processor from the AirPods Pro 2, a refined shape to accommodate more ear shapes and sizes, and other machine-learning features like Personalized Spatial Audio and head gestures previously restricted to the premium version. All in all, for $130, they’re a great upgrade to the first line of AirPods, and I think they’re priced well. AirPods 4: nothing more, nothing less.
However, the more intriguing model is the eloquently named AirPods 4 with Active Noise Cancellation, priced at $180. The name says it all: the main additions are active noise cancellation, Transparency Mode, and Adaptive Audio, just like AirPods Pro. However, unlike AirPods Pro, the noise-canceling AirPods 4 do not have silicone ear tips to provide a more secure fit. I’m curious to learn how efficacious noise cancellation is on AirPods 4 compared to AirPods Pro because canceling ambient sounds usually requires some amount of passive noise cancellation to be effective. No matter how snug the revamped fit is, it is not airtight — Apple describes AirPods 4 as “open-ear AirPods” — and will be worse than AirPods Pro, but it may also be markedly more comfortable for people who cannot stand the pressure of the silicone tips. That isn’t an issue for me, but every ear is different.
For $80 more, the AirPods Pro offer better battery life, sound quality, and presumably active noise cancellation, but if the AirPods 4 with Active Noise Cancellation — truly great naming job, Apple — are even three-quarters as good as AirPods Pro, I will have no hesitation recommending them. I’m all for making AirPods more accessible. I’m also interested in learning about the hardware differences between the $130 model and the $180 model since I’m sure it’s not just software that differentiates them: Externally, they appear identical, but the noise-canceling ones are 0.08 ounces heavier. Again, they have the same processor and I believe they have the same microphones, so I hope a teardown from iFixit will put an end to this mystery.
AirPods Pro 2 don’t receive a hardware update but will get three new hearing accessibility features: a hearing test, active hearing protection, and a hearing aid feature. Apple describes the suite as “the world’s first all-in-one hearing health experience,” and as soon as it was announced, I knew it would change lives. It begins with a “scientifically validated” hearing test, which involves listening to a series of progressively higher-pitched and progressively quieter tones played through the Health app, arriving in a future version of iOS. Once results are calculated, a user will receive a customized profile to modify sounds played through their AirPods Pro to be more audible. If moderate hearing loss is detected, iOS will make the hearing aid feature available, which Apple says has been approved by the Food and Drug Administration and will be accessible in over 150 countries at launch. And to prevent the need for hearing remedies to begin with, the new Hearing Protection feature uses the H2 processor to reduce loud sounds.
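Apple hasn’t published the mechanics of the test, but the kind of pure-tone audiometry it describes — tones at each frequency stepping quieter until they become inaudible, producing one threshold per frequency — can be sketched roughly like this. All numbers, and the `heard` callback standing in for the listener’s tap, are hypothetical:

```python
# Rough sketch of a descending-staircase pure-tone hearing test.
# The quietest level the listener confirms at each frequency becomes
# the threshold; the thresholds together form a hearing profile.

def hearing_profile(frequencies_hz, heard, start_db=60, step_db=10):
    """Return a {frequency: threshold_dB} map from a simple descending staircase."""
    profile = {}
    for freq in frequencies_hz:
        level = start_db
        threshold = start_db
        while level >= 0 and heard(freq, level):
            threshold = level      # quietest level confirmed audible so far
            level -= step_db       # try a quieter tone next
        profile[freq] = threshold
    return profile

# Example: a simulated listener with mild high-frequency loss,
# who needs louder tones above 2 kHz to respond.
def simulated_listener(freq, level):
    return level >= (20 if freq < 2000 else 40)

print(hearing_profile([500, 1000, 4000, 8000], simulated_listener))
```

A real test would randomize timing and re-check thresholds from below, but the shape of the result — a per-frequency audiogram used to tune playback — is the same idea.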
The trifecta will change so many lives for the better. Over-the-counter hearing aids, though approved by the FDA, are scarce and expensive. Hearing tests are complicated, require a visit to a special office, and are price-prohibitive. By contrast, many people already have AirPods Pro and an iPhone, and they can immediately take advantage of the new features when they launch. I’m glad Apple is doing this.
The new life-changing AirPods features are only available on AirPods Pro 2 due to the need for the H2 chip and precise noise cancellation provided by the silicone ear tips. Apple, however, does sell over-the-ear headphones with spectacular noise cancellation, too: the AirPods Max. Mark Gurman, Bloomberg’s chief Apple leaker and easily the best in the business, predicted Sunday night that Apple would refresh the AirPods Max, which sell for $550, with a USB-C port and H2 chip to bring new AirPods features like Adaptive Audio to Apple’s flagship AirPods, and I, like many others, thought this was a reasonable assertion. As Apple rolled out the AirPods Max graphic, I waited in anticipation behind my laptop’s lid for refreshed AirPods Max, the first update to the product in four years. All Apple did, in the end, was add new colors and replace the ancient Lightning port with a USB-C connector. That’s it.
More than disappointed, I was angry. It reminded me of another Apple product that suffered an ill fate in the end: the original HomePod, which was discontinued in 2021 after being neglected for years without updates. It seems to me like Apple doesn’t care about its high-end audio products, so why doesn’t it just discontinue them? Monday’s “update” to AirPods Max isn’t an update at all — it is a slap in the face of everyone who loves that product, and Apple should be ashamed of itself. AirPods Max have a flawed design that needs fixing, and now they have fewer features than the cheapest $130 AirPods. Once again, AirPods Max are $550. It is unabashedly the worst product Apple still pretends to remember the existence of. Nobody should buy this pair of headphones.
Apple Watch
The Apple Watch Series 10 feels like Apple was determined to eliminate — or at least negate — the Apple Watch Ultra from its lineup. Cook announced it as having an “all-new design,” which is far from the truth, but it is thinner and larger than ever before, with 42- and 46-millimeter cases. Though the screens are gargantuan — the largest size is just 3 millimeters smaller than the Apple Watch Ultra — the bezels around the display are noticeably thicker than in the Series 7 era of the Apple Watch. The reason for this modification is unclear, but Apple achieved the larger screen size by enlarging the case and adding a new wide-angle organic-LED display for better viewing angles. The corners have also been rounded off further, adding to a look I think is simply gorgeous. The Apple Watch Series 10 is easily the most beautiful watch Apple has designed, and I don’t mind the thicker bezels.
Apple has removed the stainless steel case option for the first time since the original Apple Watch, which came in three models: Apple Watch Sport, made from aluminum; Apple Watch, made from polished stainless steel; and Apple Watch Edition, made from 24-karat gold. (The last was overkill.) As the Apple Watch evolved, the highest-end material became titanium, whereas aluminum remained the cheapest option and stainless steel sat in the middle. Now, aluminum is still the most affordable Apple Watch, but the $700 higher-tier model is made of polished titanium. I’ve always preferred titanium to steel for watches since I like lighter wristwatches, but Apple has historically used brushed titanium on the Apple Watch, resulting in a finish similar to aluminum. Now, the polished titanium finish matches the stainless steel while retaining the weight benefit, and I think it’s a perfect balance. There is no need for a stainless steel watch.
The aluminum Apple Watch also welcomes Jet Black back to Apple’s products for the first time since the iPhone 7. I think it’s a gorgeous color and is easily the one I’d buy, despite the micro-abrasions. It truly is a striking, classy, and sophisticated timepiece — only Apple could make a black watch look appealing to me. (The titanium model comes in three colors: Natural Titanium, Gold, and Slate; Natural Titanium is my favorite, though Gold is beautiful.)
Feature-wise, the major addition is sleep apnea notifications, which Apple says will be made available in a future software update. This postponing of marquee features appears to be an underlying trend this year, and I find it distasteful, especially since this year’s watch is otherwise a relatively minor update. Punting features like Apple Intelligence down the pipeline might have short-term operational benefits, but it comes at the expense of marketability and reliability. At the end of the day, no matter how successful Apple is, it is selling vaporware, and vaporware is vaporware irrespective of who develops it. Never purchase a technology product based on the promise of future software updates.
Apple has not described in depth how the sleep apnea detection feature works other than with some fancy buzzwords, and I presume that is because it relies on the blood oxygen sensor from the Apple Watch Series 9, which is no longer allowed to function in, or ship to, the United States due to a patent dispute with Masimo, a health technology company that allegedly developed and patented the sensor first. This unnecessary and largely boring patent dispute has boiled over into not just a new calendar year — it has been going on since Christmas last year — but a new product cycle entirely. Apple has fully stopped marketing the sensor both on its website and in the keynote because it is unable to ship in the United States, but it remains available in other countries, as indicated by the Apple Watch Compare page in other markets. I was really hoping Apple and Masimo would settle their grievances before the Series 10, but that doesn’t seem to be the case, and I’m interested to see if Apple will ever begin marketing the blood oxygen sensor again.
This year’s model adds depth and water temperature sensors for divers, borrowing from the Apple Watch Ultra and leaving Apple Watch Ultra buyers in a precarious position: The most expensive watch only offers a marginally larger display, Action Button, and better battery life. I don’t think that’s worth $400, especially since the Apple Watch Ultra 2 doesn’t have the new, faster S10 system-in-package. The Ultra 2 and the Series 9 will, however, support the sleep apnea monitoring feature, though the latter does not have a water temperature sensor. I’d recommend skipping the Ultra until Apple refreshes it, presumably next year, with a faster processor and brings it up to speed with the Series 10 because Apple’s flagship watch is not necessarily its best anymore.
The Apple Watch Ultra 2, in a similar fashion to the AirPods Max, just adds a new black color to the line. Again, as nice as it looks, I’d rather purchase a new Series 10 instead. Even the new FineWoven[1] band option and Titanium Milanese Loop are available for sale online, so original Apple Watch Ultra owners shouldn’t feel left out, either. The Apple Watch lineup is now so confusing that it reminds me of the iPad line pre-May, where some models are just not favorable to purchase. Shame.
iPhone 16
The flagship product unveiling of this event, in my opinion, is not iPhone 16 Pro, but the regular iPhone 16, which I firmly believe is the most compelling iPhone of the event. The list of additions and changes is numerous: Apple Intelligence support, Camera Control, the A18 system-on-a-chip, a drastically improved ultra-wide camera, new camera positioning for Spatial Photos and Videos, and Macro Mode from iPhone 13 Pro. Most years, the standard iPhone is merely alright and is usually the better buy a year post-release, when its price drops. This year, I think it’s the iPhone to buy.
The A18 SoC powers Apple Intelligence, but the real barrier to running it on prior iPhones was a shortage of memory. When Apple Intelligence is on, it has to keep the models it is using resident in the system’s volatile memory at all times, amounting to about 2 GB of space permanently taken up by Apple Intelligence. To accommodate this while allowing iOS to continue functioning as usual, the phone needs more memory, and this year, all iPhones have 8 GB.
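As a back-of-the-envelope illustration of the constraint: only the roughly 2 GB resident-model figure comes from the discussion above; the OS overhead number here is my own guess for the sake of the arithmetic.

```python
# Why ~2 GB of always-resident AI models squeezes a phone's memory budget.
AI_RESIDENT_GB = 2.0  # models pinned in RAM while Apple Intelligence is on

def usable_memory_gb(total_gb, os_overhead_gb=1.5, ai_on=True):
    """Memory left for apps after the OS and (optionally) resident AI models."""
    return total_gb - os_overhead_gb - (AI_RESIDENT_GB if ai_on else 0.0)

print(usable_memory_gb(6))   # a hypothetical 6 GB phone: 2.5 GB left for apps
print(usable_memory_gb(8))   # an 8 GB iPhone 16: 4.5 GB left for apps
```

The point of the sketch is the delta: turning the feature on costs every other app the same fixed chunk, which is far more painful at 6 GB than at 8 GB.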
The interesting part, however, is the new processor: the A18, notably not the A17 Pro from last year or a binned version of it simply called “A17.” Instead, it’s an all-new processor. iPhone 15 opted to remain with the A16 from iPhone 14 Pro instead of updating to an A17 processor, which didn’t exist; Apple only manufactured an A17 Pro chip. In my event impressions from last September, I speculated what Apple would do the following year:
The iPhone 15, released days ago, has the A16, a chip released last year, while the iPhone 15 Pro houses the A17 Pro. Does this mean that Apple will bring the A17 Pro to a non-Pro iPhone next year? I don’t think so — it purely makes no sense from a marketing standpoint for the same reason they didn’t bring the M2 Pro to the MacBook Air. The Pro chips stay in the Pro products, and the “regular” chips remain in the “regular” products. This leads me to believe that Apple is preparing for a shift coming next year: instead of putting the A17 Pro in iPhone 16, they’ll put a nerfed or binned version of the A17 Pro in it instead, simply calling it “A17.”
I was correct that Apple wouldn’t put a “Pro” chip in non-Pro iPhones, but I wasn’t right about which chip it binned. This year, Apple opted to create two models of the A18: the standard A18, and a more performant A18 Pro, reminiscent of the Mac chips. Both are made on Taiwan Semiconductor Manufacturing Company’s latest 3-nanometer process, N3E, whereas the A17 Pro — as well as the M3 series — was fabricated on the older process, N3B. Quinn Nelson, host of the Apple-focused technology YouTube channel Snazzy Labs, predicted that Apple wants to ditch N3B as fast as possible and that it will in Macs later this year with the M4, switching entirely to N3E. This is the continuation of that transition and is why Apple isn’t using any derivative of the A17 Pro built on the older process.
Apple didn’t elaborate much on the A18 except for some ridiculous graphs with no labels, so I don’t think it’s worth homing in on specifications. It’s faster, though — 30 percent faster in computing, and 40 percent faster in graphics rendering with improved ray tracing. From what I can tell, it appears to be a binned version of the A18 Pro found in iPhone 16 Pro, not a completely separate chip — and though Apple highlighted the updated Neural Engine, the A16’s Neural Engine is not what prevented iPhone 15 from running Apple Intelligence.
Camera Control, aside from Apple Intelligence, is the highlight feature of this year’s iPhone models and is what was referred to in the rumors as the “Capture Button.” It is placed on the right side of the phone, below the Side Button, and is a tactile switch with a capacitive, 3D Touch-like surface. Pressing it opens the Camera app or any third-party camera utility that supports it, and pressing it again captures an image or video. Pressing in one level deeper opens controls, such as zoom, exposure, or locking autofocus, and double pressing it opens a menu to select a different camera setting to adjust. The system is undoubtedly complicated, and many controls are hidden from view at first. Jason Snell writes about it at Six Colors well:
If you keep your finger on the button and half-push twice in quick succession, you’ll be taken up one level in the hierarchy and can swipe to different commands. Then half-push once to enter whatever controls you want, and you’re back to swiping. It takes a few minutes to get used to the right set of gestures, but it’s a potentially powerful feature—and at its base, it’s still intuitive: push to bring up the camera, push to shoot, and push and hold to shoot video.
I’m sure I’ll get used to it once I begin using it, but for now, the instructions are convoluted. And, again, keeping with the unofficial event theme of the year, the lock autofocus mode is, for reasons unknown, coming only in a future software update. Even though the Action Button now comes to the low-end iPhone, I think Camera Control will be a handy utility for capturing quick shots and making the iPhone feel more like a real camera. There will no longer be a need to fumble around with Lock Screen swipe actions and controls thanks to this button, and I’m grateful for it.
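The gesture hierarchy Snell describes can be modeled as a tiny state machine — full press to open and shoot, half-press to enter a control, double half-press to go up a level. The state names and return strings here are my own labels, not Apple’s, and this is only one plausible reading of the behavior:

```python
# A toy state machine for the Camera Control button's gesture hierarchy.
class CameraControl:
    def __init__(self):
        self.state = "closed"  # closed -> camera -> control <-> menu

    def press(self):
        """Full click: launch the camera first, then capture on later presses."""
        if self.state == "closed":
            self.state = "camera"
            return "open camera"
        return "capture photo"

    def half_press(self):
        """Light press on the capacitive surface: enter or confirm a control."""
        if self.state == "camera":
            self.state = "control"   # e.g. start adjusting zoom or exposure
            return "show control"
        if self.state == "menu":
            self.state = "control"   # confirm the highlighted control
            return "enter control"
        return "adjust control"

    def double_half_press(self):
        """Two quick half-presses: go up one level to the control menu."""
        self.state = "menu"
        return "show control menu"

cc = CameraControl()
print(cc.press())              # open camera
print(cc.half_press())         # show control
print(cc.double_half_press())  # show control menu
print(cc.half_press())         # enter control
print(cc.press())              # capture photo
```

Seen this way, the design is less arbitrary than it first appears: every gesture is a move up or down one level of a small tree, with the full press always meaning “do the obvious thing.”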
Camera Control, when the iPhone is held in its portrait orientation, is used to launch a new feature exclusive to iPhone 16 and iPhone 16 Pro called Visual Intelligence, which works uncannily like the Humane Ai Pin and Rabbit R1: users snap a photo, Apple Intelligence recognizes subjects and scenes from it, and Visual Lookup searches the web. When I said earlier this year that those two devices would be dead, I knew this would happen — it just seemed obvious. There seems to be some cynicism around how it was marketed — someone took a photograph of a dog to look up what breed it was without asking the owner — but I’m not really paying attention to the marketing here as much as I am the practicality. This is an on-device, multimodal AI assistant everywhere, all with no added fees or useless cellular lines.
As fascinating as Visual Intelligence is, it is also coming “later this year” with no concrete release date. In fact, Apple has seemingly forgotten to even add it to the iPhone 16 and 16 Pro’s webpages. The only evidence of its existence is a brief segment in the keynote, and the omission is puzzling. I’m interested to know the reason for the secrecy: Perhaps Apple isn’t confident it will be able to ship the feature alongside Round 1 of the Apple Intelligence features in October? I’m unsure.
The camera system has now been updated to match the suite from iPhone 14 Pro. The main camera is now a 48-megapixel “Fusion” camera, a new name Apple is using to describe the 2× pixel binning feature first brought to the iPhone two years ago; and the ultra-wide is the autofocusing sensor from iPhone 13 Pro. This gives iPhone 16 four de facto lenses: a standard 1× 48-megapixel 24-millimeter sensor, a 2× binned 48-millimeter lens, a 0.5× 13-millimeter ultra-wide lens, and a macro lens powered by the ultra-wide for close-ups. This quartet is versatile enough for all kinds of images — portraits and landscapes alike — and I’m glad it’s coming to the base-model iPhone.
The cameras are also arranged vertically, similar to the iPhone X and Xs, for Spatial Video and Photo capture for viewing on Apple Vision Pro. It’s apparent how little Apple cares about Apple Vision Pro by how quickly the presenter brushed past this item in the keynote. Apple has also added support for Spatial Photo capture on the iPhone; previously it was limited to the headset itself — Spatial Photos and Videos are now separated into their own mode in the Camera app for easy capture, too. (This wasn’t possible on iPhone 15 because both lenses were placed diagonally; they must be placed vertically or horizontally to replicate the eyes’ stereoscopic vision.)
The last two of the camera upgrades are “intelligence” focused: Audio Mix and Photographic Styles. I don’t understand the premise of the latter; here’s why: This year, Photographic Styles can be added, changed, or removed after a photo has already been taken. My question is, what is the difference between a Photographic Style and a filter? They both can be applied before and after a photo’s capture, so what is the reason for the distinction? Previously, I understood the sentiment: Photographic Styles were built into the image pipeline whereas filters just modified the photo’s hues afterward, like a neutral-density, or ND, filter. Now, Photographic Styles just seem the same as filters but perhaps more limited, and honestly, I had forgotten about their existence post-iPhone 13 Pro.
Audio Mix is a clever suite of AI audio editing features that can help remove background noise, focus on certain subjects in the frame, capture Dolby Atmos audio like a movie, or home in on a person’s speech to replicate a cardioid podcast microphone. All of this is like putting lipstick on a pig: No matter how much processing is added to iPhone microphones, they’re still pinhole-sized microphones at the bottom of a phone and they will undoubtedly sound bad and artificial. The same ML processing is also available in Voice Memos via multi-track audio, i.e., music can be played through the iPhone’s speakers while a recording is in progress and iOS will remove the song from the background afterward. In other words, it’s TikTok but made by Apple, and I’m sure it’ll be great — it’s just not for me.
All of this is wrapped in a traditional iPhone body that, this year, reminds me a bit of an Android phone with the new camera layout, but I’m sure I’ll get used to it. And, as always, it costs $800, and while I usually bemoan that price, I think it’s extremely price-competitive this year. The color selection is fantastic, too: Ultramarine is the new blue color, which looks truly stunning, and Teal and Pink look peppy, too. Here, once again, is another year of hoping for good colors on the Pro lineup, just to be disappointed by four shades of gray.
iPhone 16 is very evidently the Apple Intelligence iPhone. It is made as a catalyst to market Apple Intelligence, and yes, it’s light on features. But so has every other iPhone since iPhone X. Most years, Apple tells a mundane story about how the iPhone is integral to our daily lives and how the next one is going to be even better. This year, the company had a different story to tell: Apple Intelligence. It successfully told that story to the masses on Monday, and in the process, we got a fantastic phone. For the first time, Apple mentioned its beta program in an iPhone keynote, all but encouraging average users to sign up and try Apple Intelligence; it’s even labeled with a prominent “Beta” label on the website. Apple Intelligence is that crucial to understanding iPhone 16.
iPhone 16 Pro
iPhone 16 Pro, from essentially every angle, is a miss. It adds four main features: the Camera Control, 4K video at 120 frames per second, a larger screen, and the A18 Pro processor. It doesn’t even have the marketability advantage of iPhone 16 because its predecessor, iPhone 15 Pro, supports Apple Intelligence. I could gush about how beautiful I think the new Desert Titanium copper-like finish is, how slim the bezels are — the slimmest ever — or how 4K 120 fps video will improve so many workflows. All of that commentary is true, as was the slight enthusiasm I had toward iPhone 16. Nothing on iPhone 16 was revolutionary, per se, yet I was excited because (a) all of the new features came to the masses, graduating from the Pro line, and (b) the phone really wasn’t about the phone itself. iPhone 16 Pro does not carry that advantage — it can’t be about Apple Intelligence.
The Pro and non-Pro variants of the iPhone follow a tick-tock cycle: When the non-Pro model is great, the Pro model feels lackluster. When the Pro model is groundbreaking, the non-Pro feels skippable. When iPhone 12 came out, iPhone 12 Pro seemed overpriced. When iPhone 13 Pro was launched, the iPhone 13 had no value without ProMotion. The same went for iPhone 14 Pro’s Dynamic Island and iPhone 15 Pro’s titanium. Apple hasn’t given the mass market a win since 2020, but now it finally has — the Pro phone has reached an ebb in the cycle. That’s nothing to cry about because that’s how marketing works, but for the first time, iPhone 16 Pro really feels Pro. The update from last year is incremental, whereas the base-model iPhone is, for all intents and purposes, an iPhone 14 Pro without the Always-On Display and ProMotion.
I fundamentally have nothing to write home about regarding iPhone 16 Pro because it is not a very noteworthy device. When I buy mine and set it up in a few weeks, I’m sure I’ll love it and the larger display, but I’ll continue using it like my iPhone 15 Pro. But whoever buys an iPhone 16 won’t — that phone is markedly different from its predecessor. Perhaps innovation is the wrong word for such a phenomenon — it’s more like an incremental update — but it feels like what every phone should aspire to be. I know, the logical rebuttal to this is that nobody upgrades their phone every year and that reviewers and writers live in a bubble of their own biased thoughts — and that’s true. But I’m not here writing about buying decisions; I’m writing about Apple as a company.
Thinking about a product often requires evaluating it based on what’s new, even if that is not the goal of that product. People want to know what Apple has done this year — what screams iPhone 16 rather than iPhone 15 but better. There is a key difference between those two initial thoughts. Sometimes, it’s a radical redesign. In the case of the base-model iPhone 16, it’s Apple Intelligence. iPhone 16 Pro has no such innovation, and that’s why I’m feeling sulky about it — and judging by Monday’s chatter, that vibe is common among the nerd crowd. There is truly nothing to talk about here other than that the Pro model is the necessary counterpart to the Apple Intelligence phone.
I will enjoy the new Camera Control, the 48-megapixel ultra-wide lens, which finally catches the ultra-wide up to the main sensor for crisper shots, and the 5× telephoto now coming to the standard Pro model from iPhone 15 Pro Max last year. Since the introduction of the triple camera system, all three lenses have looked visibly different — the main camera is the best, the ultra-wide is the worst, and the telephoto is right in the center. Now, they should all look nice, and I’m excited about that. I’m less excited about the size increase; while the case hasn’t enlarged, the display is now 6.3 inches on the smaller phone and 6.9 inches on the larger one, and I think that’s a few millimeters too large for a phone — iPhone Pro Max buyers should just buy the normal iPhone.
Like it or not, Monday’s Apple event was the WWDC rehash event. iPhone 16 is the Apple Intelligence phone, and iPhone 16 Pro is just there. But am I excited about the new phones like I was last year? Not necessarily. Maybe that’s what happens when three-quarters of the event is vaporware.
-
[1] FineWoven watch bands and wallets are still available, but FineWoven cases have completely disappeared with no clear replacement. Apple now only sells clear plastic and silicone cases. The people have won.