Mark Gurman, reporting for Bloomberg in his Power On newsletter:

But there was another type of device that wasn’t on Apple’s list until around 2022: a simpler form of smart glasses without a display. Meta Platforms Inc. pioneered this category with its camera-equipped Ray-Bans, showing that there was demand for such an item. And now Apple is currently developing its own version, internally code-named N50. The idea is to unveil the product at the end of 2026 or early the following year, with the actual release coming in 2027…

Like Meta’s offering, Apple’s glasses will be designed to handle everyday uses: capturing photos and videos, syncing with a smartphone for editing and sharing, handling phone calls, listening to notifications, playing music, and enabling hands-free interaction via a voice assistant. In Apple’s case, that assistant will be a significantly upgraded Siri coming in iOS 27.

The glasses are part of a broader, three-pronged AI wearables strategy that also includes new AirPods and a camera-equipped pendant. Each device is designed to leverage computer vision to interpret the user’s surroundings and feed contextual awareness into Siri and Apple Intelligence. That will enable features like improved turn-by-turn map directions and visual reminders…

According to employees working on the project, Apple’s strategy is to outdo competitors by tightly integrating the glasses with the iPhone and offering a higher-end build. While Meta relies heavily on partner EssilorLuxottica SA for frames, Apple is unsurprisingly planning to go it alone in terms of design. That also should set it apart from Alphabet Inc.’s Google and Samsung Electronics Co., which are leaning on Warby Parker.

Gurman has flip-flopped on this rumor many times, but I think that’s less a reflection of Gurman’s reporting — which is often very accurate — and more indicative of uncertainty at Apple Park. Apple’s end goal has always been true augmented reality glasses that display an operating system in front of people’s eyes. It shipped Apple Vision Pro in 2024 to lay a foundation for such a product by previewing visionOS, an operating system ultimately designed for AR, not virtual reality. Though Apple Vision Pro is, physically, a VR headset, visionOS is centered around AR — windows floating in real space. The R1 processor, the real-time elements of visionOS, and the overall design of the operating system couldn’t paint a clearer picture.

Apple, according to Gurman, canceled its first AR glasses project and now thinks such a product will only be viable at the “end of the decade.” The complexity is mostly around running visionOS in such a constrained chassis. visionOS requires, at a minimum, a Mac-level chip to perform well, even without considering the real-time aspects of the operating system (which the R1 handles). Apple’s original plan was to put the processor in a separate compute unit tethered to the glasses, but again, it scuttled that project in early 2025. While we’re waiting on that product — and a lighter, cheaper Apple Vision-series device to live alongside it and increase developer adoption of visionOS — Apple is going with the Meta strategy, which I think is a great move.

The Ray-Ban Meta Glasses — not the Meta Ray-Ban Display, an AR product that flopped — are a hit among normal people, despite Meta’s anti-privacy reputation. That just goes to show how amazing a product they are. People use them to capture relatively high-quality photos and videos and post them to Instagram, WhatsApp, and Facebook hands-free. Some even use them to listen to podcasts and music in place of headphones. But fundamentally, they’re just a camera on people’s faces. As I’ve said before, I think Apple could do quite well in this market. It wouldn’t have the distinct social platform advantage Meta does, but it would have something Meta has desired since the dawn of the mobile era: full interoperability with a mobile operating system. If the new Siri really does succeed — and that’s a big if — it could let people use the apps and services on their iPhone without even looking at the display.

I highly doubt anyone will carry around an iPhone, Apple Watch, Siri-enabled AirPods Pro, and Siri-enabled Apple sunglasses all at once. Except perhaps Tim Cook, the company’s chief executive, who swears he uses every Apple product every day. (I doubt that.) But the goal is to get people to have at least one of these products on their person at all times so they can interact with the new Siri more naturally. It comes back to the idea of ambient computing, something I’ve always been bullish on. But concurrently, Apple shouldn’t try to eclipse the iPhone, its most successful product line — each of these new products should complement the iPhone well. If a person wants to move a task to the iPhone from their glasses, that should be easy. Apple’s biggest boon is that people carry around large, high-quality, Apple-made screens in their pockets all day. The new glasses should be a way to interact with Siri and agents on the fly. If they are, I surmise they’ll be a hit.

And, axiomatically, they shouldn’t cost $3,500.