The Apple AI Pendant Rumor Is Ridiculous
Mark Gurman, reporting for Bloomberg:
Apple Inc. is accelerating development of three new wearable devices as part of a shift toward artificial intelligence-powered hardware, a category also being pursued by OpenAI and Meta Platforms Inc.
The company is ramping up work on smart glasses, a pendant that can be pinned to a shirt or worn as a necklace, and AirPods with expanded AI capabilities, according to people with knowledge of the plans. All three devices are being built around the Siri digital assistant, which will rely on visual context to carry out actions.
What AI are these products going to rely on? Does anyone at Apple think the “more personalized Siri,” let alone one with advanced vision capabilities, will seriously ship in the next five years? I realize I’ve been over this, but it’s a genuine question I keep asking myself whenever stories like this one break. The “expanded AI capabilities” phrase alone makes me think these products won’t ship until 2028 at a minimum.
The present tells us a lot about these products: they don’t work without (a) truly outstanding AI models and (b) heavily subsidized hardware. Meta’s whole shtick is AI-powered wearables, like the Ray-Ban Meta smart glasses, which are a hit. They’re not a hit because of the AI, but because they include an inexpensive, decently high-resolution camera that lets people share clips to Instagram and Facebook easily. Hardly anyone uses them to trigger Meta’s voice assistant, which is close to useless. (Evidence: Nobody bought the AI-powered Meta Ray-Ban Display smart glasses.) A few hundred dollars for a pair of glasses that records video and shares it to social media almost automatically is a pretty good deal, but again, that’s not because of the AI, as much as Meta wants it to be.
The Humane Ai Pin went in the opposite direction: it was extremely pricey and was positioned as an AI-first device. It ran GPT-4, the latest large language model at the time, and used advanced vision models to describe text and other surroundings. It could even search the web and perform reverse image searches, all over a mandatory cellular data plan. From afar, that sounds like Apple’s rumored product. But beyond the fact that this technology is useless, the Ai Pin didn’t even work correctly, and even if it had, a functioning version would probably have cost double. Humane ran out of money and was sold to HP, where its co-founders are now working on the finest advancements in printer software.
These two stories tell us that the problem with AI-powered wearables is that they’re largely expensive pieces of garbage nobody wants to use, because Apple already built the devices people actually want: it invented the iPhone nearly 20 years ago and the Apple Watch over a decade ago. People are fine talking to ChatGPT using the iOS app. They don’t use AI-powered reverse image search, a feature baked into new iPhones for the last two years. Ask a random person on the street if they’ve ever used Visual Intelligence and they’ll give you a blank stare — I’d bet money on it. There are plenty of use cases for AI models, but wearable devices are not one of them. People are utterly uninterested in them, and even if they were interested, the research and development would be too expensive. Not to mention it would require incredibly talented engineers, the kind who work at OpenAI for million-dollar paychecks, not at Apple.
Apple has made significant progress in recent months on its glasses, code-named N50, and has recently distributed a broader set of prototypes within its hardware engineering division. The company is targeting the start of production as early as December, ahead of a public release in 2027.
Like most of Meta’s current offerings, the glasses won’t include a display. Instead, the interface will rely on speakers, microphones, and cameras — letting users make phone calls, access Siri, take actions based on surroundings, play music, and take photos. Apple aims to differentiate the product in two key areas: build quality and camera technology.
Not enough. Nobody cares about build quality — they care that the glasses say “Ray-Ban” on them. That’s enough to make a sale. And nobody cares about camera technology either, because the primary selling point of the Meta glasses is that they’re convenient. “Hey Meta, take a video and post it to Instagram.” Done. Unless Apple is willing to bring iTunes Ping back, Apple’s glasses will never have such an integration. But I know Apple — they’ll use “build quality” as an excuse to charge five times the price of Meta’s product just to have it languish for years. Evidence: Apple Vision Pro, which only last week received a YouTube app two years after visionOS shipped. But at least the glasses will have Siri.
And of course Apple is reluctant to compete in the one area where it has a distinct advantage over Meta: graphical user interfaces. This is what happens when you ditch product people in the C-suite for business executives. Not including a display on these glasses is one of the worst decisions Apple could have made. I’m not even confident that engineers made the call, because it’s so nonsensical. visionOS is a full-fledged mobile operating system miles ahead of anything Meta could ever dream of shipping, but Apple seems hell-bent on letting it go to waste without content, apps, or care. An affordable pair of Apple smart glasses with a cutting-edge display and visionOS — with apps from all the big developers — would sell in droves.
Apple’s industrial design team hatched the pendant idea while working on the glasses — before they had settled on a design for that product. The device is reminiscent of the failed Humane AI pin, but it’s designed as an iPhone accessory rather than a standalone product.
The pendant would essentially serve as an always-on camera for the smartphone that also includes a microphone for Siri input. Some Apple employees call it the “eyes and ears” of the phone.
As much as I have lambasted Apple so far in this article, I know the company is not so short-sighted that it would let something like this ship. Dozens of experimental prototypes are put through their paces at Apple Park, and most never see the light of day. This is almost certainly one of those projects, probably one meant to pave the way for the cameras and sensors in future Apple glasses or AirPods. I find those two products much more likely to ship than a pendant nobody will wear. Apple’s engineers and C-suite are almost certainly aware of the Ai Pin’s fate, and even a company as confident as Apple would never dare play with that fire. This doesn’t feel like an Apple product to me, and this rumor is ridiculous.
The AirPods, planned for as early as this year, have been in development for a while, with Bloomberg News first reporting in early 2024 that Apple was exploring camera-equipped earbuds. The company has steadily added AI features to the product, including a new live-translation mode introduced last year.
These AirPods are probably the only one of the three products people will buy, and for good reason: people love AirPods. They’re a great product with very few faults, and the new AirPods Pro 3 have been, for the most part, warmly received. I have no reason to believe cameras would change that. But back to the central question I posed at the beginning of this diatribe: What AI will these products run? The cameras would be mounted on the sides of a user’s head, requiring sophisticated machine learning to make sense of the surroundings and feed them to a multimodal AI model. I have no confidence that a company so deficient in AI research can ship products like the ones Gurman describes anytime in the next five years. That’s an absurd proposition.
Apple has an AI problem, but if Gurman is to be believed, it has a hardware one, too. This is a ridiculous rumor from a company that must come to terms with its institutional priorities. Does it want to sit out the AI race entirely and focus on the iPhone and Mac — a decent choice at this point! — or does it want to try? And if it chooses the latter path, does it have the leadership equipped to guide it through the tumult?