Apple killed the wrong Vision Pro project
www.macworld.com
After Apple kickstarted the spatial computing era with the high-priced Vision Pro mixed-reality headset last year, the device seemed like a first step toward a future all-day wearable. However, it turns out that might not be the case. A report by Mark Gurman at Bloomberg claims that Apple is moving in the other direction: after hitting several roadblocks, it has canceled its most promising augmented reality glasses project and is focusing instead on the next-gen Vision Pro headset.

Apple's got it completely backward: it should kill the Vision Pro 2 and pour more resources into its smart glasses project.

More is less

I'm not dismissing the Vision Pro's advanced technology and capabilities. It's objectively one of the highest-end consumer headsets on the market, featuring sharp displays, a dozen sensors, a slew of well-designed apps, and seamless integration with the Apple ecosystem. Its $3,499 price tag, however, is the first hurdle barring mass adoption.

Beyond the outrageous price, the Vision Pro is the first in its product line, and, naturally, it's filled with imperfections and limitations. Reviewers almost unanimously agree that it's too heavy and causes neck discomfort with extended use. That's not to mention its relatively short battery life and unsuitability for outdoor use.

visionOS is essentially an immersive iPadOS/macOS hybrid that runs in users' fields of view. But what if we don't want all of these overpriced complexities?

A pair of smart glasses would be a fantastic addition to the Apple ecosystem. (Foundry)

Ray of light

While Meta has long catered to gamers and VR enthusiasts with its Quest headset (which costs one-tenth as much as a Vision Pro), its $299 Ray-Ban glasses are a different animal. For one, they're not much more expensive than a Meta-less pair of Ray-Bans, but more importantly, they're not aimed at a niche, tech-first category of customers.

First, to make its device appealing to wear, Meta collaborated with Ray-Ban, one of the most popular sunglasses brands. People are sometimes embarrassed to wear nerdy accessories in public, and the Ray-Ban branding instantly takes away that stigma.

Also, Meta didn't shoot for the moon like Apple did. Instead, it built in just a few useful perks to keep its glasses simple and cheap. The temple tips house discreet open-ear speakers for music streaming on the go, so you don't need a separate pair of earbuds, and a set of AirPods-like controls is built right into the arms.

Most importantly, the glasses feature a forward-facing camera instead of a screen, which offers a window into the world for Meta's AI bot to analyze what it sees and report back. It also lets wearers take quick photos and videos for direct posting to Instagram Stories.

Unlike the Vision Pro, the IPX4-certified Ray-Ban Meta glasses are meant to be used away from home. Users can put them on like any regular pair of sunglasses, which is great for family picnics, concerts, and influencers.

Help wanted

It's clear that Apple is currently focused on the flagship headset line, as the Vision Pro's successor could launch as soon as next year. In my opinion, Apple shouldn't kill the glasses project but should leverage its ecosystem advantage to create lightweight spectacles that rely on other devices to do the heavy lifting.

A pair of Apple smart glasses could take some cues from the AirPods. (Foundry)

The AirPods, for example, can announce notifications. Similarly, the Apple Watch packs exclusive perks unavailable to rival smartwatch brands, such as automatic Mac unlocking.
So, Apple is in a position to create the best smart glasses for iOS users, as no other manufacturer has access to the underlying ecosystem infrastructure. Plus, there are plenty of people who just don't trust Meta.

A pair of Apple glasses doesn't need a Vision Pro-like interface. Like the early Apple Watches, they could piggyback on a paired iPhone's processor and internet connection to offer some handy shortcuts. Like the AirPods, they could handle Siri requests, announce notifications, and accept calls. And with an embedded camera, Apple's new Visual Intelligence feature could come alive: the glasses would transmit what they see to the connected iPhone, which would then analyze the content with ChatGPT and send the response back to the shades.

Other potential features could include snapping quick shots and clips that users can view in the iPhone's Photos app. The glasses could also integrate with FaceTime, letting callers enjoy the scenic route you're taking while you talk. The possibilities are endless, even without incorporating any of the current Vision Pro features.

The Vision Pro is too advanced to shrink down to a pair of glasses. (Thiago Trevisan/Foundry)

Short-sighted Vision

By putting Apple Glasses on the back burner, the company is missing the smart glasses train. Meta is actively developing more advanced iterations of the Ray-Bans, while Apple seemingly has no plans to announce a competitor anytime soon. By the time Apple's glasses potentially debut, they're going to face tremendous competition, and with Apple insisting on high-end features, it could be years before anything comes to market.

Even if the Vision Pro 2 addresses most of its predecessor's shortcomings, which is unlikely, it's clear that the general public isn't interested in this form factor. So maybe Apple should study what's actually working instead of trying to repair what's inherently broken.