![](https://wp.technologyreview.com/wp-content/uploads/2025/01/250124_smartglasses.jpg?resize=1200,600)
# What's next for smart glasses
www.technologyreview.com
MIT Technology Review's What's Next series looks across industries, trends, and technologies to give you a first look at the future. You can read the rest of them here.

For every technological gadget that becomes a household name, there are dozens that never catch on. This year marks a full decade since Google confirmed it was stopping production of Google Glass, and for a long time it appeared as though mixed-reality products (think of the kinds of face computers that don't completely cover your field of view the way a virtual-reality headset does) would remain the preserve of enthusiasts rather than casual consumers.

Fast-forward 10 years, and smart glasses are on the verge of becoming (whisper it) cool. Meta's smart glasses, made in partnership with Ray-Ban, are basically indistinguishable from the iconic Wayfarers Tom Cruise made famous in Risky Business. Meta also recently showed off its fashion-forward Orion augmented-reality glasses prototype, while Snap unveiled its fifth-generation Spectacles; neither would look out of place in the trendiest district of a major city. In December, Google showed off its new unnamed Android XR prototype glasses, and rumors that Apple is still working on a long-anticipated glasses project continue to swirl. Elsewhere, Chinese tech giants Huawei, Alibaba, Xiaomi, and Baidu are also vying for a slice of the market.

Sleeker designs are certainly making this new generation of glasses more appealing. But more importantly, smart glasses are finally on the verge of becoming useful, and it's clear that Big Tech is betting that augmented specs will be the next big consumer device category. Here's what to expect from smart glasses in 2025 and beyond.

## AI agents could finally make smart glasses truly useful

Although mixed-reality devices have been around for decades, they have largely benefited specialized fields, including the medical, construction, and technical remote-assistance industries, where they are likely to continue being used, possibly in more specialized ways. Microsoft is the creator of the best-known of these devices, which layer virtual content over the wearer's real-world environment, and it marketed its HoloLens 2 smart goggles to corporations. The company recently confirmed it was ending production of that device; instead, it is choosing to focus on building headsets for the US military in partnership with Oculus founder Palmer Luckey's latest venture, Anduril.

Now the general public may finally be getting access to devices they can use. The AI world is abuzz over agents, which augment large language models (LLMs) with the ability to carry out tasks by themselves. The past 12 months have seen huge leaps in multimodal LLMs' abilities to handle video, images, and audio in addition to text, which opens up new applications for smart glasses that would not have been possible previously, says Louis Rosenberg, an AR researcher who worked on the first functional augmented-reality system at Stanford University in the 1990s.

We already know Meta is definitely interested in AI agents. Although the company said in September that it has no plans to sell its Orion prototype glasses to the public, given their expense, Mark Zuckerberg raised expectations for future generations of Meta's smart glasses when he declared Orion "the most advanced pair of AR glasses ever made."
He's also made it clear how deeply invested Meta is in bringing a highly intelligent and personalized AI assistant to as many users as possible, and that he's confident Meta's glasses are the perfect form factor for AI.

Although Meta is already making its Ray-Ban smart glasses' AI more conversational (its new live AI feature responds to prompts about what its wearer is seeing and hearing via its camera and microphone), future agents will give these systems not only eyes and ears but a contextual awareness of what's around them, Rosenberg says. Agents running on smart glasses could hold unprompted interactive conversations with their wearers based on their environment: reminding them to buy orange juice when they walk past a store, for example, or telling them the name of a coworker who passes them on the sidewalk. We already know Google is deeply interested in this agent-first approach: the unnamed smart glasses it first showed off at Google I/O in May 2024 were powered by its Astra AI agent system.

"Having worked on mixed reality for over 30 years, it's the first time I can see an application that will really drive mass adoption," Rosenberg says.

## Meta and Google will likely tussle to be the sector's top dog

It's unclear how far we are from that level of mass adoption. During a recent Meta earnings call, Zuckerberg said 2025 would be a defining year for understanding the future of AI glasses and whether they explode in popularity or represent a longer grind.

He has reason to be optimistic, though: Meta is currently ahead of its competition thanks to the success of the Ray-Ban Meta smart glasses; the company sold more than 1 million units last year. It is also preparing to roll out new styles thanks to a partnership with Oakley, which, like Ray-Ban, is under the EssilorLuxottica umbrella of brands. And while its current second-generation specs can't show their wearer digital data and notifications, a third version complete with a small display is due for release this year, according to the Financial Times. The company is also reportedly working on a lighter, more advanced version of its Orion AR glasses, dubbed Artemis, that could go on sale as early as 2027, Bloomberg reports.

Adding display capabilities will put the Ray-Ban Meta glasses on equal footing with Google's unnamed Android XR glasses project, which sports an in-lens display (the company has not yet announced a definite release date). The prototype the company demoed to journalists in September featured a version of its AI chatbot Gemini, and much the way Google built its Android OS to run on smartphones made by third parties, its Android XR software will eventually run on smart glasses made by other companies as well as its own.

These two major players are competing to bring face-mounted AI to the masses in a race that's bound to intensify, adds Rosenberg, especially given that both Zuckerberg and Google cofounder Sergey Brin have called smart glasses the perfect hardware for AI. "Google and Meta are really the big tech companies that are furthest ahead in the AI space on their own. They're very well positioned," he says.
"This is not just augmenting your world, it's augmenting your brain."

## It's getting easier to make smart glasses, but it's still hard to get them right

When the AR gaming company Niantic's Michael Miller walked around CES, the gigantic consumer electronics exhibition that takes over Las Vegas each January, he says he was struck by the number of smaller companies developing their own glasses and systems to run on them, including the Chinese brands DreamSmart, Thunderbird, and Rokid. While it's still not a cheap endeavor (a business would probably need a couple of million dollars in investment to get a prototype off the ground, he says), it demonstrates that the future of the sector won't depend on Big Tech alone.

"On a hardware and software level, the barrier to entry has become very low," says Miller, the augmented reality hardware lead at Niantic, which has partnered with Meta, Snap, and Magic Leap, among others. "But turning it into a viable consumer product is still tough. Meta caught the biggest fish in this world, and so they benefit from the Ray-Ban brand. It's hard to sell glasses when you're an unknown brand."

That's why ambitious smart glasses makers in countries like Japan and China will likely partner with eyewear companies known locally for creating desirable frames, generating momentum in their home markets before expanding elsewhere, he suggests.

## More developers will start building for these devices

These smaller players will also have an important role in creating new experiences for wearers of smart glasses. A big part of smart glasses' usefulness hinges on their ability to send and receive information from a wearer's smartphone, and on third-party developers' interest in building apps that run on them. The more the public can do with their glasses, the more likely they are to buy them.

Developers are still waiting for Meta to release a software development kit (SDK) that would let them build new experiences for the Ray-Ban Meta glasses. While bigger brands are understandably wary about giving third parties access to smart glasses' discreet cameras, it does limit the opportunities researchers and creatives have to push the envelope, says Paul Tennent, an associate professor in the Mixed Reality Laboratory at the University of Nottingham in the UK. But historically, Google has been "a little less afraid of this," he adds.

Elsewhere, Snap and smaller brands like Brilliant Labs, whose Frame glasses run multimodal AI models including Perplexity, ChatGPT, and Whisper, and Vuzix, which recently launched its AugmentOS universal operating system for smart glasses, have happily opened up their SDKs, to the delight of developers, says Patrick Chwalek, a student at the MIT Media Lab who worked on the smart glasses platform Project Captivate as part of his PhD research. "Vuzix is getting pretty popular at various universities and companies because people can start building experiences on top of them," he adds. Most of these experiences are related to navigation and real-time translation: "I think we're going to be seeing a lot of iterations of that over the next few years."
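To make that last point concrete, here is a minimal sketch of the kind of real-time translation experience developers describe building. It is an illustration only, not code from any vendor's SDK: it assumes a laptop microphone as a stand-in for a glasses mic, the open-source Whisper model (which can translate speech directly into English), and the third-party sounddevice library for audio capture.

```python
# Minimal sketch: capture short bursts of speech and show an English rendering.
# Assumptions: a local microphone stands in for the glasses' mic, and printing
# to the console stands in for rendering text on an in-lens display.

import numpy as np
import sounddevice as sd
import whisper

SAMPLE_RATE = 16_000   # Whisper expects 16 kHz mono audio
CHUNK_SECONDS = 5      # length of each audio chunk to translate

model = whisper.load_model("base")  # small model, feasible on modest hardware


def capture_chunk() -> np.ndarray:
    """Record a few seconds of microphone audio as mono float32."""
    audio = sd.rec(int(CHUNK_SECONDS * SAMPLE_RATE),
                   samplerate=SAMPLE_RATE, channels=1, dtype="float32")
    sd.wait()
    return audio.flatten()


def main() -> None:
    print("Listening... speak in any language Whisper supports (Ctrl+C to stop).")
    while True:
        chunk = capture_chunk()
        # task="translate" asks Whisper to output English regardless of input language
        result = model.transcribe(chunk, task="translate", fp16=False)
        text = result["text"].strip()
        if text:
            # On real glasses this would be drawn on the display; here we print it.
            print(f"[EN] {text}")


if __name__ == "__main__":
    main()
```

On actual hardware the chunked loop would be replaced by streaming audio from the glasses and the print call by the platform's display API, but the overall shape (capture, run a model, render the result) is the pattern behind the navigation and translation apps Chwalek mentions.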