Google demos Android XR smart glasses with Gemini AI, visual memory, and multilingual capabilities
Forward-looking: The race to define the future of wearable technology is heating up, with smart glasses emerging as the next major frontier. While Meta's Ray-Ban collaboration has already made waves, tech giants like Apple, Samsung, and Google are rapidly developing their own projects. The latest development comes from Google, which recently gave the public its most tangible look yet at Android XR-powered smart glasses during a live demonstration at the TED2025 conference.

Until now, Google's Android XR glasses had only appeared in carefully curated teaser videos and limited hands-on previews shared with select publications. These early glimpses hinted at the potential of integrating artificial intelligence into everyday eyewear but left lingering questions about real-world performance. That changed when Shahram Izadi, Google's Android XR lead, took the TED stage – joined by Nishtha Bhatia – to demonstrate the prototype glasses in action.

The live demo showcased a range of features that distinguish these glasses from previous smart eyewear attempts. At first glance, the device resembles an ordinary pair of glasses. However, it's packed with advanced technology, including a miniaturized camera, microphones, speakers, and a high-resolution color display embedded directly into the lens. The glasses are designed to be lightweight and discreet, with support for prescription lenses. They can also connect to a smartphone to leverage its processing power and access a broader range of apps.

Izadi began the demo by using the glasses to display his speaker notes on stage, illustrating a practical, everyday use case. The real highlight, however, was the integration of Google's Gemini AI assistant. In a series of live interactions, Bhatia demonstrated how Gemini could generate a haiku on demand, recall the title of a book glimpsed just moments earlier, and locate a misplaced hotel key card – all through simple voice commands and real-time visual processing.

But the glasses' capabilities extend well beyond these parlor tricks. The demo also featured on-the-fly translation: a sign was translated from English to Farsi, then seamlessly switched to Hindi when Bhatia addressed Gemini in that language – without any manual setting changes.

Other features demonstrated included visual explanations of diagrams, contextual object recognition – such as identifying a music album and offering to play a song – and heads-up navigation with a 3D map overlay projected directly into the wearer's field of view.

Unveiled last December, the Android XR platform – developed in collaboration with Samsung and Qualcomm – is designed as an open, unified operating system for extended reality devices. It brings familiar Google apps into immersive environments: YouTube and Google TV on virtual big screens, Google Photos in 3D, immersive Google Maps, and Chrome with multiple floating windows. Users can interact with apps through hand gestures, voice commands, and visual cues. The platform is also compatible with existing Android apps, ensuring a robust ecosystem from the outset.

Meanwhile, Samsung is preparing to launch its own smart glasses, codenamed Haean, later this year. The Haean glasses are reportedly designed for comfort and subtlety, resembling regular sunglasses and incorporating gesture-based controls via cameras and sensors.
While the final specifications are still being finalized, the glasses are expected to feature integrated cameras, a lightweight frame, and possibly Qualcomm's Snapdragon XR2 Plus Gen 2 chip. Additional features under consideration include video recording, music playback, and voice calling.