What it’s like to wear Google’s Gemini-powered AI glasses
Google wants to give people access to its Gemini AI assistant with the blink of an eye: The company has struck a partnership with eyeglasses makers Warby Parker and Gentle Monster to make AI smart glasses, it announced at its Google I/O developer conference in Mountain View Tuesday. These glasses will be powered by Google’s new Android XR platform, and are expected to be released in 2026 at the earliest.
To show what Gemini-powered smart glasses can do, Google has also built a limited number of prototype devices in partnership with Samsung. These glasses use a small display in the right lens to show live translations, directions and similar lightweight assistance. They also feature an integrated camera that gives Gemini a real-time view of your surroundings and can also be used to capture photos and videos.
“Unlike Clark Kent, you can get superpowers when you put your glasses on,” joked Android XR GM and VP Shahram Izadi during Tuesday’s keynote presentation.

Going hands-on
Google demonstrated its prototype device to reporters Tuesday afternoon. Compared to a regular pair of glasses, Google’s AI device still features notably thicker temples. These house microphones, a touch interface for input, and a capture button to take photos.

Despite all of that, the glasses do feel light and comfortable, similar to Meta’s Ray-Ban smart glasses.
The Google glasses’ big difference compared to Meta’s reveals itself almost immediately after putting them on: At the center of the right lens is a small, rectangular see-through display. It doesn’t obstruct your view of the world when not actively in use. However, during the demo, I at times noticed a purple reflection in the upper right corner of my field of view, coming from the waveguide at the core of the display.
Google’s AI assistant can be summoned with a simple touch gesture. Once active, Gemini automatically accesses the outward-facing camera of the glasses, which makes it possible to ask about anything you see. During my short demo, the assistant correctly described the content of a painting, identified its painter, and offered some information about books hand-selected by Google for the demo.
In addition to AI assistance, the glasses can also be used for live translation and navigation. Google only showed the latter to members of the media. In Google Maps mode, the glasses automatically display turn-by-turn walking directions when you look up. Look down, and the display shows a small, circular street map floating in front of you.
The display itself looked bright and legible, even when showing multiple lines of text at a time. However, Google conducted these demos indoors; it’s unclear how bright sunlight will impact legibility.
Also unknown at this point is how long the batteries of such a device will last. Android XR glasses are designed for all-day wear, according to Izadi, but that doesn’t really tell us how many hours they can be used at a time.
Lots of open questions
Third-party apps were also notably absent from the demo. Izadi said Tuesday that glasses running Android XR will work with your phone, “giving you access to your apps while keeping your hands free.” How exactly that will work is unclear, as the display integrated into the prototype was too small to show the full UI of most apps. Most likely, Android XR will render apps in a simplified, device-optimized fashion, similar to the way apps show up on smartwatches such as the Apple Watch and Google’s Wear OS devices.
The emergence of these kinds of devices also raises more fundamental questions about privacy. The prototype device shown at Google’s event this week has an LED that’s supposed to signal to bystanders when it takes photos or records video, and an internal LED that signals to the wearer when footage is being captured.
However, the LED doesn’t turn on while Google’s Gemini assistant observes the world through the camera. According to a Google spokesperson, that’s because any video ingested this way is not stored, but only temporarily used to make sense of the world. Bystanders, however, may not appreciate that distinction. They may assume that a device that can “see” the world at all times is also continuously capturing video.
Lastly, it’s still unclear what Google’s vision for other form factors looks like. The company also announced Tuesday that it plans to release a pair of tethered AR glasses in partnership with Chinese AR startup Xreal. With displays in both eyes, that device will be able to render much more immersive experiences, and will presumably emphasize entertainment and work applications over more basic assistance.
In addition, Google’s roadmap for Android XR-powered devices includes glasses without any display at all. These are likely to be similar to Meta’s Ray-Ban smart glasses, albeit with access to Google’s Gemini assistant instead of Meta’s AI. Omitting a display brings down the manufacturing costs of smart glasses, while also helping with an important goal: making devices that look and feel familiar to anyone who has ever worn a pair of glasses.
“We know that these need to be stylish glasses that you’ll want to wear all day,” Izadi said.