Google just gave vision to AI, but it's still not available to everyone
Google has officially announced the rollout of a powerful Gemini AI feature that lets the assistant see. This started in March, when Google began to show off Gemini Live, but it has now become more widely available.

Before you get too excited, though, at this stage it's only available on the Google Pixel 9 and Samsung Galaxy S25.

Up until now, Gemini has been a little limited, albeit in an impressive way. It's been able to understand voice, images, PDFs, and even YouTube videos. Now, thanks to Project Astra, Gemini can see what's on your screen too.

This means you can simply give the AI access to your screen, ask questions about what's going on, and it will be able to understand and answer. Perhaps even more usefully, you can share your rear camera with Gemini to talk about what you're seeing in the physical world too.

Sound familiar? Yup, this is very similar to the tech Apple Intelligence was teased as getting last year. Yet Apple has reportedly been struggling with that release, and we may have to wait until iOS 19, or longer, before we see it arrive on iPhones.

While the release is limited right now, the feature will soon be available to all Gemini Live subscribers using Android devices.

One way to open it is to launch the Gemini overlay and select "Share screen with Live." Another is to launch Gemini Live and then select the screen-share icon. In either case, a small red timer icon at the top of the screen shows that you're being viewed and listened to by Gemini Live; you can tap it for more details.

The whole experience is a bit like being on a call with a real person, blurring the lines between human and AI ever further.