• Everything We Think We Know About iOS 19 (or Is It iOS 26?)

    iOS 19—or is it iOS 26, as rumors suggest?—is nearly here. Apple will almost assuredly announce the latest version of the iPhone's OS next week at WWDC 2025. As such, rumors about iOS 26's features have been coming in fast, and only more so as we approach the big event. Although none of these rumors can be confirmed at the moment, they still give us a good idea about what Apple might be considering behind the scenes.

    Is Apple changing iOS' name?
    Seems that way. While we (logically) thought the next version of iOS would be called iOS 19, Apple reportedly has other plans in place. According to Bloomberg's Mark Gurman, iOS 19 will be iOS 26, taking the name of the following year. Apple is reportedly doing this with all of its OS titles, including iPadOS 26, macOS 26, watchOS 26, and visionOS 26.

    A fresh look for iOS 26
    While iOS has changed considerably in recent years, the overall design language still follows the last big UI overhaul: 2013's iOS 7. It's been nearly 12 years since Apple has mixed things up, leaving users to design their own Lock Screens and app icons. According to Gurman, however, that might change with iOS 26, as well as iPadOS 26 and macOS 26. The details are scarce, but Gurman reports sources within Apple say the company wants to better align the design languages across its various products, without merging those OSes entirely, while also simplifying the way you interact with these devices. That means iOS 26 could adopt the design of visionOS, which uses circles instead of squares for app icons, translucent window elements, and 3D effects—though that latter element may be difficult to translate to a 2D display. You can see those elements on display in this concept video from Jon Prosser. If the rumors are correct, we could be looking at "one of the most dramatic software overhauls in the company's history."

    In addition to a new look, iOS may be getting a little less buggy. Gurman says that stability is a big priority for Apple this year, which is music to my ears.

    Live translation for AirPods
    Gurman also says that Apple is working on a live translation feature for certain AirPods models. If you're having a conversation with someone who is speaking a language you don't know, your AirPods will translate and dictate those words in your target language automatically. When you speak, your words will be translated and dictated by your iPhone, via the Translate app. This feature isn't groundbreaking—Google's Pixel Buds have offered it for years. But it'd be a great addition to iOS 26, and to AirPods users.

    Accessibility features
    Apple doesn't reveal much about its upcoming updates before officially announcing them, but accessibility features are an exception. Last month, the company unveiled a list of accessibility features coming to devices "later this year." While the announcement doesn't name iOS 26 and the other "26" updates, it's pretty obvious those are the updates we'll see them in. There's a new Accessibility Reader feature that makes text easier to read across iOS; Magnifier is coming to the Mac; you'll see "Accessibility Nutrition" labels on the App Store, to denote how accessible an app is; Apple Watch is getting Live Captions; and Vehicle Motion Cues are coming to the Mac.

    'Desktop' mode
    Rumor has it that Apple is working on a "Stage Manager-like" desktop mode for USB-C iPhones with iOS 26. The feature would let you plug your iPhone into an external monitor, so you can extend your iPhone's screen to the larger display. This might not be a true "desktop mode" experience, like Samsung DeX, in that you might not be able to use your iPhone as a portable computer this way. But it could make it easier to share your iPhone's display when you want to connect to a larger screen.

    You only need to sign into public wifi networks once
    If you use multiple devices on public wifi networks, it's a pain to connect each one manually. That might be changing with Apple's upcoming updates: Gurman says that once updated, you'll only need to log into the wifi with one Apple device (your iPhone, iPad, or Mac), and the rest will automatically connect.

    Battery upgrades
    Here's a great use for AI: optimizing battery life. Rumor has it iOS 26 will analyze your usage habits and determine the right times to lower performance in the name of preserving battery life. Extending the amount of time between charges is something we can all get behind. In addition, the company may add a charging indicator to the lock screen, so you know how much longer your battery needs to charge. This small feature has been sorely missing on iPhones for years. Apple offers it on MacBooks, but only in Activity Monitor; I hope Apple brings it back to the menu bar in a future update.

    Your iPhone will (probably) run iOS 26
    With any luck, your current iPhone should be compatible with iOS 26, assuming you're currently running the latest software. Citing a source within Apple, French website iPhoneSoft.fr reports that any phone that runs iOS 18 should be compatible with iOS 26 as well. However, the iPad 7 will supposedly not be so lucky, as the website says it will not be included in the iPadOS 26 update. Meanwhile, a MacRumors source says that Apple plans to drop the iPhone XR, XS, and XS Max this year.
    We'll have to wait and see what Apple announces on Monday to find out which phones will still be supported.

    A new gaming app
    According to Gurman, Apple is working on a dedicated gaming app for iOS, iPadOS, macOS, and tvOS, meant to replace the existing Game Center. The app will let you launch titles, check leaderboards, chat with friends, and see your achievements. If true, it'll be interesting timing, considering the announcement will come four days after the launch of the Nintendo Switch 2. I have my doubts that such an app could compete with a gaming titan like Nintendo, or that this will really expand beyond the traditional short-and-sweet mobile game experience, but who knows. Maybe Apple is about to become a serious gaming company. (I doubt it.)

    Shortcuts get Apple Intelligence integration
    The Shortcuts app lets you set up "shortcuts," which you can use to automate tasks across your Apple devices. Gurman says with iOS 26, Apple is integrating Apple Intelligence into the Shortcuts app, which might let you create shortcuts with natural language—or, in other words, describe the shortcuts you want and have the AI build them for you.

    Small updates
    According to 9to5Mac, Apple has plans to add a new feature or two to a handful of apps. That includes:
    Messages: Automatic translation for incoming and outgoing messages, as well as polls.
    Music: Full-screen animated artwork on the Lock Screen.
    Notes: Markdown support, a huge plus for pro Notes users.
    CarPlay: A redesigned UI to complement iOS 26.

    The merging of Siri and Apple Intelligence
    According to Gurman, Apple plans to merge Siri with Apple Intelligence sometime during the iOS 26 patch cycle. Yes, the assistant is currently listed as being part of Apple Intelligence, but behind the scenes, Apple supposedly has a new LLM in the works that would unify Siri's currently split architecture and allow it to handle complex requests more often. As of now, its AI features are much more limited, and most of Siri doesn't use this type of AI at all. Gurman says he expects the merger to be completed by spring of 2026 with the launch of iOS 26.4. His report states that, originally, Apple's plan was to launch a more conversational Siri in the same update, but that's been delayed and is not expected to be unveiled at WWDC 2025. Gurman also indicates that because Apple has not yet completed last year's Apple Intelligence feature rollout, any unannounced features shouldn't be expected for a while.
  • I Don't Usually Enjoy Meditation, but Peloton's Meditation Classes Are Surprisingly Helpful

    I do not consider myself a very woo-woo person, someone who's in touch with their spirituality, or even someone particularly sentimental. The concept of meditating, like so many other things I regard as "too mystical," has never appealed to me, but I'll tell you what does: working out, being physically healthy, and staying on top of my goals. That's why Peloton's approach to meditation sessions appealed to me more than others have before.

    I'm always browsing the Peloton app for new workout options and recently stumbled across the guided meditation classes it offers alongside cycling, walking, yoga, strength training, and more. At first, I didn't see the appeal. I use the app and its classes to get sweaty, burn calories, and enhance my body's performance, after all. But as it turns out, these are really cool and can put you in a better mental space, which clears the way for you to do all that other stuff. Since discovering them, I've been streaming them quite a bit. Here's why you should, too.

    What are Peloton's meditation classes all about?
    Using Peloton's app—which is included on the touchscreens of its at-home workout equipment, can be downloaded to your phone, or even streamed on devices like a Roku—you can access a variety of class types. Tap Meditation from the home screen and you'll be shown hundreds of meditation options that range in length from five minutes to 30. As with any Peloton offering, they're led by a number of different instructors; if you take enough of them, you'll find a favorite or two. But what really stands out is that there are different categories available, such as: sleep, mindfulness, anxiety, focus, recovery, gratitude, happiness, and relaxing. Each class is designed for a specific purpose, so you can choose if you want to "flow and let go," embrace a bright morning, or even take one designed for use on your evening commute. You can filter by class type, which lets you break down the classes by categories like "Daily Meditation," "Meditation Basics," "Emotions," "Theme," or "Walking Meditation." There are even some for pre- and post-natal meditation. You don't need any special equipment; the instructors usually lead off simply by suggesting how you should position your body. Meditations can be added to class Stacks, Peloton's version of playlists that cycle through pre-selected classes, allowing you to customize your entire workout before it begins. If you have your Apple Watch linked up to your Peloton account (and you should!), the app will track your heart rate and log the meditation in Apple Health, listing it as "Mind & Body" under your sessions.

    Why I like Peloton's meditations
    As I said, I'm not a very spiritual or soulful person, so I appreciate that the meditation classes I've taken through the app aren't overly mushy. Rather, they're pretty straightforward: The instructors speak clearly and plainly, don't rely too much on frivolous imagery, and instead draw your attention to your breathing and body in a way that actually helps you feel more connected to both. For as much energy as I put into working out and strengthening my body, I do struggle with things like the "mind-muscle connection" or just identifying how different parts of my body are feeling, so these sessions, where I'm asked to focus intently on certain areas and connect to how I'm feeling in a given moment, are actually pretty beneficial to my quest to become stronger and healthier overall.

    I first tried the Peloton meditations a few weeks ago while waiting for the bus.
    I was having a very busy day and had had absolutely no time to work out, not even on my Peloton bike, which was adding to my stress while I waited for the bus to pull up and take me to more busy activities. I didn't want to lose my Peloton streak, so I opened the app to see if there was a quick walking workout I could get in, maybe by strolling to the next-farthest bus stop, and saw the meditations option. I picked a five-minute meditation and listened to it while I stood on the sidewalk—and it did chill me out, help me focus, and reinvigorate me a little, almost the same as a traditional workout would have, but without taking up as much time or making me a sweaty mess. I tried a few others over the next few days and found them really useful, especially during long hours in the car on my Memorial Day road trip, or earlier this week when my team was losing a baseball game and I was not enjoying the experience of watching.

    Obviously, these classes are a lot different from Peloton's usual offerings. I'm not sweating, straining myself, or enhancing my cardiovascular or respiratory function. Instead, I'm strengthening my mind, training myself to focus on my breathing and feelings. Those abilities translate really well to being able to continue my fitness journey, as well as to just handling whatever is going on in an average day. I think these are especially useful as a pick-me-up, a kickstart for the day, or a post-workout wind-down.

    I also appreciate how accessible the classes are. You can toggle on closed captions, for instance, and the audio and video components are high-quality, making the instructors easy to understand. The background music never muffles the instructors' voices, there is a diverse selection of instructors and class types, and there really does seem to be something for everyone, whether you want to walk and listen or only have five minutes to devote to grounding yourself.
  • Real TikTokers are pretending to be Veo 3 AI creations for fun, attention

    The Turing test in reverse

    From music videos to "Are you a prompt?" stunts, "real" videos are presenting as AI

    Kyle Orland



    May 31, 2025 7:08 am

    Of course I'm an AI creation! Why would you even doubt it? Credit: Getty Images

    Since Google released its Veo 3 AI model last week, social media users have been having fun with its ability to quickly generate highly realistic eight-second clips complete with sound and lip-synced dialogue. TikTok's algorithm has been serving me plenty of Veo-generated videos featuring impossible challenges, fake news reports, and even surreal short narrative films, to name just a few popular archetypes.
    However, among all the AI-generated video experiments spreading around, I've also noticed a surprising counter-trend on my TikTok feed. Amid all the videos of Veo-generated avatars pretending to be real people, there are now also a bunch of videos of real people pretending to be Veo-generated avatars.
    “This has to be real. There’s no way it's AI.”
    I stumbled on this trend when the TikTok algorithm fed me this video topped with the extra-large caption "Google VEO 3 THIS IS 100% AI." As I watched and listened to the purported AI-generated band that appeared to be playing in the crowded corner of someone's living room, I read the caption containing the supposed prompt that had generated the clip: "a band of brothers with beards playing rock music in 6/8 with an accordion."

    @kongosmusic We are so cooked. This took 3 mins to generate. Simple prompt: “a band of brothers playing rock music in 6/8 with an accordion” ♬ original sound - KONGOS

    After a few seconds of taking those captions at face value, something started to feel a little off. After a few more seconds, I finally noticed the video was posted by Kongos, an indie band that you might recognize from their minor 2012 hit "Come With Me Now." And after a little digging, I discovered the band in the video was actually just Kongos, and the tune was a 9-year-old song that the band had dressed up as an AI creation to get attention.
    Here's the sad thing: It worked! Without the "Look what Veo 3 did!" hook, I might have quickly scrolled by this video before I took the time to listen to the (pretty good!) song. The novel AI angle made me stop just long enough to pay attention to a Kongos song for the first time in over a decade.

    Kongos isn't the only musical act trying to grab attention by claiming their real performances are AI creations. Darden Bela posted that Veo 3 had "created a realistic AI music video" over a clip from what is actually a 2-year-old music video with some unremarkable special effects. Rapper GameBoi Pat dressed up an 11-month-old song with a new TikTok clip captioned "Google's Veo 3 created a realistic sounding rapper... This has to be real. There's no way it's AI" (that last part is true, at least). I could go on, but you get the idea.

    @gameboi_pat This has got to be real. There’s no way it’s AI #google #veo3 #googleveo3 #AI #prompts #areweprompts? ♬ original sound - GameBoi_pat

    I know it's tough to get noticed on TikTok, and that creators will go to great lengths to gain attention from the fickle algorithm. Still, there's something more than a little off-putting about flesh-and-blood musicians pretending to be AI creations just to make social media users pause their scrolling for a few extra seconds before they catch on to the joke (or don't, based on some of the comments).
    The whole thing evokes last year's stunt where a couple of podcast hosts released a posthumous "AI-generated" George Carlin routine before admitting that it had been written by a human after legal threats started flying. As an attention-grabbing stunt, the conceit still works. You want AI-generated content? I can pretend to be that!

    Are we just prompts?
    Some of the most existentially troubling Veo-generated videos floating around TikTok these days center around a gag known as "the prompt theory." These clips focus on various AI-generated people reacting to the idea that they are "just prompts" with various levels of skepticism, fear, or even conspiratorial paranoia.
    On the other side of that gag, some humans are making joke videos playing off the idea that they're merely prompts. RedondoKid used the conceit in a basketball trick shot video, saying "of course I'm going to make this. This is AI, you put that I'm going to make this in the prompt." User thisisamurica thanked his faux prompters for putting him in "a world with such delicious food" before theatrically choking on a forkful of meat. And comedian Drake Cummings developed TikTok skits pretending that it was actually AI video prompts forcing him to indulge in vices like shots of alcohol or online gambling ("Goolgle's [sic] New A.I. Veo 3 is at it again!! When will the prompts end?!" Cummings jokes in the caption).

    @justdrakenaround Goolgle’s New A.I. Veo 3 is at it again!! When will the prompts end?! #veo3 #google #ai #aivideo #skit ♬ original sound - Drake Cummings

    Beyond the obvious jokes, though, I've also seen a growing trend of TikTok creators approaching friends or strangers and asking them to react to the idea that "we're all just prompts." The reactions run the gamut from "get the fuck away from me" to "I blame that [prompter], I now have to pay taxes" to solipsistic philosophical musings from convenience store employees.
    I'm loath to call this a full-blown TikTok trend based on a few stray examples. Still, these attempts to exploit the confusion between real and AI-generated video are interesting to see. As one commenter on an "Are you a prompt?" ambush video put it: "New trend: Do normal videos and write 'Google Veo 3' on top of the video."
    Which one is real?
    The best Veo-related TikTok engagement hack I've stumbled on so far, though, might be the videos that show multiple short clips and ask the viewer to decide which are real and which are fake. One video I stumbled on shows an increasing number of "Veo 3 Goth Girls" across four clips, challenging in the caption that "one of these videos is real... can you guess which one?" In another example, two similar sets of kids are shown hanging out in cars while the caption asks, "Are you able to identify which scene is real and which one is from veo3?"

    @spongibobbu2 One of these videos is real… can you guess which one? #veo3 ♬ original sound - Jett

    After watching both of these videos on loop a few times, I'm relatively (but not entirely) convinced that every single clip in them is a Veo creation. The fact that I watched these videos multiple times shows how effective the "Real or Veo" challenge framing is at grabbing my attention. Additionally, I'm still not 100 percent confident in my assessments, which is a testament to just how good Google's new model is at creating convincing videos.

    There are still some telltale signs for distinguishing a real video from a Veo creation, though. For one, Veo clips are still limited to just eight seconds, so any video that runs longer is almost certainly not generated by Google's AI. Looking back at a creator's other videos can also provide some clues—if the same person was appearing in "normal" videos two weeks ago, it's unlikely they would suddenly start appearing in Veo creations.
    There's also a subtle but distinctive style to most Veo creations that can distinguish them from the kind of candid handheld smartphone videos that usually fill TikTok. The lighting in a Veo video tends to be too bright, the camera movements a bit too smooth, and the edges of people and objects a little too polished. After you watch enough "genuine" Veo creations, you can start to pick out the patterns.
    Regardless, TikTokers trying to pass off real videos as fakes—even as a joke or engagement hack—is a recognition that video sites are now deep in the "deep doubt" era, where you have to be extra skeptical of even legitimate-looking video footage. And the mere existence of convincing AI fakes makes it easier than ever to claim real events captured on video didn't really happen, a problem that political scientists call the liar's dividend. We saw this when then-candidate Trump accused Democratic nominee Kamala Harris of using "A.I.'d" crowds in real photos of her Detroit airport rally.
    For now, TikTokers of all stripes are having fun playing with that idea to gain social media attention. In the long term, though, the implications for discerning truth from reality are more troubling.

    Kyle Orland
    Senior Gaming Editor

    Kyle Orland has been the Senior Gaming Editor at Ars Technica since 2012, writing primarily about the business, tech, and culture behind video games. He has journalism and computer science degrees from the University of Maryland. He once wrote a whole book about Minesweeper.

    13 Comments
    #real #tiktokers #are #pretending #veo
    Real TikTokers are pretending to be Veo 3 AI creations for fun, attention
    The turing test in reverse Real TikTokers are pretending to be Veo 3 AI creations for fun, attention From music videos to "Are you a prompt?" stunts, "real" videos are presenting as AI Kyle Orland – May 31, 2025 7:08 am | 13 Of course I'm an AI creation! Why would you even doubt it? Credit: Getty Images Of course I'm an AI creation! Why would you even doubt it? Credit: Getty Images Story text Size Small Standard Large Width * Standard Wide Links Standard Orange * Subscribers only   Learn more Since Google released its Veo 3 AI model last week, social media users have been having fun with its ability to quickly generate highly realistic eight-second clips complete with sound and lip-synced dialogue. TikTok's algorithm has been serving me plenty of Veo-generated videos featuring impossible challenges, fake news reports, and even surreal short narrative films, to name just a few popular archetypes. However, among all the AI-generated video experiments spreading around, I've also noticed a surprising counter-trend on my TikTok feed. Amid all the videos of Veo-generated avatars pretending to be real people, there are now also a bunch of videos of real people pretending to be Veo-generated avatars. “This has to be real. There’s no way it's AI.” I stumbled on this trend when the TikTok algorithm fed me this video topped with the extra-large caption "Google VEO 3 THIS IS 100% AI." As I watched and listened to the purported AI-generated band that appeared to be playing in the crowded corner of someone's living room, I read the caption containing the supposed prompt that had generated the clip: "a band of brothers with beards playing rock music in 6/8 with an accordion." @kongosmusicWe are so cooked. This took 3 mins to generate. Simple prompt: “a band of brothers playing rock music in 6/8 with an accordion”♬ original sound - KONGOS After a few seconds of taking those captions at face value, something started to feel a little off. After a few more seconds, I finally noticed the video was posted by Kongos, an indie band that you might recognize from their minor 2012 hit "Come With Me Now." And after a little digging, I discovered the band in the video was actually just Kongos, and the tune was a 9-year-old song that the band had dressed up as an AI creation to get attention. Here's the sad thing: It worked! Without the "Look what Veo 3 did!" hook, I might have quickly scrolled by this video before I took the time to listen to thesong. The novel AI angle made me stop just long enough to pay attention to a Kongos song for the first time in over a decade. Kongos isn't the only musical act trying to grab attention by claiming their real performances are AI creations. Darden Bela posted that Veo 3 had "created a realistic AI music video" over a clip from what is actually a 2-year-old music video with some unremarkable special effects. Rapper GameBoi Pat dressed up an 11-month-old song with a new TikTok clip captioned "Google's Veo 3 created a realistic sounding rapper... This has to be real. There's no way it's AI". I could go on, but you get the idea. @gameboi_pat This has got to be real. There’s no way it’s AI 😩 #google #veo3 #googleveo3 #AI #prompts #areweprompts? ♬ original sound - GameBoi_pat I know it's tough to get noticed on TikTok, and that creators will go to great lengths to gain attention from the fickle algorithm. 
    arstechnica.com
    The Turing test in reverse: Real TikTokers are pretending to be Veo 3 AI creations for fun, attention. From music videos to "Are you a prompt?" stunts, "real" videos are presenting as AI. Kyle Orland – May 31, 2025, 7:08 am

[Image: "Of course I'm an AI creation! Why would you even doubt it?" Credit: Getty Images]

Since Google released its Veo 3 AI model last week, social media users have been having fun with its ability to quickly generate highly realistic eight-second clips complete with sound and lip-synced dialogue. TikTok's algorithm has been serving me plenty of Veo-generated videos featuring impossible challenges, fake news reports, and even surreal short narrative films, to name just a few popular archetypes. However, among all the AI-generated video experiments spreading around, I've also noticed a surprising counter-trend on my TikTok feed. Amid all the videos of Veo-generated avatars pretending to be real people, there are now also a bunch of videos of real people pretending to be Veo-generated avatars.

"This has to be real. There's no way it's AI."

I stumbled on this trend when the TikTok algorithm fed me this video topped with the extra-large caption "Google VEO 3 THIS IS 100% AI." As I watched and listened to the purported AI-generated band that appeared to be playing in the crowded corner of someone's living room, I read the caption containing the supposed prompt that had generated the clip: "a band of brothers with beards playing rock music in 6/8 with an accordion."

[Embedded TikTok from @kongosmusic: "We are so cooked. This took 3 mins to generate. Simple prompt: 'a band of brothers playing rock music in 6/8 with an accordion'"]

After a few seconds of taking those captions at face value, something started to feel a little off. After a few more seconds, I finally noticed the video was posted by Kongos, an indie band that you might recognize from their minor 2012 hit "Come With Me Now." And after a little digging, I discovered the band in the video was actually just Kongos, and the tune was a 9-year-old song that the band had dressed up as an AI creation to get attention.

Here's the sad thing: It worked! Without the "Look what Veo 3 did!" hook, I might have quickly scrolled by this video before I took the time to listen to the (pretty good!) song. The novel AI angle made me stop just long enough to pay attention to a Kongos song for the first time in over a decade.

Kongos isn't the only musical act trying to grab attention by claiming their real performances are AI creations. Darden Bela posted that Veo 3 had "created a realistic AI music video" over a clip from what is actually a 2-year-old music video with some unremarkable special effects. Rapper GameBoi Pat dressed up an 11-month-old song with a new TikTok clip captioned "Google's Veo 3 created a realistic sounding rapper... This has to be real. There's no way it's AI" (that last part is true, at least). I could go on, but you get the idea.

[Embedded TikTok from @gameboi_pat: "This has got to be real. There's no way it's AI 😩 #google #veo3 #googleveo3 #AI #prompts #areweprompts?"]

I know it's tough to get noticed on TikTok, and that creators will go to great lengths to gain attention from the fickle algorithm.
Still, there's something more than a little off-putting about flesh-and-blood musicians pretending to be AI creations just to make social media users pause their scrolling for a few extra seconds before they catch on to the joke (or don't, based on some of the comments). The whole thing evokes last year's stunt where a couple of podcast hosts released a posthumous "AI-generated" George Carlin routine before admitting that it had been written by a human after legal threats started flying. As an attention-grabbing stunt, the conceit still works. You want AI-generated content? I can pretend to be that!

Are we just prompts?

Some of the most existentially troubling Veo-generated videos floating around TikTok these days center around a gag known as "the prompt theory." These clips focus on various AI-generated people reacting to the idea that they are "just prompts" with various levels of skepticism, fear, or even conspiratorial paranoia.

On the other side of that gag, some humans are making joke videos playing off the idea that they're merely prompts. RedondoKid used the conceit in a basketball trick shot video, saying "of course I'm going to make this. This is AI, you put that I'm going to make this in the prompt." User thisisamurica thanked his faux prompters for putting him in "a world with such delicious food" before theatrically choking on a forkful of meat. And comedian Drake Cummings developed TikTok skits pretending that it was actually AI video prompts forcing him to indulge in vices like shots of alcohol or online gambling ("Goolgle's [sic] New A.I. Veo 3 is at it again!! When will the prompts end?!" Cummings jokes in the caption).

[Embedded TikTok from @justdrakenaround: "Goolgle's New A.I. Veo 3 is at it again!! When will the prompts end?! #veo3 #google #ai #aivideo #skit"]

Beyond the obvious jokes, though, I've also seen a growing trend of TikTok creators approaching friends or strangers and asking them to react to the idea that "we're all just prompts." The reactions run the gamut from "get the fuck away from me" to "I blame that [prompter], I now have to pay taxes" to solipsistic philosophical musings from convenience store employees.

I'm loath to call this a full-blown TikTok trend based on a few stray examples. Still, these attempts to exploit the confusion between real and AI-generated video are interesting to see. As one commenter on an "Are you a prompt?" ambush video put it: "New trend: Do normal videos and write 'Google Veo 3' on top of the video."

Which one is real?

The best Veo-related TikTok engagement hack I've stumbled on so far, though, might be the videos that show multiple short clips and ask the viewer to decide which are real and which are fake. One video I stumbled on shows an increasing number of "Veo 3 Goth Girls" across four clips, challenging in the caption that "one of these videos is real... can you guess which one?" In another example, two similar sets of kids are shown hanging out in cars while the caption asks, "Are you able to identify which scene is real and which one is from veo3?"

[Embedded TikTok from @spongibobbu2: "One of these videos is real… can you guess which one? #veo3"]

After watching both of these videos on loop a few times, I'm relatively (but not entirely) convinced that every single clip in them is a Veo creation. The fact that I watched these videos multiple times shows how effective the "Real or Veo" challenge framing is at grabbing my attention.
Additionally, I'm still not 100 percent confident in my assessments, which is a testament to just how good Google's new model is at creating convincing videos. There are still some telltale signs for distinguishing a real video from a Veo creation, though. For one, Veo clips are still limited to just eight seconds, so any video that runs longer (without an apparent change in camera angle) is almost certainly not generated by Google's AI. Looking back at a creator's other videos can also provide some clues—if the same person was appearing in "normal" videos two weeks ago, it's unlikely they would be appearing in Veo creations suddenly.

There's also a subtle but distinctive style to most Veo creations that can distinguish them from the kind of candid handheld smartphone videos that usually fill TikTok. The lighting in a Veo video tends to be too bright, the camera movements a bit too smooth, and the edges of people and objects a little too polished. After you watch enough "genuine" Veo creations, you can start to pick out the patterns.

Regardless, TikTokers trying to pass off real videos as fakes—even as a joke or engagement hack—is a recognition that video sites are now deep in the "deep doubt" era, where you have to be extra skeptical of even legitimate-looking video footage. And the mere existence of convincing AI fakes makes it easier than ever to claim real events captured on video didn't really happen, a problem that political scientists call the liar's dividend. We saw this when then-candidate Trump accused Democratic nominee Kamala Harris of "A.I.'d" crowds in real photos of her Detroit airport rally.

For now, TikTokers of all stripes are having fun playing with that idea to gain social media attention. In the long term, though, the implications for discerning truth from reality are more troubling.

Kyle Orland has been the Senior Gaming Editor at Ars Technica since 2012, writing primarily about the business, tech, and culture behind video games. He has journalism and computer science degrees from the University of Maryland. He once wrote a whole book about Minesweeper.
  • Ready or not, the EAA is here

    uxdesign.cc
    Strategies to future-proof UX that meets EU legal standards.

[Image: The EAA requires digital accessibility to accommodate all users with different needs]

Accessibility for digital products is no longer optional — it's a necessity with the upcoming enforcement of the European Accessibility Act (EAA). Until now, European standards such as EN 301 549 have required only the public sector to comply with the Web Content Accessibility Guidelines (WCAG). So only government agencies or businesses that sell information and communication technology (ICT) to government agencies have needed to meet accessibility specifications. As of June 2025, this will be expanded into the private sector in the European Union (EU) — including e-commerce, restaurants, and banking services.

Are you thinking any of the following?
- Does the EAA apply to me or the business I work for?
- What is the WCAG and what does it require?
- How much work is needed to make my UX designs compliant?

I get it — it's time-consuming to understand the grueling details of a legislative act, but it's vital to realize how it directly impacts you and your business (whether you're an employee or employer).

What is the European Accessibility Act (EAA)?

The EAA is a "directive" that aims to improve accessibility in products and services in EU member states. This ensures people with disabilities can successfully access any digital product — ranging from writing an email on a laptop to checking in at the doctor's office with an iPad. The directive was enacted in 2019, but it will be enforced starting June 28, 2025 for any new products.

[Image: The EAA is an EU directive that follows EN 301 549 and WCAG 2.1 requirements]

EAA requirements

To comply with the EAA, you need to follow the European "standard," EN 301 549. EN 301 549 includes many clauses covering accessibility for a broad range of ICT — from native mobile applications to electronic hardware. The clauses covering the web and software incorporate the Web Content Accessibility Guidelines (WCAG) 2.1, Level AA.

Who does the EAA apply to?

The EAA impacts all 27 member states of the EU. This includes countries such as France, Spain, and Sweden. Not only do EU member states need to comply, but so does any company that does business with the member states — even if the company isn't based in the EU. For example, if a person in Italy accesses an e-commerce website based in the United States, the website must comply with the EAA.

What does WCAG require?

WCAG offers internationally recognized standards for digital accessibility. These standards are developed by the World Wide Web Consortium (W3C) and are constantly evolving to account for changes in HTML and assistive technologies. The EAA and EN 301 549 require conformance to WCAG 2.1, Level AA standards. But what does that mean exactly?

WCAG versions

New versions of WCAG are periodically released to accommodate the internet's evolution. As methods and technologies are deprecated and replaced, it's important to update standards to ensure accessibility is met. The most current version of WCAG is 2.2, which was released in October 2023. Though the EAA and EN 301 549 currently require WCAG 2.1, it's expected they will be updated to include the WCAG 2.2 version.

[Image: WCAG has released 3 versions between 2008 and 2023]

WCAG conformance "levels"

WCAG includes 3 levels of conformance: Level A, Level AA, and Level AAA. Level A offers guidelines for the most basic accessibility considerations, while Level AAA reaches the widest degree of accessibility. Each success criterion (SC) in WCAG has a conformance level.
For instance, SC 1.1.1 (Non-text content) is Level A, while SC 2.4.12 (Focus not obscured – Enhanced) is Level AAA. To conform to a certain WCAG level, the digital product must also conform to the level below it. So if your website is Level AA, it passes both Level A and AA success criteria (which is what's required to comply with the EAA).

[Image: WCAG's "levels" refer to the degree of accessibility your digital product complies with]

Design strategies for EAA compliance

WCAG standards are dense, and it takes time to incorporate them into your UX design process. But there are strategies you can start using now to meet most of the requirements to comply with the EAA.

1. Color contrast ratio

Designers must create color palettes that support a high color contrast ratio for content like text or UI components. You can check color contrast ratios with tools such as WebAIM's Contrast Checker.
- Small text (less than 18px) must have a color contrast ratio of 4.5:1 with its background
- Large text (larger than 18px) must have a contrast ratio of 3:1 with its background
- UI components, like buttons, should have a color contrast ratio of 3:1 with their background
- Color should not be used alone to convey meaning; semantic color also needs a text label or appropriate icon

For more details, visit WCAG 1.4 – Distinguishable.

[Image: Don't use color alone to convey meaning, such as errors or warnings on text fields]

2. Keyboard functionality

Some users can't use a mouse or their laptop's trackpad. Users who are blind or have limited hand mobility use their keyboard or other assistive technologies, and must be able to operate the product with their preferred input method.
- All user actions are doable from a keyboard, except for freehand movements (e.g., digital painting)
- Keyboard users must not encounter a keyboard trap (e.g., being unable to navigate away from an element like a modal)
- Keyboard users have a way to turn off or remap keyboard shortcuts made up of single-character keys (e.g., pressing "D" to delete an item)

For more details, visit WCAG 2.1 – Keyboard accessible.

[Image: Example of keyboard tab/focus order for a restaurant's website]

3. Multimedia features

All users, whether they are blind, hard of hearing, or have a learning disability, must be able to access the information any multimedia provides (e.g., videos or audio).
- Provide captions for any pre-recorded audio that is time-based (e.g., syncing audio with text-based captions)
- Provide an audio description for pre-recorded videos (e.g., an animation without audio showing how to tie your shoes)
- Provide captions for all live video content (e.g., a news organization's live broadcast)

For more details, visit WCAG 1.2 – Time-based media.

[Image: Example of a live news broadcast with closed captioning]

4. Headings and labels

When headings and labels aren't used properly or aren't used at all, users have a hard time processing content and completing tasks — from reading an article to filling out a form.
- Provide clear headings and labels that describe the page content or input field
- Ensure labels and headings that visually convey structure and relationships are programmatically associated with their content (e.g., the page heading uses an <h1> tag)
- Ensure individual cells in a data table are programmatically associated with their parent column or row header (e.g., the cell named "Blue" is associated with its parent column named "Colors")

For more details, visit WCAG 1.3 – Adaptable. A minimal markup sketch illustrating these associations follows below.

[Image: Ensure the heading tags properly convey the web-page's content structure]
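To make the multimedia, heading, and label guidance above concrete, here is a minimal, hypothetical HTML fragment — invented for this article, not taken from WCAG or any real product — showing the kinds of programmatic associations assistive technologies rely on: a single <h1> describing the page, a <label> tied to its input with for/id, table cells associated with their column headers via <th scope="col">, and a pre-recorded video with a caption track. The heading color pairs a commonly cited gray (#595959) with a white background, which is generally reported as roughly a 7:1 ratio and comfortably clears the 4.5:1 minimum from strategy 1; verify exact ratios with a contrast checker.

  <!-- Illustrative fragment only; element content, names, and file paths are invented -->
  <main>
    <!-- One h1 describing the page; ~7:1 contrast of #595959 on white -->
    <h1 style="color: #595959; background: #ffffff;">Monthly sales report</h1>

    <!-- The label is programmatically associated with the field via for/id -->
    <label for="report-month">Report month</label>
    <input id="report-month" name="report-month" type="month">

    <!-- Data cells are associated with their column headers via th + scope -->
    <table>
      <caption>Units sold by color</caption>
      <thead>
        <tr>
          <th scope="col">Colors</th>
          <th scope="col">Units sold</th>
        </tr>
      </thead>
      <tbody>
        <tr>
          <td>Blue</td>
          <td>42</td>
        </tr>
      </tbody>
    </table>

    <!-- Pre-recorded video with a captions track (strategy 3); the .vtt file is hypothetical -->
    <video controls>
      <source src="how-to-tie-shoes.mp4" type="video/mp4">
      <track kind="captions" src="how-to-tie-shoes.en.vtt" srclang="en" label="English">
    </video>
  </main>

The design choice here is simply to let native HTML carry the relationships: when the structure is expressed in markup rather than only visually, screen readers can announce "Colors, Blue" or "Report month, edit" without any extra scripting.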
5. Support screen readers

Many people use screen readers, such as NVDA or JAWS, so they can use any website or software. Ensuring digital products are compatible with screen readers seems challenging, but there are ways to proactively support screen readers.
- Use semantic HTML elements, like <button>, and avoid non-semantic elements, like <div> and <span>, that don't contain built-in meaning
- Provide a text alternative for meaningful images using the alt attribute
- Ensure all interactive elements have a corresponding name, role, and value
- Ensure the focus indicator is always visible so the user knows where they are in the interface

For more details, visit WCAG 4.1 – Compatible. (A markup sketch applying these points appears after the tools section below.)

Tools to use for an accessible design process

To help meet WCAG 2.1, Level AA standards, there are tools you can incorporate in your UX process to create accessible designs and hand them over to your development team.

1. WAVE

WAVE is a free accessibility evaluation tool developed by WebAIM. WAVE offers multiple browser extensions for Chrome, Firefox, and Edge. After adding the extension to your preferred browser, all you have to do is visit your website and activate WAVE to get an automated evaluation. Note: Automated tools only find about a third of accessibility issues, so a manual evaluation is still needed after using WAVE.

[Image: WAVE is a free accessibility evaluation tool that works on any live website]

Pros of WAVE:
- Offers multiple extensions for your preferred browser
- Allows you to easily see the tab order and structure of a web page
- Provides recommendations on how to correct accessibility issues

Cons of WAVE:
- The icons representing potential accessibility issues are overwhelming, and it can be difficult to tell which icon goes with which element

2. Stark

Stark offers a plug-in for both Figma and Sketch that designers can use to thoroughly check and annotate UX designs for developer hand-off. This ensures that developers know the specifics for alt text, tab order, and heading levels.

[Image: Stark's plug-in for Figma allows you to annotate designs and check color contrast ratios]

Pros of Stark:
- Offers a range of features to check designs for contrast and typography issues, as well as annotate for developer hand-off
- Provides color suggestions when the color contrast ratio doesn't meet WCAG, Level AA requirements
- Provides a vision simulator to test designs against different types of color blindness (e.g., protanopia)

Cons of Stark:
- It can be difficult to select the correct layer when annotating designs
- Some advanced features in the plug-in are paid to use

3. JAWS

JAWS (Job Access with Speech) is one of the most commonly used screen readers. JAWS allows users with limited vision to access and use digital products, and it's beneficial to test your digital product with JAWS to ensure it's compatible with screen readers. Note: JAWS' free version only allows you to use it for 45 minutes before restarting, and it is best used on Chrome or Firefox browsers.

[Image: JAWS screen reader running on a Mac through the Parallels virtual machine]

Pros of JAWS:
- Ability to highly customize the JAWS settings, such as the voice synthesizer
- Provides output both through audio and braille devices
- Freedom Scientific offers trainings to learn how to use JAWS

Cons of JAWS:
- Includes a steep learning curve compared to other screen readers
- Only accessible through a Windows operating system
- Has a limited free version; you must pay to access the full version
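As a concrete illustration of strategy 5 (and of the keyboard and focus guidance earlier), here is a minimal, hypothetical fragment — invented for this example, not drawn from the article or any real codebase — of the kind of markup you could point WAVE or JAWS at. It uses native <button> elements (keyboard-operable by default, with name and role exposed automatically), an aria-pressed state for a toggle, alt text on a meaningful image, and a clearly visible focus indicator; the colors and file names are assumptions.

  <!-- Illustrative fragment only; names, text, colors, and file paths are invented -->
  <style>
    /* Keep the focus indicator clearly visible for keyboard users */
    button:focus-visible {
      outline: 3px solid #005a9c; /* example color; check its contrast against the page background */
      outline-offset: 2px;
    }
  </style>

  <!-- A native button exposes its name ("Add to cart") and role (button) to screen readers automatically -->
  <button type="button">Add to cart</button>

  <!-- A toggle exposes its state via aria-pressed; a script would flip this value on activation -->
  <button type="button" aria-pressed="false">Mute notifications</button>

  <!-- Meaningful image with a text alternative; a purely decorative image would use alt="" instead -->
  <img src="delivery-truck.png" alt="Delivery truck arriving at a warehouse">

  <!-- Avoid this pattern: a div "button" has no role, no accessible name, and is not keyboard-focusable by default -->
  <!-- <div class="button" onclick="addToCart()">Add to cart</div> -->

Running an automated checker like WAVE over markup in this style tends to surface the remaining gaps (missing labels, low contrast), while a quick pass with JAWS or NVDA confirms that names, roles, and states are actually announced.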
Conclusion

Ready or not, the European Accessibility Act (EAA) will be enforced on June 28, 2025. So any company (public or private) that provides ICT or operates digital products that an EU citizen can use is subject to the EAA. Any new product introduced to the market must comply with WCAG 2.1, Level AA to meet the requirements of the EAA and EN 301 549.

Web accessibility is no longer optional — it's essential. Though building inclusive products is the right thing to do, you may be subject to legal action if you avoid it. Is your digital product ready?

Sources
- WCAG by Level Access, "EN 301 549 Conformance: An Overview"
- Accessibility Works, "European Accessibility Act Compliance Requirements: The Next GDPR"
- WCAG by Level Access, "The European Accessibility Act: Technical Aspects of Compliance"
- European Union, "Types of legislation"
- W3C Web Accessibility Initiative, "WCAG 2 Overview"
- W3C, "WCAG 2.1 Guidelines"

"Ready or not, the EAA is here" was originally published in UX Collective on Medium, where people are continuing the conversation by highlighting and responding to this story.
  • 5 AI prompts to put serious money in your pocket

    www.foxnews.com
    [Video: A majority of small businesses are using artificial intelligence and finding out it can save time and money.]

So, you want to start making money using AI but you're not trying to build Skynet or learn 15 coding languages first? Good, because neither am I. You don't need to become the next Sam Altman or have a Ph.D. in machine learning to turn artificial intelligence into real income. What you do need is curiosity, a dash of creativity, and the right prompts.

💸 Enter to win $500 for you and $500 for your favorite person or charity in our Pay It Forward Sweepstakes. Hurry, ends soon!

I've pulled together five powerful, practical prompts you can throw into ChatGPT (or your AI tool of choice) to help you start earning extra cash this week. These aren't pie-in-the-sky dreams or $10K-a-month YouTube ad schemes. They're doable, even if your calendar is already packed. Let's get to it.

1. Fast-Track Your Freelance Life

Prompt to use: "Act as a freelance business coach. Suggest 3 services I can offer on Fiverr or Upwork using AI tools like ChatGPT, Midjourney or Canva. I have [insert skill: writing/design/admin/accounting/managerial] experience."

Why this works: Freelance work is exploding right now. Platforms like Upwork and Fiverr are filled with small businesses and entrepreneurs who need help—but don't have the budget to hire full-time staff. If you've got any kind of professional background, you can use AI tools to turbocharge your services. Writing blog posts? ChatGPT can give you a draft. Creating logos or social media templates? Midjourney and Canva are your new best friends. You don't need a team. You don't need fancy software. You just need a good prompt and the confidence to say, "Yes, I can do that." AI helps you scale what you already know how to do.

[Image: A man is pictured with a smartphone and laptop computer on January 31, 2019. (Neil Godwin/Future via Getty Images)]

2. Make Product Descriptions Sexy Again

Prompt to use: "Rewrite this Etsy or Shopify product description to make it more compelling and SEO-friendly. Target audience: [insert group]. Here's the original: [paste description]."

Why this works: Let's face it—most product descriptions online are a snooze. But good copy sells. Whether you're running your own shop or helping someone else with theirs, compelling product descriptions convert clicks into customers. Use ChatGPT to punch up the language, fine-tune for SEO, and speak directly to your ideal buyer. Remember: people don't just want to buy a weird mug. They want to buy what it says about them. That's where a smart rewrite can turn browsers into buyers.

3. Social Posts That Sell

Prompt to use: "Create 5 attention-grabbing Instagram captions to promote this [product/service]. Keep the tone [fun, confident, expert] and include a strong call to action."

Why this works: We live in a scroll-happy world. Your social captions need to grab attention in less than three seconds. But not everyone's a copywriter—and not everyone has time to be. AI can help you crank out engaging content in the tone and style that fits your brand. Add a great photo, post consistently, and you're suddenly a one-person content agency without the overhead (or endless Zoom meetings).
A photo taken on October 4, 2023 in Manta, near Turin, shows a smartphone and a laptop displaying the logos of the artificial intelligence OpenAI research company and ChatGPT chatbot. (MARCO BERTORELLO/AFP via Getty Images)If you’re managing social for clients or your own biz, this prompt is gold. Use it to build content calendars, write reels scripts, or even draft ad copy.4. Polite Emails That Save You MoneyPrompt to use:"Write a short, polite email to ask for a lower rate or discount on [tool/service/platform]. Mention that I’m a loyal customer comparing alternatives."Why this works:Negotiating discounts doesn’t always feel comfortable but it absolutely works. Companies often have unpublished deals, especially for longtime users or small businesses. And customer service reps? They're human beings. A kind, well-written email might be all it takes to get a discount on that software you’re using every month.20 TECH TRICKS TO MAKE LIFE BETTER, SAFER OR EASIERI’ve personally saved hundreds of dollars just by sending quick, respectful emails like this. AI can help you strike the perfect tone confident but kind, assertive but not pushy.5. Your Passive Income KitPrompt to use:"Give me 3 high-demand, low-competition ideas for a short e-book or low-content book I can sell on Amazon. I have experience in [insert topic]."Why this works:You have knowledge people want. Package it. Sell it. Repeat. Whether it’s a short guide on starting a backyard garden or a workbook for productivity hacks, e-books and low-content books (like journals or planners) sell surprisingly well. And AI can help you brainstorm ideas, outline chapters, even draft content to polish up. In this photo illustration the logo of Apple Mail Programme Mail can be seen on a smartphone next to a finger on March 27, 2024 in Berlin, Germany. (Photo Illustration by Thomas Trutschel/Photothek via Getty Images)Upload it to Amazon KDP or Gumroad, and now you’ve got a digital product that can earn money in your sleep. People pay for convenience, and you have life experience worth sharing.Final ThoughtYou don’t need to master AI to start earning with it. You just need to start using it. These five prompts are a low-risk, high-potential way to get your feet wet. And if you need a hand turning these sparks into something bigger, I’m here.I built my multimillion-dollar business with no investors and no debt. I’ve done this without a big team or expensive consultants. And I’d love to help you do the same.Drop me a note. I read every one.Get tech-smarter on your scheduleAward-winning host Kim Komando is your secret weapon for navigating tech.National radio: Airing on 500+ stations across the US - Find yours or get the free podcast.Daily newsletter: Join 650,000 people who read the Current (free!)Watch: On Kim’s YouTube channelCopyright 2025, WestStar Multimedia Entertainment. All rights reserved. 
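    If you would rather script these prompts than paste them into a chat window, the same templates drop straight into an API call. Below is a minimal sketch, assuming the OpenAI Python SDK and an OPENAI_API_KEY environment variable; the model name and the filled-in service are illustrative placeholders, not part of the column.
# Minimal sketch, not from the column: sending prompt #4 to the OpenAI
# chat-completions API instead of pasting it into the ChatGPT web app.
# Assumes `pip install openai` and an OPENAI_API_KEY environment variable;
# the model and the service name below are placeholders, not recommendations.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

service = "my project-management tool"  # hypothetical [tool/service/platform]
prompt = (
    f"Write a short, polite email to ask for a lower rate or discount on {service}. "
    "Mention that I'm a loyal customer comparing alternatives."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any chat-capable model works here
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)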
  • Google Announces Live Translation Yet Again, This Time in Google Meet

    It's that time again, for Google to announce that real-time translation has come to one of its communication apps. This time, it's Google Meet, which can translate between English and Spanish as you speak in a video call. If that sounds familiar, it's because it's not the first time Google has announced something like this.
    Google Translate has had features that let you speak to someone in another language in real time for a while. For example, back in 2019, there was a real-time translation feature called Interpreter Mode built into Google Assistant. It's also been possible on Pixel phones for a while (and even Samsung phones). Most of these, however, have been either text-to-text or speech-to-text. You can use the Google Translate app for a speech-to-speech experience, but like with Google Assistant's Interpreter Mode, that only works in person.
    So, what's different here? Well, during its I/O keynote, Google demoed two users in a video chat speaking in their native languages. Google Meet then translates and speaks the translation back in a relatively human-sounding voice. This new feature is available now for Google Workspace subscribers (plans start at $7/month), but unfortunately, it's not in the free version. On the plus side, additional languages are promised to start rolling out in just a few weeks.
    While I haven't tested it out yet, it does seem to be a more convenient way to access a feature that you might otherwise have to hack together with another tab, or by opening your phone and holding it up to a speaker. Plus, it can be a bit more natural to hear translations spoken out for you, rather than having to rely on translated captions. I do wonder whether it can keep up with the natural speed and flow of a conversation, though—nobody likes to feel interrupted.
  • Even Realities G1 Glasses Review: Smart, Subtle, and Perfect for Father’s Day

    PROS:
    Discreet, elegant, and unobtrusive design that doesn't scream "tech"
    Lightweight and comfortable premium frame
    Focuses on essential experiences without the unnecessary cruft
    Impressive transcription and teleprompter features
    Long battery life and effortless charging case design
    CONS:
    No speakers for calls or audio feedback (especially during navigation)
    Temple tips touch controls can be a bit cumbersome
    A bit expensive

    RATINGS:
    AESTHETICS
    ERGONOMICS
    PERFORMANCE
    SUSTAINABILITY / REPAIRABILITY
    VALUE FOR MONEY
    EDITOR'S QUOTE:
    With a simple design and useful features, the Even Realities G1 smart glasses prove that you don't need all the bells and whistles to provide an experience.
    Every day, we’re flooded with more information than our already overworked minds can handle. Our smartphones and computers put all this information at our fingertips, connecting us to the rest of the world while ironically disconnecting us from the people around us. Smart glasses and XR headsets promise to bring all this information right in front of us, bridging the gap that divides physical and virtual realities. And yet at the same time, they erect a wall that separates us from the here and now.
    It’s against this backdrop that Even Realities chose to take a bold step in the opposite direction. In both form and function, the Even Realities G1 smart glasses cut down on the cruft and promise a distilled experience that focuses only on what you really need to get through a busy day. More importantly, it delivers it in a minimalist design that doesn’t get in your way. Or at least that’s the spiel. Just in time for the upcoming Father’s Day celebration, we got to test what the Even Realities G1 has to offer, especially to some of the busiest people in our families: the dads juggling work responsibilities while trying to stay present for their loved ones.
    Designer: Even Realities
    Click Here to Buy Now: $599. Exclusive Father’s Day Special – Get 50% Off the G1 Clip + Clip Pouch! Hurry, offer ends June 15, 2025.
    Aesthetics

    You probably wouldn’t even be able to tell the Even Realities G1 is wearable tech if you meet someone on the street wearing a pair. Sure, they might look like slightly retro Pantos, but they’re a far cry from even the slimmest XR glasses from the likes of Xreal or Viture. You can clearly see the eyes of the person wearing them, and the tech is practically invisible, which is exactly the point.
    The design of the Even Realities G1 is on the plain and minimal side, a stark contrast to the majority of smart glasses and XR/AR headsets currently in the market, even those claiming to be fashionable and stylish. Sure, it’s not going to compete with high-end luxury spectacles, but they’re not entirely off the mark either. Unless you look really closely, you might simply presume them to be a pair of thick-framed glasses.

    The form of the glasses might be simple, but their construction is anything but. The frame is made from magnesium alloy with a coating that’s fused with sandstone, while the temples use a titanium alloy on the outer sides and soft silicone on the inner surfaces. The mixture of quality materials not only gives the Even Realities G1 a premium character but also a lightweight form that’s only ever so slightly heavier than your run-of-the-mill prescription eyeglasses.
    While the G1 mostly looks like normal eyewear, the temple tips are dead giveaways that things are not what they seem. The blocky, paddle-shaped tips that house batteries and electronics are definitely larger than what you’d find on most glasses. They’re not obnoxiously big, but they do tend to stick out a bit, and they’re hard to “unsee” once you’ve noticed their presence.
    Despite looking quite aesthetic, the Even Realities G1 isn’t pretending to be some posh fashion accessory. After all, the circular G1A and rectangular G1B options hardly cover all possible eyewear designs, and the limited color selection won’t suit everyone’s tastes. Rather than something you flaunt or call attention to, these smart glasses are designed to be an “everyday wear” and disappear into the background, making tech invisible without making it unusable, perfect for the dad who wants to stay connected without looking like he’s wearing a gadget at the family barbecue.
    Ergonomics

    If you’ve ever tried any of those hi-tech wearables promising the next wave of computing, then you’d probably know that you’d never wear any of those glasses or visors for more than just an hour or two every day. They may have impressive technologies and apps, but they become practically useless once you take them off, especially when you have to step out into the real world.
    In contrast, the Even Realities G1 is something you’d be able to wear for hours on end, indoors or outdoors. Made from lightweight materials with a construction that even throws away screws to reduce the heft, it’s almost mind-blowing to think that the glasses house any electronics at all. This level of comfort is honestly the G1’s most important asset, because it allows people to experience its smart features far longer than any Quest or Viture.

    When it comes to eyewear, however, prescription lenses have always been a sore point for many consumers, and this is no exception. Because it integrates waveguide optics into the lens, you’ll have to pay extra to have customized prescription lenses when you buy an Even Realities G1. It can be a bit nerve-wracking to ensure you get all the measurements and figures right, especially since you can’t return or exchange glasses with customized lenses.
    While the G1 eyeglasses are definitely comfortable to wear, the same can’t exactly be said when it comes to manually interacting with them. While most smart glasses and headsets have controls near your temples, the G1’s touch-sensitive areas are at the temple tips, which would be sitting behind your ears when you’re wearing the glasses. They might feel awkward to reach, and those with long hairstyles might find it difficult to use. Fortunately, you will rarely touch those tips except to activate some functions, but it can still be an unsatisfactory experience when you do.
    Performance

    The Even Realities G1 takes a brilliantly focused approach to smart eyewear, prioritizing elegant design and practical functionality over unnecessary tech bloat. The 640×200 green monochrome display may seem modest, but it’s a deliberate choice that enables the G1 to maintain a sleek, stylish profile. The absence of cameras and speakers isn’t a limitation but a thoughtful design decision that enhances both wearability and privacy, allowing users to seamlessly integrate this technology into their daily lives without social awkwardness. The magic of the G1 lies in its delivery of information directly to your field of vision in a way that not only delights but also transforms how you interact with digital content.

    The core Even Realities G1 experience revolves around bringing only critical information to your attention and keeping distractions away, all without disconnecting you from reality and the people around you. Its text-centric interface, displayed by two micro-LED displays, one on each lens, ensures that information is distilled down to its most essential. And there’s no denying the retro charm of a green dot-matrix screen in front of your eyes, even if the color won’t work well against light or bright objects.
    The Even Realities G1 experience starts with the dashboard, which you can summon just by tilting your head up a bit, an angle that you can set on the companion mobile app. One side shows the date and time, temperature, number of notifications, and your next appointment. The other side can be configured to show one of your saved quick notes, news, stocks, or even your current location. None of these items are interactive, and you’ll have to dive into the mobile app to actually get any further information.

    With Father’s Day approaching, it’s worth noting how the G1’s floating heads-up display, visible only to the wearer, helps dads stay effortlessly connected, organized, and present. The QuickNote and Calendar features are particularly valuable for fathers juggling work and family responsibilities, allowing them to process their to-do lists perfectly on schedule without missing a beat of family time. Spending quality time with your child then suddenly remembering you need to buy batteries on your next errand run? No more frantically scampering for pen and paper or even your phone; just tap and speak.
    Of course, the smart glasses really shine when it comes to the, well, smart functionality, most of which unsurprisingly revolve around words, both spoken and displayed. Transcription, which is used when making Quick Notes, records your voice and saves it alongside the transcribed text. Fathers who find themselves in never-ending meetings no longer need to worry about missing a beat. Not only do they get to keep notes, but they also receive a summary and recap thanks to the G1’s AI capabilities, a game-changer for busy dads who need to process information efficiently.

    Translation can make international trips quite fun, at least for some interactions, as you’ll be able to see actual translated captions floating in the air like subtitles on a video. Dads who give a lot of talks, business presentations, interviews, or broadcast videos will definitely love the Teleprompter feature, which can advance the script just based on the words you’re speaking. No more worrying about missing important points during that big presentation, leaving more mental bandwidth for what really matters. It’s also perfect for a captivating Career Day show that will do your kid proud.

    The accuracy of Even Realities’ speech recognition and AI is fairly good, though there are times when it will require a bit of patience and understanding. There’s a noticeable delay when translating what people say in real time, for example, and it might miss words if the person is speaking too quickly. Navigation can be hit or miss, depending on your location, and the visual direction prompts are not always reliable.

    The latter is also one of the cases where the absence of built-in speakers feels a bit more pronounced. There’s no audio feedback, which could be useful for guided turn-by-turn navigation. The Even AI assistant can hear you, but it can’t talk back to you. Everything is delivered only through text you have to read, which might not always be possible. Admittedly, adding such hardware, no matter how small, would also add weight to the glasses, so Even Realities chose its battles wisely.

    The Even Realities G1 is advertised to last 1.5 days, and it does indeed last more than a day. The stylish wireless charging case, which has a built-in 2,000mAh battery, extends that uptime to five days. Charging the glasses is as simple as putting them inside the case, with no need to align any contact points, as long as you remember to fold the left arm before the right arm. Oddly enough, there’s no battery level indicator on the glasses, even in the dashboard HUD.
    Even Realities focused on making the G1 simple, both in design and in operation. Sometimes even to the point of oversimplification. To reduce complexity, for example, each side of the glasses connects to a smartphone separately via Bluetooth, which unfortunately increases the risk of the two sides being out of sync if one or the other connection drops. Turning the glasses into shades is a simple case of slapping on clip-on shades that are not only an additional expense but also something you could lose somewhere.
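    That dual-link design is easier to picture with a rough sketch. The code below is purely illustrative and assumes nothing about Even Realities’ actual protocol or SDK: it uses the generic bleak BLE library, made-up device addresses, and simply flags the out-of-sync condition the review describes when one link drops while the other stays up.
# Purely illustrative sketch (no real Even Realities protocol or addresses):
# each arm of the G1 is described as holding its own Bluetooth LE link to the
# phone, so an app has to notice when one link drops while the other stays up.
import asyncio
from bleak import BleakClient

LEFT_ARM_ADDR = "AA:BB:CC:DD:EE:01"   # hypothetical BLE addresses, illustration only
RIGHT_ARM_ADDR = "AA:BB:CC:DD:EE:02"

async def watch_arm(label: str, address: str, status: dict) -> None:
    """Keep one arm's connection alive and record whether it is currently up."""
    while True:
        try:
            async with BleakClient(address) as client:  # connects on enter
                status[label] = True
                while client.is_connected:
                    await asyncio.sleep(1.0)
        except Exception:
            pass  # connection failed or dropped
        status[label] = False
        await asyncio.sleep(2.0)  # back off briefly before reconnecting

async def main() -> None:
    status = {"left": False, "right": False}
    watchers = [
        asyncio.create_task(watch_arm("left", LEFT_ARM_ADDR, status)),
        asyncio.create_task(watch_arm("right", RIGHT_ARM_ADDR, status)),
    ]
    # Poll for the out-of-sync condition the review warns about: one side up, one down.
    for _ in range(60):
        if status["left"] != status["right"]:
            print("Arms out of sync:", status)
        await asyncio.sleep(1.0)
    for w in watchers:
        w.cancel()
    await asyncio.gather(*watchers, return_exceptions=True)

asyncio.run(main())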
    Sustainability

    By cutting down on the volume of the product, Even Realities also helps cut down waste material, especially the use of plastics. The G1 utilizes more metals than plastic, not only delivering a premium design but also preferring more renewable materials. The company is particularly proud of its packaging as well, which uses 100% recyclable, eco-friendly cardboard.
    While magnesium and titanium alloys contribute to the durability of the product, the Even Realities G1 is not exactly what you might consider to be a weather-proof piece of wearable tech. It has no formal IP rating, and the glasses are only said to be resistant to splashes and light rain. It can accompany you on your runs, sure, but you’ll have to treat it with much care. Not that it will have much practical use during your workouts in the first place.
    Value

    Discreet, useful, and simple, the Even Realities G1 smart glasses proudly stand in opposition to the literal heavyweights of the smart eyewear market that are practically strapping a computer on your face. It offers an experience that focuses on the most important functions and information you’d want to have in front of your eyes and pushes unnecessary distractions out of your sight. Most importantly, however, it keeps the whole world clearly in view, allowing you to connect to your digital life without disconnecting you from the people around you.

    The Even Realities G1 would almost be perfect for this hyper-focused use case if not for its price tag. At $599, it’s easily one of the more expensive pairs of smart spectacles you’ll see on the market, and that’s only for the glasses themselves. For custom prescription lenses, you need to add another $150 on top, not to mention the $50 (normally $100) clip-on shades for those extra bright days. Given its limited functionality, the G1 definitely feels a bit overpriced. But when you consider how lightweight, distraction-free, and useful it can be, it comes off more as an investment for the future.
    For family and friends looking for a meaningful tech gift this Father’s Day, the G1 offers something truly unique: a way to stay on top of work responsibilities while remaining fully present for family moments. Whether capturing quick thoughts during a child’s soccer game or discreetly checking calendar reminders during family dinner, these glasses help dads maintain that delicate balance between connectivity and presence.
    Verdict

    It’s hard to escape the overabundance of information that we deal with every day, both from the world around us and from our own stash of notes and to-do lists. Unfortunately, the tools that we always have with us, our smartphones, computers, and smartwatches, are poor guardians against this flood. And now smart glasses are coming, promising access to all of that and threatening to further drown us with information we don’t really need.

    The Even Realities G1 is both a breath of fresh air and a bold statement against that trend. Not only is it lightweight and comfortable, but it even looks like normal glasses! Rather than throw everything and the kitchen sink into it, its design and functionality are completely intentional, focusing only on essential experiences and features to keep you productive. It’s not trying to turn you into Tony Stark, but it will help make you feel like a superhero as you breeze through your tasks while still being present to the people who really matter the most in your life.

    For the dad who wants to stay connected without being distracted, who needs to manage information without being overwhelmed by it, the Even Realities G1 might just be the perfect Father’s Day gift: a tool that helps him be both the professional he needs to be and the father he wants to be, all without missing a moment of what truly matters.
    Click Here to Buy Now: $599. Exclusive Father’s Day Special – Get 50% Off the G1 Clip + Clip Pouch! Hurry, offer ends June 15, 2025.
    The post Even Realities G1 Glasses Review: Smart, Subtle, and Perfect for Father’s Day first appeared on Yanko Design.