• Looking Back at Two Classics: ILM Deploys the Fleet in ‘Star Trek: First Contact’ and ‘Rogue One: A Star Wars Story’

    Guided by visual effects supervisor John Knoll, ILM embraced continually evolving methodologies to craft breathtaking visual effects for the iconic space battles in First Contact and Rogue One.
    By Jay Stobie
    Visual effects supervisor John Knoll confers with modelmakers Kim Smith and John Goodson and the miniature of the U.S.S. Enterprise-E during production of Star Trek: First Contact.
    Bolstered by visual effects from Industrial Light & Magic, Star Trek: First Contact and Rogue One: A Star Wars Story propelled their respective franchises to new heights. While Star Trek Generations welcomed Captain Jean-Luc Picard’s crew to the big screen, First Contact stood as the first Star Trek feature that did not focus on its original captain, the legendary James T. Kirk. Similarly, though Rogue One immediately preceded the events of Star Wars: A New Hope, it was set apart from the episodic Star Wars films and launched an era of storytelling outside of the main Skywalker saga that has gone on to include Solo: A Star Wars Story, The Mandalorian, Andor, Ahsoka, The Acolyte, and more.
    The two films also shared a key ILM contributor, John Knoll, who served as visual effects supervisor on both projects, as well as an executive producer on Rogue One. Currently ILM’s executive creative director and senior visual effects supervisor, Knoll – who also conceived the initial framework for Rogue One’s story – guided ILM as it brought its talents to bear on these sci-fi and fantasy epics. The work involved crafting two spectacular starship-packed space clashes – First Contact’s Battle of Sector 001 and Rogue One’s Battle of Scarif. Although these iconic installments were released roughly two decades apart, they represent a captivating case study of how ILM’s approach to visual effects has evolved over time. With this in mind, let’s examine the films’ unforgettable space battles through the lens of fascinating in-universe parallels and the ILM-produced fleets that face off near Earth and Scarif.
    A final frame from the Battle of Scarif in Rogue One: A Star Wars Story.
    A Context for Conflict
    In First Contact, the United Federation of Planets – a 200-year-old interstellar government consisting of more than 150 member worlds – braces itself for an invasion by the Borg – an overwhelmingly powerful collective composed of cybernetic beings who devastate entire planets by assimilating their biological populations and technological innovations. The Borg send only a single vessel, a massive cube containing thousands of hive-minded drones and their queen, pushing the Federation’s Starfleet defenders to Earth’s doorstep. Conversely, in Rogue One, the Rebel Alliance – a fledgling coalition of freedom fighters – seeks to undermine and overthrow the stalwart Galactic Empire – a totalitarian regime preparing to tighten its grip on the galaxy by revealing a horrifying superweapon. A rebel team infiltrates a top-secret vault on Scarif in a bid to steal plans to that battle station, the dreaded Death Star, with hopes of exploiting a vulnerability in its design.
    On the surface, the situations could not seem more disparate, particularly in terms of the Federation’s well-established prestige and the Rebel Alliance’s haphazardly organized factions. Yet, upon closer inspection, the spaceborne conflicts at Earth and Scarif are linked by a vital commonality. The threat posed by the Borg is well-known to the Federation, but the sudden intrusion upon its space takes its defenses by surprise. Starfleet assembles any vessel within range – including antiquated Oberth-class science ships – to intercept the Borg cube in the Typhon Sector, only to be forced back to Earth on the edge of defeat. The unsanctioned mission to Scarif undertaken by Jyn Erso and Cassian Andor, and the sudden need to take down the planet’s shield gate, propel the Rebel Alliance fleet into rushing to the rescue with everything from the flagship Profundity to GR-75 medium transports. Whether Federation or Rebel Alliance, these fleets gather in last-ditch efforts to oppose enemies who would embrace their eradication – the Battles of Sector 001 and Scarif are fights for survival.
    From Physical to Digital
    By the time Jonathan Frakes was selected to direct First Contact, Star Trek’s reliance on constructing traditional physical models for its features was gradually giving way to innovative computer graphics models, resulting in the film’s use of both techniques. “If one of the ships was to be seen full-screen and at length,” associate visual effects supervisor George Murphy told Cinefex’s Kevin H. Martin, “we knew it would be done as a stage model. Ships that would be doing a lot of elaborate maneuvers in space battle scenes would be created digitally.” In fact, physical and CG versions of the U.S.S. Enterprise-E appear in the film, with the latter being harnessed in shots involving the vessel’s entry into a temporal vortex at the conclusion of the Battle of Sector 001.
    Despite the technological leaps ILM pioneered in the decades between First Contact and Rogue One, the studio still considered filming physical miniatures for certain ship-related shots in the latter film. The feature’s fleets were ultimately created digitally to allow for changes throughout post-production. “If it’s a photographed miniature element, it’s not possible to go back and make adjustments. So it’s the additional flexibility that comes with the computer graphics models that’s very attractive to many people,” John Knoll relayed to writer Jon Witmer at American Cinematographer’s TheASC.com.
    However, Knoll aimed to develop computer graphics that retained the same high-quality details as their physical counterparts, leading ILM to employ a modern approach to a time-honored modelmaking tactic. “I also wanted to emulate the kit-bashing aesthetic that had been part of Star Wars from the very beginning, where a lot of mechanical detail had been added onto the ships by using little pieces from plastic model kits,” explained Knoll in his chat with TheASC.com. For Rogue One, ILM replicated the process by obtaining such kits, scanning their parts, building a computer graphics library, and applying the CG parts to digitally modeled ships. “I’m very happy to say it was super-successful,” concluded Knoll. “I think a lot of our digital models look like they are motion-control models.”
    John Knoll confers with Kim Smith and John Goodson and the miniature of the U.S.S. Enterprise-E during production of Star Trek: First Contact.
    Legendary Lineages
    In First Contact, Captain Picard commanded a brand-new vessel, the Sovereign-class U.S.S. Enterprise-E, continuing the celebrated starship’s legacy in terms of its famous name and design aesthetic. Designed by John Eaves and developed into blueprints by Rick Sternbach, the Enterprise-E was built into a 10-foot physical model by ILM model project supervisor John Goodson and his shop’s talented team. ILM infused the ship with extraordinary detail, including viewports equipped with backlit set images from the craft’s predecessor, the U.S.S. Enterprise-D. For the vessel’s larger windows, namely those associated with the observation lounge and arboretum, ILM took a painstakingly practical approach to match the interiors shown with the real-world set pieces. “We filled that area of the model with tiny, micro-scale furniture,” Goodson informed Cinefex, “including tables and chairs.”
    Rogue One’s rebel team initially traversed the galaxy in a U-wing transport/gunship, which, much like the Enterprise-E, was a unique vessel that nonetheless channeled a certain degree of inspiration from a classic design. Lucasfilm’s Doug Chiang, a co-production designer for Rogue One, referred to the U-wing as the film’s “Huey helicopter version of an X-wing” in the Designing Rogue One bonus featurette on Disney+ before revealing that, “Towards the end of the design cycle, we actually decided that maybe we should put in more X-wing features. And so we took the X-wing engines and literally mounted them onto the configuration that we had going.” Modeled by ILM digital artist Colie Wertz, the U-wing’s final computer graphics design subtly incorporated these X-wing influences to give the transport a distinctive feel without making the craft seem out of place within the rebel fleet.
    While ILM’s work on the Enterprise-E’s viewports offered a compelling view toward the ship’s interior, a breakthrough LED setup for Rogue One permitted ILM to obtain realistic lighting on actors as they looked out from their ships and into the space around them. “All of our major spaceship cockpit scenes were done that way, with the gimbal in this giant horseshoe of LED panels we got from VER, and we prepared graphics that went on the screens,” John Knoll shared with American Cinematographer’s Benjamin B and Jon D. Witmer. Furthermore, in Disney+’s Rogue One: Digital Storytelling bonus featurette, visual effects producer Janet Lewin noted, “For the actors, I think, in the space battle cockpits, for them to be able to see what was happening in the battle brought a higher level of accuracy to their performance.”
    The U.S.S. Enterprise-E in Star Trek: First Contact.
    Familiar Foes
    To transport First Contact’s Borg invaders, John Goodson’s team at ILM resurrected the Borg cube design previously seen in Star Trek: The Next Generation and Star Trek: Deep Space Nine, creating a nearly three-foot physical model to replace the one from the series. Art consultant and ILM veteran Bill George proposed that the cube’s seemingly straightforward layout be augmented with a complex network of photo-etched brass, a suggestion which produced a jagged surface and offered a visual that was both intricate and menacing. ILM also developed a two-foot motion-control model for a Borg sphere, a brand-new auxiliary vessel that emerged from the cube. “We vacuformed about 15 different patterns that conformed to this spherical curve and covered those with a lot of molded and cast pieces. Then we added tons of acid-etched brass over it, just like we had on the cube,” Goodson outlined to Cinefex’s Kevin H. Martin.
    As for Rogue One’s villainous fleet, reproducing the original trilogy’s Death Star and Imperial Star Destroyers centered upon translating physical models into digital assets. Although ILM no longer possessed A New Hope’s three-foot Death Star shooting model, John Knoll recreated the station’s surface paneling by gathering archival images, and as he spelled out to writer Joe Fordham in Cinefex, “I pieced all the images together. I unwrapped them into texture space and projected them onto a sphere with a trench. By doing that with enough pictures, I got pretty complete coverage of the original model, and that became a template upon which to redraw very high-resolution texture maps. Every panel, every vertical striped line, I matched from a photograph. It was as accurate as it was possible to be as a reproduction of the original model.”
    Knoll’s investigative eye continued to pay dividends when analyzing the three-foot and eight-foot Star Destroyer motion-control models, which had been built for A New Hope and Star Wars: The Empire Strikes Back, respectively. “Our general mantra was, ‘Match your memory of it more than the reality,’ because sometimes you go look at the actual prop in the archive building or you look back at the actual shot from the movie, and you go, ‘Oh, I remember it being a little better than that,’” Knoll conveyed to TheASC.com. This philosophy motivated ILM to combine elements from those two physical models into a single digital design. “Generally, we copied the three-footer for details like the superstructure on the top of the bridge, but then we copied the internal lighting plan from the eight-footer,” Knoll explained. “And then the upper surface of the three-footer was relatively undetailed because there were no shots that saw it closely, so we took a lot of the high-detail upper surface from the eight-footer. So it’s this amalgam of the two models, but the goal was to try to make it look like you remember it from A New Hope.”
    A final frame from Rogue One: A Star Wars Story.
    Forming Up the Fleets
    In addition to the U.S.S. Enterprise-E, the Battle of Sector 001 debuted numerous vessels representing four new Starfleet ship classes – the Akira, Steamrunner, Saber, and Norway – all designed by ILM visual effects art director Alex Jaeger. “Since we figured a lot of the background action in the space battle would be done with computer graphics ships that needed to be built from scratch anyway, I realized that there was no reason not to do some new designs,” John Knoll told American Cinematographer writer Ron Magid. Used in previous Star Trek projects, older physical models for the Oberth and Nebula classes were mixed into the fleet for good measure, though the vast majority of the armada originated as computer graphics.
    Over at Scarif, ILM portrayed the Rebel Alliance forces with computer graphics models of fresh designs, live-action versions of Star Wars Rebels’ VCX-100 light freighter Ghost and Hammerhead corvettes, and Star Wars staples. These ships face off against two Imperial Star Destroyers and squadrons of TIE fighters, and – upon their late arrival to the battle – Darth Vader’s Star Destroyer and the Death Star. The Tantive IV, a CR90 corvette more popularly referred to as a blockade runner, made its own special cameo at the tail end of the fight. As Princess Leia Organa’s personal ship, the Tantive IV received the Death Star plans and fled the scene, destined to be captured by Vader’s Star Destroyer at the beginning of A New Hope. And, while we’re on the subject of intricate starship maneuvers and space-based choreography…
    Although the First Contact team could plan visual effects shots with animated storyboards, ILM supplied Gareth Edwards with a next-level virtual viewfinder that allowed the director to select his shots by immersing himself among Rogue One’s ships in real time. “What we wanted to do is give Gareth the opportunity to shoot his space battles and other all-digital scenes the same way he shoots his live-action. Then he could go in with this sort of virtual viewfinder and view the space battle going on, and figure out what the best angle was to shoot those ships from,” senior animation supervisor Hal Hickel described in the Rogue One: Digital Storytelling featurette. Hickel divulged that the sequence involving the dish array docking with the Death Star was an example of the “spontaneous discovery of great angles,” as the scene was never storyboarded or previsualized.
    Visual effects supervisor John Knoll with director Gareth Edwards during production of Rogue One: A Star Wars Story.
    Tough Little Ships
    The Federation and Rebel Alliance each deployed “tough little ships” in their respective conflicts, namely the U.S.S. Defiant from Deep Space Nine and the Tantive IV from A New Hope. VisionArt had already built a CG Defiant for the Deep Space Nine series, but ILM upgraded the model with images gathered from the ship’s three-foot physical model. A similar tactic was taken to bring the Tantive IV into the digital realm for Rogue One. “This was the Blockade Runner. This was the most accurate 1:1 reproduction we could possibly have made,” model supervisor Russell Paul declared to Cinefex’s Joe Fordham. “We did an extensive photo reference shoot and photogrammetry re-creation of the miniature. From there, we built it out as accurately as possible.” Speaking of sturdy ships, if you look very closely, you can spot a model of the Millennium Falcon flashing across the background as the U.S.S. Defiant makes an attack run on the Borg cube at the Battle of Sector 001!
    Exploration and Hope
    The in-universe ramifications that materialize from the Battles of Sector 001 and Scarif are monumental. The destruction of the Borg cube compels the Borg Queen to travel back in time in an attempt to vanquish Earth before the Federation can even be formed, but Captain Picard and the Enterprise-E foil the plot and end up helping their 21st century ancestors make “first contact” with another species, the logic-revering Vulcans. The post-Scarif benefits take longer to play out for the Rebel Alliance, but the theft of the Death Star plans eventually leads to the superweapon’s destruction. The Galactic Civil War is far from over, but Scarif is a significant step in the Alliance’s effort to overthrow the Empire.
    The visual effects ILM provided for First Contact and Rogue One contributed significantly to the critical and commercial acclaim both pictures enjoyed, a victory reflecting the relentless dedication, tireless work ethic, and innovative spirit embodied by visual effects supervisor John Knoll and ILM’s entire staff. While being interviewed for The Making of Star Trek: First Contact, actor Patrick Stewart praised ILM’s invaluable influence, emphasizing, “ILM was with us, on this movie, almost every day on set. There is so much that they are involved in.” And, regardless of your personal preferences – phasers or lasers, photon torpedoes or proton torpedoes, warp speed or hyperspace – perhaps Industrial Light & Magic’s ability to infuse excitement into both franchises demonstrates that Star Trek and Star Wars encompass themes that are not competitive, but compatible. After all, what goes together better than exploration and hope?

    Jay Stobie is a writer, author, and consultant who has contributed articles to ILM.com, Skysound.com, Star Wars Insider, StarWars.com, Star Trek Explorer, Star Trek Magazine, and StarTrek.com. Jay loves sci-fi, fantasy, and film, and you can learn more about him by visiting JayStobie.com or finding him on Twitter, Instagram, and other social media platforms at @StobiesGalaxy.
A final frame from the Battle of Scarif in Rogue One: A Star Wars Story (Credit: ILM & Lucasfilm).

A Context for Conflict

In First Contact, the United Federation of Planets – a 200-year-old interstellar government consisting of more than 150 member worlds – braces itself for an invasion by the Borg, an overwhelmingly powerful collective of cybernetic beings who devastate entire planets by assimilating their biological populations and technological innovations. The Borg send only a single vessel, a massive cube containing thousands of hive-minded drones and their queen, yet it pushes the Federation’s Starfleet defenders to Earth’s doorstep. Conversely, in Rogue One, the Rebel Alliance – a fledgling coalition of freedom fighters – seeks to undermine and overthrow the stalwart Galactic Empire, a totalitarian regime preparing to tighten its grip on the galaxy by revealing a horrifying superweapon. A rebel team infiltrates a top-secret vault on Scarif in a bid to steal plans to that battle station, the dreaded Death Star, in hopes of exploiting a vulnerability in its design.

On the surface, the situations could not seem more disparate, particularly in terms of the Federation’s well-established prestige and the Rebel Alliance’s haphazardly organized factions. Yet, upon closer inspection, the spaceborne conflicts at Earth and Scarif are linked by a vital commonality. The threat posed by the Borg is well known to the Federation, but the sudden intrusion upon its space takes its defenses by surprise. Starfleet assembles any vessel within range – including antiquated Oberth-class science ships – to intercept the Borg cube in the Typhon Sector, only to be forced back to Earth on the edge of defeat.
The unsanctioned mission to Scarif by Jyn Erso (Felicity Jones) and Cassian Andor (Diego Luna), and the sudden need to take down the planet’s shield gate, spurs the Rebel Alliance fleet to rush to their rescue with everything from its flagship Profundity to GR-75 medium transports. Whether Federation or Rebel Alliance, these fleets gather in last-ditch efforts to oppose enemies who would embrace their eradication – the Battles of Sector 001 and Scarif are fights for survival.

From Physical to Digital

By the time Jonathan Frakes was selected to direct First Contact, Star Trek’s reliance on traditional physical models (many of which were built by ILM) for its features was gradually giving way to innovative computer graphics (CG) models, resulting in the film’s use of both techniques. “If one of the ships was to be seen full-screen and at length,” associate visual effects supervisor George Murphy told Cinefex’s Kevin H. Martin, “we knew it would be done as a stage model. Ships that would be doing a lot of elaborate maneuvers in space battle scenes would be created digitally.” In fact, physical and CG versions of the U.S.S. Enterprise-E appear in the film, with the latter being harnessed in shots involving the vessel’s entry into a temporal vortex at the conclusion of the Battle of Sector 001. Despite the technological leaps ILM pioneered in the decades between First Contact and Rogue One, the studio still considered filming physical miniatures for certain ship-related shots in the latter film. The feature’s fleets were ultimately created digitally to allow for changes throughout post-production. “If it’s a photographed miniature element, it’s not possible to go back and make adjustments.
So it’s the additional flexibility that comes with the computer graphics models that’s very attractive to many people,” John Knoll relayed to writer Jon Witmer at American Cinematographer’s TheASC.com. However, Knoll aimed to develop computer graphics that retained the same high-quality details as their physical counterparts, leading ILM to employ a modern approach to a time-honored modelmaking tactic. “I also wanted to emulate the kit-bashing aesthetic that had been part of Star Wars from the very beginning, where a lot of mechanical detail had been added onto the ships by using little pieces from plastic model kits,” explained Knoll in his chat with TheASC.com. For Rogue One, ILM replicated the process by obtaining such kits, scanning their parts, building a computer graphics library, and applying the CG parts to digitally modeled ships. “I’m very happy to say it was super-successful,” concluded Knoll. “I think a lot of our digital models look like they are motion-control models.”

John Knoll (second from left) confers with Kim Smith and John Goodson with the miniature of the U.S.S. Enterprise-E during production of Star Trek: First Contact (Credit: ILM).

Legendary Lineages

In First Contact, Captain Picard commanded a brand-new vessel, the Sovereign-class U.S.S. Enterprise-E, continuing the celebrated starship’s legacy in terms of its famous name and design aesthetic. Designed by John Eaves and developed into blueprints by Rick Sternbach, the Enterprise-E was built as a 10-foot physical model by ILM model project supervisor John Goodson and his shop’s talented team. ILM infused the ship with extraordinary detail, including viewports equipped with backlit set images from the craft’s predecessor, the U.S.S. Enterprise-D. For the vessel’s larger windows, namely those associated with the observation lounge and arboretum, ILM took a painstakingly practical approach to match the interiors shown with the real-world set pieces.
“We filled that area of the model with tiny, micro-scale furniture,” Goodson informed Cinefex, “including tables and chairs.” Rogue One’s rebel team initially traversed the galaxy in a U-wing transport/gunship, which, much like the Enterprise-E, was a unique vessel that nonetheless channeled a certain degree of inspiration from a classic design. Lucasfilm’s Doug Chiang, a co-production designer for Rogue One, referred to the U-wing as the film’s “Huey helicopter version of an X-wing” in the Designing Rogue One bonus featurette on Disney+ before revealing, “Towards the end of the design cycle, we actually decided that maybe we should put in more X-wing features. And so we took the X-wing engines and literally mounted them onto the configuration that we had going.” Modeled by ILM digital artist Colie Wertz, the U-wing’s final computer graphics design subtly incorporated these X-wing influences to give the transport a distinctive feel without making the craft seem out of place within the rebel fleet.

While ILM’s work on the Enterprise-E’s viewports offered a compelling view toward the ship’s interior, a breakthrough LED setup for Rogue One permitted ILM to obtain realistic lighting on actors as they looked out from their ships into the space around them. “All of our major spaceship cockpit scenes were done that way, with the gimbal in this giant horseshoe of LED panels we got from [equipment vendor] VER, and we prepared graphics that went on the screens,” John Knoll shared with American Cinematographer’s Benjamin B and Jon D. Witmer. Furthermore, in Disney+’s Rogue One: Digital Storytelling bonus featurette, visual effects producer Janet Lewin noted, “For the actors, I think, in the space battle cockpits, for them to be able to see what was happening in the battle brought a higher level of accuracy to their performance.”

The U.S.S. Enterprise-E in Star Trek: First Contact (Credit: Paramount).
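Circling back to the kit-bashing pipeline Knoll described – scanning plastic-kit parts into a CG library and scattering them across digital hulls – the core idea can be sketched in a few lines. Everything below (part names, sizes, placement logic) is invented for illustration; a production library would hold scanned 3D geometry rather than flat footprints.

```python
import random

# A toy "parts library": name -> (width, height) footprint in arbitrary units.
# Names and sizes are invented for illustration only.
PARTS = {
    "tank_wheel": (2, 2),
    "engine_block": (4, 3),
    "girder": (6, 1),
}

def greeble_panel(panel_w, panel_h, count, seed=0):
    """Scatter parts from the library across a flat hull panel,
    keeping every part fully inside the panel bounds."""
    rng = random.Random(seed)
    placements = []
    for _ in range(count):
        name = rng.choice(sorted(PARTS))
        w, h = PARTS[name]
        x = rng.uniform(0, panel_w - w)
        y = rng.uniform(0, panel_h - h)
        placements.append((name, x, y))
    return placements
```

A real pipeline would also rotate and scale parts, avoid overlaps, and instance the scanned meshes, but the appeal is the same one Knoll cites: mechanical detail accumulates quickly from a small vocabulary of reusable pieces.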
Familiar Foes

To transport First Contact’s Borg invaders, John Goodson’s team at ILM resurrected the Borg cube design previously seen in Star Trek: The Next Generation (1987) and Star Trek: Deep Space Nine (1993), creating a nearly three-foot physical model to replace the one from the series. Art consultant and ILM veteran Bill George proposed that the cube’s seemingly straightforward layout be augmented with a complex network of photo-etched brass, a suggestion which produced a jagged surface and offered a visual that was both intricate and menacing. ILM also developed a two-foot motion-control model for a Borg sphere, a brand-new auxiliary vessel that emerged from the cube. “We vacuformed about 15 different patterns that conformed to this spherical curve and covered those with a lot of molded and cast pieces. Then we added tons of acid-etched brass over it, just like we had on the cube,” Goodson outlined to Cinefex’s Kevin H. Martin.

As for Rogue One’s villainous fleet, reproducing the original trilogy’s Death Star and Imperial Star Destroyers centered upon translating physical models into digital assets. Although ILM no longer possessed A New Hope’s three-foot Death Star shooting model, John Knoll recreated the station’s surface paneling by gathering archival images, and as he spelled out to writer Joe Fordham in Cinefex, “I pieced all the images together. I unwrapped them into texture space and projected them onto a sphere with a trench. By doing that with enough pictures, I got pretty complete coverage of the original model, and that became a template upon which to redraw very high-resolution texture maps. Every panel, every vertical striped line, I matched from a photograph.
It was as accurate as it was possible to be as a reproduction of the original model.” Knoll’s investigative eye continued to pay dividends when analyzing the three-foot and eight-foot Star Destroyer motion-control models, which had been built for A New Hope and Star Wars: The Empire Strikes Back (1980), respectively. “Our general mantra was, ‘Match your memory of it more than the reality,’ because sometimes you go look at the actual prop in the archive building or you look back at the actual shot from the movie, and you go, ‘Oh, I remember it being a little better than that,’” Knoll conveyed to TheASC.com. This philosophy motivated ILM to combine elements from those two physical models into a single digital design. “Generally, we copied the three-footer for details like the superstructure on the top of the bridge, but then we copied the internal lighting plan from the eight-footer,” Knoll explained. “And then the upper surface of the three-footer was relatively undetailed because there were no shots that saw it closely, so we took a lot of the high-detail upper surface from the eight-footer. So it’s this amalgam of the two models, but the goal was to try to make it look like you remember it from A New Hope.”

A final frame from Rogue One: A Star Wars Story (Credit: ILM & Lucasfilm).

Forming Up the Fleets

In addition to the U.S.S. Enterprise-E, the Battle of Sector 001 debuted numerous vessels representing four new Starfleet ship classes – the Akira, Steamrunner, Saber, and Norway – all designed by ILM visual effects art director Alex Jaeger. “Since we figured a lot of the background action in the space battle would be done with computer graphics ships that needed to be built from scratch anyway, I realized that there was no reason not to do some new designs,” John Knoll told American Cinematographer writer Ron Magid.
Used in previous Star Trek projects, older physical models for the Oberth and Nebula classes were mixed into the fleet for good measure, though the vast majority of the armada originated as computer graphics. Over at Scarif, ILM portrayed the Rebel Alliance forces with computer graphics models of fresh designs (the MC75 cruiser Profundity and U-wings), live-action versions of Star Wars Rebels’ VCX-100 light freighter Ghost and Hammerhead corvettes, and Star Wars staples (Nebulon-B frigates, X-wings, Y-wings, and more). These ships face off against two Imperial Star Destroyers and squadrons of TIE fighters, and – upon their late arrival to the battle – Darth Vader’s Star Destroyer and the Death Star. The Tantive IV, a CR90 corvette more popularly referred to as a blockade runner, made its own special cameo at the tail end of the fight. As Princess Leia Organa’s (Carrie Fisher and Ingvild Deila) personal ship, the Tantive IV received the Death Star plans and fled the scene, destined to be captured by Vader’s Star Destroyer at the beginning of A New Hope.

And, while we’re on the subject of intricate starship maneuvers and space-based choreography… Although the First Contact team could plan visual effects shots with animated storyboards, ILM supplied Gareth Edwards with a next-level virtual viewfinder that allowed the director to select his shots by immersing himself among Rogue One’s ships in real time. “What we wanted to do is give Gareth the opportunity to shoot his space battles and other all-digital scenes the same way he shoots his live-action. Then he could go in with this sort of virtual viewfinder and view the space battle going on, and figure out what the best angle was to shoot those ships from,” senior animation supervisor Hal Hickel described in the Rogue One: Digital Storytelling featurette.
Hickel divulged that the sequence involving the dish array docking with the Death Star was an example of the “spontaneous discovery of great angles,” as the scene was never storyboarded or previsualized.

Visual effects supervisor John Knoll with director Gareth Edwards during production of Rogue One: A Star Wars Story (Credit: ILM & Lucasfilm).

Tough Little Ships

The Federation and Rebel Alliance each deployed “tough little ships” (an endearing description Commander William T. Riker [Jonathan Frakes] bestowed upon the U.S.S. Defiant in First Contact) in their respective conflicts, namely the U.S.S. Defiant from Deep Space Nine and the Tantive IV from A New Hope. VisionArt had already built a CG Defiant for the Deep Space Nine series, but ILM upgraded the model with images gathered from the ship’s three-foot physical model. A similar tactic was taken to bring the Tantive IV into the digital realm for Rogue One. “This was the Blockade Runner. This was the most accurate 1:1 reproduction we could possibly have made,” model supervisor Russell Paul declared to Cinefex’s Joe Fordham. “We did an extensive photo reference shoot and photogrammetry re-creation of the miniature. From there, we built it out as accurately as possible.” Speaking of sturdy ships, if you look very closely, you can spot a model of the Millennium Falcon flashing across the background as the U.S.S. Defiant makes an attack run on the Borg cube at the Battle of Sector 001!

Exploration and Hope

The in-universe ramifications that materialize from the Battles of Sector 001 and Scarif are monumental. The destruction of the Borg cube compels the Borg Queen to travel back in time in an attempt to vanquish Earth before the Federation can even be formed, but Captain Picard and the Enterprise-E foil the plot and end up helping their 21st-century ancestors make “first contact” with another species, the logic-revering Vulcans.
The post-Scarif benefits take longer to play out for the Rebel Alliance, but the theft of the Death Star plans eventually leads to the superweapon’s destruction. The Galactic Civil War is far from over, but Scarif is a significant step in the Alliance’s effort to overthrow the Empire.

The visual effects ILM provided for First Contact and Rogue One contributed significantly to the critical and commercial acclaim both pictures enjoyed, a victory reflecting the relentless dedication, tireless work ethic, and innovative spirit embodied by visual effects supervisor John Knoll and ILM’s entire staff. While being interviewed for The Making of Star Trek: First Contact, actor Patrick Stewart praised ILM’s invaluable influence, emphasizing, “ILM was with us, on this movie, almost every day on set. There is so much that they are involved in.” And, regardless of your personal preferences – phasers or lasers, photon torpedoes or proton torpedoes, warp speed or hyperspace – perhaps Industrial Light & Magic’s ability to infuse excitement into both franchises demonstrates that Star Trek and Star Wars encompass themes that are not competitive, but compatible. After all, what goes together better than exploration and hope?

– Jay Stobie (he/him) is a writer, author, and consultant who has contributed articles to ILM.com, Skysound.com, Star Wars Insider, StarWars.com, Star Trek Explorer, Star Trek Magazine, and StarTrek.com. Jay loves sci-fi, fantasy, and film, and you can learn more about him by visiting JayStobie.com or finding him on Twitter, Instagram, and other social media platforms at @StobiesGalaxy.
  • ‘Star Wars: Starfighter’ Casts Mia Goth as Villain

    In the grand tradition of Darth Vader, the Emperor, and Kylo Ren, meet the latest Star Wars villain: Mia Goth, the star of X, Pearl, and MaXXXine. Goth will appear opposite Ryan Gosling in the recently announced Star Wars: Starfighter, a standalone film that is being directed by Deadpool & Wolverine’s Shawn Levy. According to The Hollywood Reporter, “details on the project are scant, but it does involve Gosling playing a character that must protect a young charge against evil pursuers.” Goth plays one of the “evil pursuers.” They note that she will play “the same role that Mikey Madison had been circling before her deal blew up like a Death Star — over money matters.”
    After many years in cinematic limbo, the Star Wars franchise is finally ramping up film production again. The most recent Star Wars feature, The Rise of Skywalker, opened in theaters way back in 2019. Since then, the series has focused entirely on TV shows for Disney+. Lucasfilm announced some potential film projects during this period, but every single one of them wound up trapped in development hell. In fact, the only Star Wars film that actually made it into production grew out of the TV side of the business: the upcoming The Mandalorian & Grogu, which will continue the story of the popular Disney+ Mandalorian show. That film is expected to open in theaters in May of 2026. Lucasfilm formally announced Starfighter earlier this year. Star Wars: Starfighter is currently scheduled to open in theaters on May 28, 2027.
  • Everything new at Summer Game Fest 2025: Marvel Tōkon, Resident Evil Requiem and more

    It's early June, which means it's time for a ton of video game events! Rising from the ashes of E3, Geoff Keighley's Summer Game Fest is now the premier gaming event of the year, just inching ahead of… Geoff Keighley's Game Awards in December. Unlike the show it replaced, Summer Game Fest is an egalitarian affair, spotlighting games from AAA developers and small indies across a diverse set of livestreams. SGF 2025 includes 15 individual events running from June 3-9 — you can find the full Summer Game Fest 2025 schedule here — and we're smack dab in the middle of that programming right now.
    We're covering SGF 2025 with a small team on the ground in LA and a far larger group of writers tuning in remotely to the various livestreams. Expect game previews, interviews and reactions to arrive over the coming days, and a boatload of new trailers and release date announcements in between.
    Through it all, we're collating the biggest announcements right here, with links out to more in-depth coverage where we have it, in chronological order.
    Tuesday, June 3
    State of Unreal: The Witcher IV and Fortnite AI
    Epic hitched its wagon to SGF this year, aligning its annual developer Unreal Fest conference, which last took place in the fall of 2024, with the consumer event. The conference was held in Orlando, Florida, from June 2-5, with well over a hundred developer sessions focused on Unreal Engine. The highlight was State of Unreal, which was the first event on the official Summer Game Fest schedule. Amid a bunch of very cool tech demos and announcements, we got some meaningful updates on Epic's own Fortnite and CD PROJEKT RED's upcoming The Witcher IV.

    The Witcher IV was first unveiled at The Game Awards last year, and we've heard very little about it since. At State of Unreal, we got a tech demo for Unreal Engine 5.6, played in real time on a base PS5. The roughly 10-minute slot featured a mix of gameplay and cinematics, and showed off a detailed, bustling world. Perhaps the technical highlight was Nanite Foliage, an extension of UE5's Nanite geometry system that renders foliage without the level-of-detail pop-in that remains perhaps the most widespread graphical aberration plaguing games today. On the game side, we saw a town filled with hundreds of NPCs going about their business. The town itself wasn't quite on the scale of The Witcher III's Novigrad City, but it nonetheless felt alive in a way beyond anything the last game achieved.
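For readers wondering what "pop-in" actually is: traditional engines swap between a handful of pre-built meshes based on camera distance, and the visible switch at each threshold is the pop. A minimal sketch of that classic approach (the thresholds and mesh names are illustrative, not Unreal's actual API; Nanite instead refines geometry continuously, so no such switch is visible):

```python
# Classic discrete LOD (level-of-detail) selection: pick one pre-built
# mesh by camera distance. The hard switch at each threshold is what
# players perceive as "pop-in."
LOD_THRESHOLDS = [
    (10.0, "lod0_full_detail"),   # distances in meters, illustrative only
    (50.0, "lod1_reduced"),
    (200.0, "lod2_billboard"),
]

def select_lod(distance):
    """Return the mesh to draw for an object at the given camera distance."""
    for max_dist, mesh in LOD_THRESHOLDS:
        if distance <= max_dist:
            return mesh
    return "culled"  # beyond the last threshold, don't draw at all
```

An object drifting from 49 to 51 meters away snaps from "lod1_reduced" to "lod2_billboard" in a single frame; continuous systems avoid that discontinuity by adjusting triangle density per frame.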
    It's fair to say that Fortnite's moment in the spotlight was… less impressive. Hot on the heels of smooshing a profane Darth Vader AI into the game, Epic announced that creators will be able to roll their own AI NPCs later this year.
    Wednesday, June 4
    PlayStation State of Play: Marvel Tōkon, Silent Hill f and the return of Lumines
    Another company getting a head start on proceedings was Sony, which threw its third State of Play of the year onto the Summer Game Fest schedule a couple of days ahead of the opening night event. It was a packed stream by Sony's standards, with over 20 games and even a surprise hardware announcement.

    The most time was given to Marvel Tōkon: Fighting Souls, a new PlayStation Studios tag fighter that fuses Marvel Superheroes with anime visuals. It's also 4 versus 4, which is wild. It's being developed by Arc System Works, the team perhaps best known for the Guilty Gear series. It's coming to PS5 and PC in 2026. Not-so-coincidentally, Sony also announced Project Defiant, a wireless fight stick that'll support PS5 and PC and arrive in… 2026.
    Elsewhere, we got a parade of release dates, with concrete dates for Sword of the Sea, Baby Steps and Silent Hill f. We also got confirmation of that Final Fantasy Tactics remaster, and an all-new... let's call it aspirational "2026" date for Pragmata, which, if you're keeping score, was advertised alongside the launch of the PS5. Great going, Capcom!

    Rounding out the show was a bunch of smaller announcements. We heard about a new Nioh game, Nioh 3, coming in 2026; Suda51's new weirdness Romeo is a Dead Man; and Lumines Arise, a long-awaited return to the Lumines series from the developer behind Tetris Effect.
    Thursday, June 5
    Diddly squat
    There were absolutely no Summer Game Fest events scheduled on Thursday. We assume that's out of respect for antipodean trees, as June 5 was Arbor Day in New Zealand.
    Friday, June 6
    Summer Game Fest Live: Resident Evil Requiem, Stranger Than Heaven and sequels abound
    It's fair to say that previous Summer Game Fest opening night streams have been… whelming at best. This year's showing was certainly an improvement, not least because there were exponentially fewer mobile game and MMO ads littering the presentation. Yes, folks tracking Gabe Newell's yacht were disappointed that Half-Life 3 didn't show up, and the Silksong crowd remains sad, alone and unloved, but there were nonetheless some huge announcements.

    Perhaps the biggest of all was the "ninth" (Zero and Code Veronica erasure is real) Resident Evil game. Resident Evil Requiem is said to be a tonal shift compared to the last game, Resident Evil Village. Here's hoping it reinvigorates the series in the same way Resident Evil VII did following the disappointing 6.
    We also heard more from Sega studio Ryu Ga Gotoku about Project Century, which seems to be a 1943 take on the Yakuza series. It's now called Stranger Than Heaven, and there's a (literally) jazzy new trailer for your consideration.

    Outside of those big swings, there were sequels to a bunch of mid-sized games, like Atomic Heart, Code Vein and Mortal Shell, and a spiritual sequel of sorts: Scott Pilgrim EX, a beat-em-up that takes the baton from the 2010 Ubisoft brawler Scott Pilgrim vs. the World: The Game.
    There were countless other announcements at the show, including:

    Troy Baker is the big cheese in Mouse: P.I. for Hire
    Here's a silly puppet boxing game you never knew you needed
    Killer Inn turns Werewolf into a multiplayer action game
    Out of Words is a cozy stop-motion co-op adventure from Epic Games
    Lego Voyagers is a co-op puzzle game from the studio behind Builder's Journey
    Mina the Hollower, from the makers of Shovel Knight, arrives on Halloween
    Wu-Tang Clan's new game blends anime with Afro-surrealism

    Day of the Devs: Blighted, Snap & Grab, Relooted and Escape Academy II
    As always, the kickoff show was followed by a Day of the Devs stream, which focused on smaller projects and indie games. You can watch the full stream here.
    Escape Academy has been firmly on our best couch co-op games list for some time, and now it's got a sequel on the way. Escape Academy 2: Back 2 School takes the same basic co-op escape room fun and expands on it, moving away from a level-select map screen and towards a fully 3D school campus for players to explore. So long as the puzzles themselves are as fun as the original, it seems like a winner. 

    Semblance studio Nyamakop is back with a new jam called Relooted, a heist game with a unique twist. As in the real world, museums in the West are full of items plundered from African nations under colonialism. Unlike the real world, in Relooted the colonial powers have signed a treaty to return these items to their places of origin, but things aren't going to plan, as many artifacts are finding their way into private collections. It's your job to steal them back. The British Museum is quaking in its boots.

    Here are some of the other games that caught our eye:

    Snap & Grab is No Goblin's campy, photography-based heist game
    Please, Watch the Artwork is a puzzle game with eerie paintings and a sad clown
    Bask in the grotesque pixel-art beauty of Neverway
    Pocket Boss turns corporate data manipulation into a puzzle game
    Tire Boy is a wacky open-world adventure game you can tread all over

    The rest: Ball x Pit, Hitman and 007 First Light

    After Day of the Devs came Devolver. Its Summer Game Fest show was a little more muted than usual, focusing on a single game: Ball x Pit. It's the next game from Kenny Sun, an indie developer who previously made the sleeper hit Mr. Sun's Hatbox. Ball x Pit is being made by a team of more than half a dozen devs, in contrast to Sun's mostly solo prior works. It looks like an interesting mashup of Breakout and base-building mechanics, and there's a demo on Steam available right now.

    Then came IOI, the makers of Hitman, who put together a classic E3-style cringefest, full of awkward pauses, ill-paced demos and repetitive trailers. Honestly, as someone who's been watching game company presentations for two decades or so, it was a nice moment of nostalgia. 
    Away from the marvel of a presenter trying to cope with everything going wrong, the show did have some actual content, with an extended demo of the new James Bond-themed Hitman mission, an announcement that Hitman is coming to iOS and tabletop, and a presentation on MindsEye, a game from former GTA producer Leslie Benzies that IOI is publishing.
    Saturday-Sunday: Xbox and much, much more
    Now you're all caught up. We're expecting a lot of news this weekend, mostly from Xbox on Sunday. We'll be updating this article through the weekend and beyond, but you can find the latest announcements from Summer Game Fest 2025 on our front page. This article originally appeared on Engadget at https://www.engadget.com/gaming/everything-new-at-summer-game-fest-2025-marvel-tokon-resident-evil-requiem-and-more-185425995.html
  • The Orb Will See You Now

    Once again, Sam Altman wants to show you the future. The CEO of OpenAI is standing on a sparse stage in San Francisco, preparing to reveal his next move to an attentive crowd. “We needed some way for identifying, authenticating humans in the age of AGI,” Altman explains, referring to artificial general intelligence. “We wanted a way to make sure that humans stayed special and central.” The solution Altman came up with is looming behind him. It’s a white sphere about the size of a beach ball, with a camera at its center. The company that makes it, known as Tools for Humanity, calls this mysterious device the Orb. Stare into the heart of the plastic-and-silicon globe and it will map the unique furrows and ciliary zones of your iris. Seconds later, you’ll receive inviolable proof of your humanity: a 12,800-digit binary number, known as an iris code, sent to an app on your phone. At the same time, a packet of cryptocurrency called Worldcoin, worth approximately will be transferred to your digital wallet—your reward for becoming a “verified human.” Altman co-founded Tools for Humanity in 2019 as part of a suite of companies he believed would reshape the world. Once the tech he was developing at OpenAI passed a certain level of intelligence, he reasoned, it would mark the end of one era on the Internet and the beginning of another, in which AI became so advanced, so human-like, that you would no longer be able to tell whether what you read, saw, or heard online came from a real person. When that happened, Altman imagined, we would need a new kind of online infrastructure: a human-verification layer for the Internet, to distinguish real people from the proliferating number of bots and AI “agents.” And so Tools for Humanity set out to build a global “proof-of-humanity” network. It aims to verify 50 million people by the end of 2025; ultimately its goal is to sign up every single human being on the planet.
The free crypto serves as both an incentive for users to sign up, and also an entry point into what the company hopes will become the world’s largest financial network, through which it believes “double-digit percentages of the global economy” will eventually flow. Even for Altman, these missions are audacious. “If this really works, it’s like a fundamental piece of infrastructure for the world,” Altman tells TIME in a video interview from the passenger seat of a car a few days before his April 30 keynote address.
Internal hardware of the Orb in mid-assembly in March. Davide Monteleone for TIME
The project’s goal is to solve a problem partly of Altman’s own making. In the near future, he and other tech leaders say, advanced AIs will be imbued with agency: the ability to not just respond to human prompting, but to take actions independently in the world. This will enable the creation of AI coworkers that can drop into your company and begin solving problems; AI tutors that can adapt their teaching style to students’ preferences; even AI doctors that can diagnose routine cases and handle scheduling or logistics. The arrival of these virtual agents, their venture capitalist backers predict, will turbocharge our productivity and unleash an age of material abundance. But AI agents will also have cascading consequences for the human experience online. “As AI systems become harder to distinguish from people, websites may face difficult trade-offs,” says a recent paper by researchers from 25 different universities, nonprofits, and tech companies, including OpenAI. “There is a significant risk that digital institutions will be unprepared for a time when AI-powered agents, including those leveraged by malicious actors, overwhelm other activity online.” On social-media platforms like X and Facebook, bot-driven accounts are amassing billions of views on AI-generated content.
In April, the foundation that runs Wikipedia disclosed that AI bots scraping their site were making the encyclopedia too costly to sustainably run. Later the same month, researchers from the University of Zurich found that AI-generated comments on the subreddit /r/ChangeMyView were up to six times more successful than human-written ones at persuading unknowing users to change their minds.
Photograph by Davide Monteleone for TIME
The arrival of agents won’t only threaten our ability to distinguish between authentic and AI content online. It will also challenge the Internet’s core business model, online advertising, which relies on the assumption that ads are being viewed by humans. “The Internet will change very drastically sometime in the next 12 to 24 months,” says Tools for Humanity CEO Alex Blania. “So we have to succeed, or I’m not sure what else would happen.” For four years, Blania’s team has been testing the Orb’s hardware abroad. Now the U.S. rollout has arrived. Over the next 12 months, 7,500 Orbs will be arriving in dozens of American cities, in locations like gas stations, bodegas, and flagship stores in Los Angeles, Austin, and Miami. The project’s founders and fans hope the Orb’s U.S. debut will kickstart a new phase of growth. The San Francisco keynote was titled: “At Last.” It’s not clear the public appetite matches the exultant branding. Tools for Humanity has “verified” just 12 million humans since mid-2023, a pace Blania concedes is well behind schedule. Few online platforms currently support the so-called “World ID” that the Orb bestows upon its visitors, leaving little to entice users to give up their biometrics beyond the lure of free crypto. Even Altman isn’t sure whether the whole thing can work. “I can see this becomes a fairly mainstream thing in a few years,” he says.
“Or I can see that it’s still only used by a small subset of people who think about the world in a certain way.”
Blania and Altman debut the Orb at World’s U.S. launch in San Francisco on April 30, 2025. Jason Henry—The New York Times/Redux
Yet as the Internet becomes overrun with AI, the creators of this strange new piece of hardware are betting that everybody in the world will soon want—or need—to visit an Orb. The biometric code it creates, they predict, will become a new type of digital passport, without which you might be denied passage to the Internet of the future, from dating apps to government services. In a best-case scenario, World ID could be a privacy-preserving way to fortify the Internet against an AI-driven deluge of fake or deceptive content. It could also enable the distribution of universal basic income—a policy that Altman has previously touted—as AI automation transforms the global economy. To examine what this new technology might mean, I reported from three continents, interviewed 10 Tools for Humanity executives and investors, reviewed hundreds of pages of company documents, and “verified” my own humanity. The Internet will inevitably need some kind of proof-of-humanity system in the near future, says Divya Siddarth, founder of the nonprofit Collective Intelligence Project. The real question, she argues, is whether such a system will be centralized—“a big security nightmare that enables a lot of surveillance”—or privacy-preserving, as the Orb claims to be. Questions remain about Tools for Humanity’s corporate structure, its yoking to an unstable cryptocurrency, and what power it would concentrate in the hands of its owners if successful. Yet it’s also one of the only attempts to solve what many see as an increasingly urgent problem. “There are some issues with it,” Siddarth says of World ID. “But you can’t preserve the Internet in amber.
Something in this direction is necessary.”
In March, I met Blania at Tools for Humanity’s San Francisco headquarters, where a large screen displays the number of weekly “Orb verifications” by country. A few days earlier, the CEO had attended a million-per-head dinner at Mar-a-Lago with President Donald Trump, whom he credits with clearing the way for the company’s U.S. launch by relaxing crypto regulations. “Given Sam is a very high profile target,” Blania says, “we just decided that we would let other companies fight that fight, and enter the U.S. once the air is clear.” As a kid growing up in Germany, Blania was a little different than his peers. “Other kids were, like, drinking a lot, or doing a lot of parties, and I was just building a lot of things that could potentially blow up,” he recalls. At the California Institute of Technology, where he was pursuing research for a master’s degree, he spent many evenings reading the blogs of startup gurus like Paul Graham and Altman. Then, in 2019, Blania received an email from Max Novendstern, an entrepreneur who had been kicking around a concept with Altman to build a global cryptocurrency network. They were looking for technical minds to help with the project. Over cappuccinos, Altman told Blania he was certain about three things. First, smarter-than-human AI was not only possible, but inevitable—and it would soon mean you could no longer assume that anything you read, saw, or heard on the Internet was human-created. Second, cryptocurrency and other decentralized technologies would be a massive force for change in the world. And third, scale was essential to any crypto network’s value.
The Orb is tested on a calibration rig, surrounded by checkerboard targets to ensure precision in iris detection. Davide Monteleone for TIME
The goal of Worldcoin, as the project was initially called, was to combine those three insights. Altman took a lesson from PayPal, the company co-founded by his mentor Peter Thiel.
Of its initial funding, PayPal spent less than million actually building its app—but pumped an additional million or so into a referral program, whereby new users and the person who invited them would each receive in credit. The referral program helped make PayPal a leading payment platform. Altman thought a version of that strategy would propel Worldcoin to similar heights. He wanted to create a new cryptocurrency and give it to users as a reward for signing up. The more people who joined the system, the higher the token’s value would theoretically rise. Since 2019, the project has raised million from investors like Coinbase and the venture capital firm Andreessen Horowitz. That money paid for the million cost of designing the Orb, plus maintaining the software it runs on. The total market value of all Worldcoins in existence, however, is far higher—around billion. That number is a bit misleading: most of those coins are not in circulation and Worldcoin’s price has fluctuated wildly. Still, it allows the company to reward users for signing up at no cost to itself. The main lure for investors is the crypto upside. Some 75% of all Worldcoins are set aside for humans to claim when they sign up, or as referral bonuses. The remaining 25% are split between Tools for Humanity’s backers and staff, including Blania and Altman. “I’m really excited to make a lot of money,” Blania says. From the beginning, Altman was thinking about the consequences of the AI revolution he intended to unleash. A future in which advanced AI could perform most tasks more effectively than humans would bring a wave of unemployment and economic dislocation, he reasoned. Some kind of wealth redistribution might be necessary. In 2016, he partially funded a study of basic income, which gave per-month handouts to low-income individuals in Illinois and Texas. But there was no single financial system that would allow money to be sent to everybody in the world. 
Nor was there a way to stop an individual human from claiming their share twice—or to identify a sophisticated AI pretending to be human and pocketing some cash of its own. In 2023, Tools for Humanity raised the possibility of using the network to redistribute the profits of AI labs that were able to automate human labor. “As AI advances,” it said, “fairly distributing access and some of the created value through UBI will play an increasingly vital role in counteracting the concentration of economic power.” Blania was taken by the pitch, and agreed to join the project as a co-founder. “Most people told us we were very stupid or crazy or insane, including Silicon Valley investors,” Blania says. At least until ChatGPT came out in 2022, transforming OpenAI into one of the world’s most famous tech companies and kickstarting a market bull run. “Things suddenly started to make more and more sense to the external world,” Blania says of the vision to develop a global “proof-of-humanity” network. “You have to imagine a world in which you will have very smart and competent systems somehow flying through the Internet with different goals and ideas of what they want to do, and us having no idea anymore what we’re dealing with.” After our interview, Blania’s head of communications ushers me over to a circular wooden structure where eight Orbs face one another. The scene feels like a cross between an Apple Store and a ceremonial altar. “Do you want to get verified?” she asks. Putting aside my reservations for the purposes of research, I download the World App and follow its prompts. I flash a QR code at the Orb, then gaze into it. A minute or so later, my phone buzzes with confirmation: I’ve been issued my own personal World ID and some Worldcoin. The first thing the Orb does is check if you’re human, using a neural network that takes input from various sensors, including an infrared camera and a thermometer. 
Davide Monteleone for TIME. While I stared into the Orb, several complex procedures had taken place at once. A neural network took inputs from multiple sensors—an infrared camera, a thermometer—to confirm I was a living human. Simultaneously, a telephoto lens zoomed in on my iris, capturing the physical traits within that distinguish me from every other human on Earth. It then converted that image into an iris code: a numerical abstraction of my unique biometric data. Then the Orb checked to see if my iris code matched any it had seen before, using a technique allowing encrypted data to be compared without revealing the underlying information. Before the Orb deleted my data, it turned my iris code into several derivative codes—none of which on its own can be linked back to the original—encrypted them, deleted the only copies of the decryption keys, and sent each one to a different secure server, so that future users’ iris codes can be checked for uniqueness against mine. If I were to use my World ID to access a website, that site would learn nothing about me except that I’m human. The Orb is open-source, so outside experts can examine its code and verify the company’s privacy claims. “I did a colonoscopy on this company and these technologies before I agreed to join,” says Trevor Traina, a Trump donor and former U.S. ambassador to Austria who now serves as Tools for Humanity’s chief business officer. “It is the most privacy-preserving technology on the planet.” Only weeks later, when researching what would happen if I wanted to delete my data, do I discover that Tools for Humanity’s privacy claims rest on what feels like a sleight of hand. The company argues that in modifying your iris code, it has “effectively anonymized” your biometric data. If you ask Tools for Humanity to delete your iris codes, they will delete the one stored on your phone, but not the derivatives. Those, they argue, are no longer your personal data at all. 
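The derivative codes at the center of this dispute can be sketched in miniature. Everything below is illustrative, not World's actual design: the function names are invented, a hash stands in for real iris feature extraction, and plain XOR secret sharing stands in for the company's unspecified encrypted-comparison scheme. The sketch only shows the property the article describes: each derivative alone reveals nothing, yet together they still pin down the original code.

```python
import hashlib
import secrets
from functools import reduce

def iris_code_from_scan(scan: bytes) -> list[int]:
    # Stand-in for the Orb's feature extraction: map a raw scan to a
    # fixed-length bit vector. (A real iris code encodes texture
    # features; a hash is used here only to get deterministic bits.)
    digest = hashlib.sha256(scan).digest()
    return [(byte >> i) & 1 for byte in digest for i in range(8)]

def split_into_shares(code: list[int], n: int = 3) -> list[list[int]]:
    # XOR secret sharing: each share on its own is uniformly random,
    # so no single server can recover the iris code, mirroring the
    # "derivative codes" sent to different secure servers.
    shares = [[secrets.randbelow(2) for _ in code] for _ in range(n - 1)]
    last = [reduce(lambda a, b: a ^ b, bits, bit)
            for bit, bits in zip(code, zip(*shares))]
    return shares + [last]

def recombine(shares: list[list[int]]) -> list[int]:
    # XORing all shares reconstructs the code. This is shown only to
    # verify the sharing; the deployed system is said to compare
    # shares without ever reconstructing the original.
    return [reduce(lambda a, b: a ^ b, bits) for bits in zip(*shares)]
```

This also makes the deletion dispute concrete: deleting the code on your phone does not remove the shares, and the shares together remain sufficient to recognize a returning iris.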
But if I were to return to an Orb after deleting my data, it would still recognize those codes as uniquely mine. Once you look into the Orb, a piece of your identity remains in the system forever. If users could truly delete that data, the premise of one ID per human would collapse, Tools for Humanity’s chief privacy officer Damien Kieran tells me when I call seeking an explanation. People could delete and sign up for new World IDs after being suspended from a platform. Or claim their Worldcoin tokens, sell them, delete their data, and cash in again. This argument fell flat with European Union regulators in Germany, who recently declared that the Orb posed “fundamental data protection issues” and ordered the company to allow European users to fully delete even their anonymized data. “Just like any other technology service, users cannot delete data that is not personal data,” Kieran said in a statement. “If a person could delete anonymized data that can’t be linked to them by World or any third party, it would allow bad actors to circumvent the security and safety that World ID is working to bring to every human.” On a balmy afternoon this spring, I climb a flight of stairs up to a room above a restaurant in an outer suburb of Seoul. Five elderly South Koreans tap on their phones as they wait to be “verified” by the two Orbs in the center of the room. “We don’t really know how to distinguish between AI and humans anymore,” an attendant in a company t-shirt explains in Korean, gesturing toward the spheres. “We need a way to verify that we’re human and not AI. So how do we do that? Well, humans have irises, but AI doesn’t.” The attendant ushers an elderly woman over to an Orb. It bleeps. “Open your eyes,” a disembodied voice says in English. The woman stares into the camera. Seconds later, she checks her phone and sees that a packet of Worldcoin worth 75,000 Korean won has landed in her digital wallet. Congratulations, the app tells her. 
You are now a verified human. A visitor views the Orbs in Seoul on April 14, 2025. Taemin Ha for TIME. Tools for Humanity aims to “verify” 1 million Koreans over the next year. Taemin Ha for TIME. A couple dozen Orbs have been available in South Korea since 2023, verifying roughly 55,000 people. Now Tools for Humanity is redoubling its efforts there. At an event in a traditional wooden hanok house in central Seoul, an executive announces that 250 Orbs will soon be dispersed around the country—with the aim of verifying 1 million Koreans in the next 12 months. South Korea has high levels of smartphone usage, crypto and AI adoption, and Internet access, while average wages are modest enough for the free Worldcoin on offer to still be an enticing draw—all of which makes it fertile testing ground for the company’s ambitious global expansion. Yet things seem off to a slow start. In a retail space I visited in central Seoul, Tools for Humanity had constructed a wooden structure with eight Orbs facing each other. Locals and tourists wander past looking bemused; few volunteer themselves. Most who do tell me they are crypto enthusiasts who came intentionally, driven more by the spirit of early adoption than the free coins. The next day, I visit a coffee shop in central Seoul where a chrome Orb sits unassumingly in one corner. Wu Ruijun, a 20-year-old student from China, strikes up a conversation with the barista, who doubles as the Orb’s operator. Wu was invited here by a friend who said both could claim free cryptocurrency if he signed up. The barista speeds him through the process. Wu accepts the privacy disclosure without reading it, and widens his eyes for the Orb. Soon he’s verified. “I wasn’t told anything about the privacy policy,” he says on his way out. “I just came for the money.” As Altman’s car winds through San Francisco, I ask about the vision he laid out in 2019: that AI would make it harder for us to trust each other online. 
To my surprise, he rejects the framing. “I’m much more like: what is the good we can create, rather than the bad we can stop?” he says. “It’s not like, ‘Oh, we’ve got to avoid the bot overrun’ or whatever. It’s just that we can do a lot of special things for humans.” It’s an answer that may reflect how his role has changed over the years. Altman is now the chief public cheerleader of a billion company that’s touting the transformative utility of AI agents. The rise of agents, he and others say, will be a boon for our quality of life—like having an assistant on hand who can answer your most pressing questions, carry out mundane tasks, and help you develop new skills. It’s an optimistic vision that may well pan out. But it doesn’t quite fit with the prophecies of AI-enabled infopocalypse that Tools for Humanity was founded upon. Altman waves away a question about the influence he and other investors stand to gain if their vision is realized. Most holders, he assumes, will have already started selling their tokens—too early, he adds. “What I think would be bad is if an early crew had a lot of control over the protocol,” he says, “and that’s where I think the commitment to decentralization is so cool.” Altman is referring to the World Protocol, the underlying technology upon which the Orb, Worldcoin, and World ID all rely. Tools for Humanity is developing it, but has committed to giving control to its users over time—a process they say will prevent power from being concentrated in the hands of a few executives or investors. Tools for Humanity would remain a for-profit company, and could levy fees on platforms that use World ID, but other companies would be able to compete for customers by building alternative apps—or even alternative Orbs. 
The plan draws on ideas that animated the crypto ecosystem in the late 2010s and early 2020s, when evangelists for emerging blockchain technologies argued that the centralization of power—especially in large so-called “Web 2.0” tech companies—was responsible for many of the problems plaguing the modern Internet. Just as decentralized cryptocurrencies could reform a financial system controlled by economic elites, so too would it be possible to create decentralized organizations, run by their members instead of CEOs. How such a system might work in practice remains unclear. “Building a community-based governance system,” Tools for Humanity says in a 2023 white paper, “represents perhaps the most formidable challenge of the entire project.” Altman has a pattern of making idealistic promises that shift over time. He founded OpenAI as a nonprofit in 2015, with a mission to develop AGI safely and for the benefit of all humanity. To raise money, OpenAI restructured itself as a for-profit company in 2019, but with overall control still in the hands of its nonprofit board. Last year, Altman proposed yet another restructure—one that would dilute the board’s control and allow more profits to flow to shareholders. Why, I ask, should the public trust Tools for Humanity’s commitment to freely surrender influence and power? “I think you will just see the continued decentralization via the protocol,” he says. “The value here is going to live in the network, and the network will be owned and governed by a lot of people.” Altman talks less about universal basic income these days. He recently mused about an alternative, which he called “universal basic compute.” Instead of AI companies redistributing their profits, he seemed to suggest, they could instead give everyone in the world fair access to super-powerful AI. Blania tells me he recently “made the decision to stop talking” about UBI at Tools for Humanity. “UBI is one potential answer,” he says. 
“Just giving access to the latest models and having them learn faster and better is another.” Says Altman: “I still don’t know what the right answer is. I believe we should do a better job of distribution of resources than we currently do.” When I probe the question of why people should trust him, Altman gets irritated. “I understand that you hate AI, and that’s fine,” he says. “If you want to frame it as the downside of AI is that there’s going to be a proliferation of very convincing AI systems that are pretending to be human, and we need ways to know what is really human-authorized versus not, then yeah, I think you can call that a downside of AI. It’s not how I would naturally frame it.” The phrase human-authorized hints at a tension between World ID and OpenAI’s plans for AI agents. An Internet where a World ID is required to access most services might impede the usefulness of the agents that OpenAI and others are developing. So Tools for Humanity is building a system that would allow users to delegate their World ID to an agent, allowing the bot to take actions online on their behalf, according to Tiago Sada, the company’s chief product officer. “We’ve built everything in a way that can be very easily delegatable to an agent,” Sada says. It’s a measure that would allow humans to be held accountable for the actions of their AIs. But it suggests that Tools for Humanity’s mission may be shifting beyond simply proving humanity, and toward becoming the infrastructure that enables AI agents to proliferate with human authorization. World ID doesn’t tell you whether a piece of content is AI-generated or human-generated; all it tells you is whether the account that posted it is a human or a bot. Even in a world where everybody had a World ID, our online spaces might still be filled with AI-generated text, images, and videos. As I say goodbye to Altman, I’m left feeling conflicted about his project. 
If the Internet is going to be transformed by AI agents, then some kind of proof-of-humanity system will almost certainly be necessary. Yet if the Orb becomes a piece of Internet infrastructure, it could give Altman—a beneficiary of the proliferation of AI content—significant influence over a leading defense mechanism against it. People might have no choice but to participate in the network in order to access social media or online services. I thought of an encounter I witnessed in Seoul. In the room above the restaurant, Cho Jeong-yeon, 75, watched her friend get verified by an Orb. Cho had been invited to do the same, but demurred. The reward wasn’t enough for her to surrender a part of her identity. “Your iris is uniquely yours, and we don’t really know how it might be used,” she says. “Seeing the machine made me think: are we becoming machines instead of humans now? Everything is changing, and we don’t know how it’ll all turn out.” —With reporting by Stephen Kim/Seoul. This story was supported by Tarbell Grants. Correction, May 30: The original version of this story misstated the market capitalization of Worldcoin if all coins were in circulation. It is billion, not billion.
    The Orb Will See You Now
    Once again, Sam Altman wants to show you the future. The CEO of OpenAI is standing on a sparse stage in San Francisco, preparing to reveal his next move to an attentive crowd. “We needed some way for identifying, authenticating humans in the age of AGI,” Altman explains, referring to artificial general intelligence. “We wanted a way to make sure that humans stayed special and central.” The solution Altman came up with is looming behind him. It’s a white sphere about the size of a beach ball, with a camera at its center. The company that makes it, known as Tools for Humanity, calls this mysterious device the Orb. Stare into the heart of the plastic-and-silicon globe and it will map the unique furrows and ciliary zones of your iris. Seconds later, you’ll receive inviolable proof of your humanity: a 12,800-digit binary number, known as an iris code, sent to an app on your phone. At the same time, a packet of cryptocurrency called Worldcoin, worth approximately will be transferred to your digital wallet—your reward for becoming a “verified human.” Altman co-founded Tools for Humanity in 2019 as part of a suite of companies he believed would reshape the world. Once the tech he was developing at OpenAI passed a certain level of intelligence, he reasoned, it would mark the end of one era on the Internet and the beginning of another, in which AI became so advanced, so human-like, that you would no longer be able to tell whether what you read, saw, or heard online came from a real person. When that happened, Altman imagined, we would need a new kind of online infrastructure: a human-verification layer for the Internet, to distinguish real people from the proliferating number of bots and AI “agents.” And so Tools for Humanity set out to build a global “proof-of-humanity” network. It aims to verify 50 million people by the end of 2025; ultimately its goal is to sign up every single human being on the planet. 
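The iris code described above is, at heart, a long bit string, and the iris-recognition literature generally compares such strings by fractional Hamming distance: two scans of the same eye differ in few positions, while codes from different eyes differ in roughly half. A minimal sketch of that comparison follows; the function names and the threshold are illustrative, and nothing here is claimed to match World's actual matching pipeline.

```python
def hamming_fraction(code_a: list[int], code_b: list[int]) -> float:
    # Fraction of bit positions where two equal-length iris codes differ.
    if len(code_a) != len(code_b):
        raise ValueError("iris codes must be the same length")
    differing = sum(a != b for a, b in zip(code_a, code_b))
    return differing / len(code_a)

# Illustrative threshold: classic iris-recognition work treats distances
# below roughly 0.32 as a same-eye match; World's actual cutoff is not public.
MATCH_THRESHOLD = 0.32

def same_eye(code_a: list[int], code_b: list[int]) -> bool:
    # Two codes are judged to come from the same iris when they
    # disagree in fewer than MATCH_THRESHOLD of their positions.
    return hamming_fraction(code_a, code_b) < MATCH_THRESHOLD
```

With 12,800 bits, random chance alone makes two different eyes extraordinarily unlikely to fall under such a threshold, which is what lets a single number serve as proof of a unique human.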
The free crypto serves as both an incentive for users to sign up and an entry point into what the company hopes will become the world’s largest financial network, through which it believes “double-digit percentages of the global economy” will eventually flow. Even for Altman, these missions are audacious. “If this really works, it’s like a fundamental piece of infrastructure for the world,” Altman tells TIME in a video interview from the passenger seat of a car a few days before his April 30 keynote address. Internal hardware of the Orb in mid-assembly in March. Davide Monteleone for TIME. The project’s goal is to solve a problem partly of Altman’s own making. In the near future, he and other tech leaders say, advanced AIs will be imbued with agency: the ability to not just respond to human prompting, but to take actions independently in the world. This will enable the creation of AI coworkers that can drop into your company and begin solving problems; AI tutors that can adapt their teaching style to students’ preferences; even AI doctors that can diagnose routine cases and handle scheduling or logistics. The arrival of these virtual agents, their venture capitalist backers predict, will turbocharge our productivity and unleash an age of material abundance. But AI agents will also have cascading consequences for the human experience online. “As AI systems become harder to distinguish from people, websites may face difficult trade-offs,” says a recent paper by researchers from 25 different universities, nonprofits, and tech companies, including OpenAI. “There is a significant risk that digital institutions will be unprepared for a time when AI-powered agents, including those leveraged by malicious actors, overwhelm other activity online.” On social-media platforms like X and Facebook, bot-driven accounts are amassing billions of views on AI-generated content. 
In April, the foundation that runs Wikipedia disclosed that AI bots scraping its site were making the encyclopedia too costly to sustainably run. Later the same month, researchers from the University of Zurich found that AI-generated comments on the subreddit /r/ChangeMyView were up to six times more successful than human-written ones at persuading unknowing users to change their minds. Photograph by Davide Monteleone for TIME. The arrival of agents won’t only threaten our ability to distinguish between authentic and AI content online. It will also challenge the Internet’s core business model, online advertising, which relies on the assumption that ads are being viewed by humans. “The Internet will change very drastically sometime in the next 12 to 24 months,” says Tools for Humanity CEO Alex Blania. “So we have to succeed, or I’m not sure what else would happen.” For four years, Blania’s team has been testing the Orb’s hardware abroad. Now the U.S. rollout has arrived. Over the next 12 months, 7,500 Orbs will be arriving in dozens of American cities, in locations like gas stations, bodegas, and flagship stores in Los Angeles, Austin, and Miami. The project’s founders and fans hope the Orb’s U.S. debut will kickstart a new phase of growth. The San Francisco keynote was titled: “At Last.” It’s not clear the public appetite matches the exultant branding. Tools for Humanity has “verified” just 12 million humans since mid-2023, a pace Blania concedes is well behind schedule. Few online platforms currently support the so-called “World ID” that the Orb bestows upon its visitors, leaving little to entice users to give up their biometrics beyond the lure of free crypto. Even Altman isn’t sure whether the whole thing can work. “I can see this becomes a fairly mainstream thing in a few years,” he says. 
To my surprise, he rejects the framing. “I’m much morelike: what is the good we can create, rather than the bad we can stop?” he says. “It’s not like, ‘Oh, we’ve got to avoid the bot overrun’ or whatever. It’s just that we can do a lot of special things for humans.” It’s an answer that may reflect how his role has changed over the years. Altman is now the chief public cheerleader of a billion company that’s touting the transformative utility of AI agents. The rise of agents, he and others say, will be a boon for our quality of life—like having an assistant on hand who can answer your most pressing questions, carry out mundane tasks, and help you develop new skills. It’s an optimistic vision that may well pan out. But it doesn’t quite fit with the prophecies of AI-enabled infopocalypse that Tools for Humanity was founded upon.Altman waves away a question about the influence he and other investors stand to gain if their vision is realized. Most holders, he assumes, will have already started selling their tokens—too early, he adds. “What I think would be bad is if an early crew had a lot of control over the protocol,” he says, “and that’s where I think the commitment to decentralization is so cool.” Altman is referring to the World Protocol, the underlying technology upon which the Orb, Worldcoin, and World ID all rely. Tools for Humanity is developing it, but has committed to giving control to its users over time—a process they say will prevent power from being concentrated in the hands of a few executives or investors. Tools for Humanity would remain a for-profit company, and could levy fees on platforms that use World ID, but other companies would be able to compete for customers by building alternative apps—or even alternative Orbs. 
The plan draws on ideas that animated the crypto ecosystem in the late 2010s and early 2020s, when evangelists for emerging blockchain technologies argued that the centralization of power—especially in large so-called “Web 2.0” tech companies—was responsible for many of the problems plaguing the modern Internet. Just as decentralized cryptocurrencies could reform a financial system controlled by economic elites, so too would it be possible to create decentralized organizations, run by their members instead of CEOs. How such a system might work in practice remains unclear. “Building a community-based governance system,” Tools for Humanity says in a 2023 white paper, “represents perhaps the most formidable challenge of the entire project.”Altman has a pattern of making idealistic promises that shift over time. He founded OpenAI as a nonprofit in 2015, with a mission to develop AGI safely and for the benefit of all humanity. To raise money, OpenAI restructured itself as a for-profit company in 2019, but with overall control still in the hands of its nonprofit board. Last year, Altman proposed yet another restructure—one which would dilute the board’s control and allow more profits to flow to shareholders. Why, I ask, should the public trust Tools for Humanity’s commitment to freely surrender influence and power? “I think you will just see the continued decentralization via the protocol,” he says. “The value here is going to live in the network, and the network will be owned and governed by a lot of people.” Altman talks less about universal basic income these days. He recently mused about an alternative, which he called “universal basic compute.” Instead of AI companies redistributing their profits, he seemed to suggest, they could instead give everyone in the world fair access to super-powerful AI. Blania tells me he recently “made the decision to stop talking” about UBI at Tools for Humanity. “UBI is one potential answer,” he says. 
“Just givingaccess to the latestmodels and having them learn faster and better is another.” Says Altman: “I still don’t know what the right answer is. I believe we should do a better job of distribution of resources than we currently do.” When I probe the question of why people should trust him, Altman gets irritated. “I understand that you hate AI, and that’s fine,” he says. “If you want to frame it as the downside of AI is that there’s going to be a proliferation of very convincing AI systems that are pretending to be human, and we need ways to know what is really human-authorized versus not, then yeah, I think you can call that a downside of AI. It’s not how I would naturally frame it.” The phrase human-authorized hints at a tension between World ID and OpenAI’s plans for AI agents. An Internet where a World ID is required to access most services might impede the usefulness of the agents that OpenAI and others are developing. So Tools for Humanity is building a system that would allow users to delegate their World ID to an agent, allowing the bot to take actions online on their behalf, according to Tiago Sada, the company’s chief product officer. “We’ve built everything in a way that can be very easily delegatable to an agent,” Sada says. It’s a measure that would allow humans to be held accountable for the actions of their AIs. But it suggests that Tools for Humanity’s mission may be shifting beyond simply proving humanity, and toward becoming the infrastructure that enables AI agents to proliferate with human authorization. World ID doesn’t tell you whether a piece of content is AI-generated or human-generated; all it tells you is whether the account that posted it is a human or a bot. Even in a world where everybody had a World ID, our online spaces might still be filled with AI-generated text, images, and videos.As I say goodbye to Altman, I’m left feeling conflicted about his project. 
If the Internet is going to be transformed by AI agents, then some kind of proof-of-humanity system will almost certainly be necessary. Yet if the Orb becomes a piece of Internet infrastructure, it could give Altman—a beneficiary of the proliferation of AI content—significant influence over a leading defense mechanism against it. People might have no choice but to participate in the network in order to access social media or online services.I thought of an encounter I witnessed in Seoul. In the room above the restaurant, Cho Jeong-yeon, 75, watched her friend get verified by an Orb. Cho had been invited to do the same, but demurred. The reward wasn’t enough for her to surrender a part of her identity. “Your iris is uniquely yours, and we don’t really know how it might be used,” she says. “Seeing the machine made me think: are we becoming machines instead of humans now? Everything is changing, and we don’t know how it’ll all turn out.”—With reporting by Stephen Kim/Seoul. This story was supported by Tarbell Grants.Correction, May 30The original version of this story misstated the market capitalization of Worldcoin if all coins were in circulation. It is billion, not billion. #orb #will #see #you #now
    TIME.COM
    The Orb Will See You Now
Once again, Sam Altman wants to show you the future. The CEO of OpenAI is standing on a sparse stage in San Francisco, preparing to reveal his next move to an attentive crowd. “We needed some way for identifying, authenticating humans in the age of AGI,” Altman explains, referring to artificial general intelligence. “We wanted a way to make sure that humans stayed special and central.” The solution Altman came up with is looming behind him. It’s a white sphere about the size of a beach ball, with a camera at its center. The company that makes it, known as Tools for Humanity, calls this mysterious device the Orb. Stare into the heart of the plastic-and-silicon globe and it will map the unique furrows and ciliary zones of your iris. Seconds later, you’ll receive inviolable proof of your humanity: a 12,800-digit binary number, known as an iris code, sent to an app on your phone. At the same time, a packet of cryptocurrency called Worldcoin, worth approximately $42, will be transferred to your digital wallet—your reward for becoming a “verified human.”

Altman co-founded Tools for Humanity in 2019 as part of a suite of companies he believed would reshape the world. Once the tech he was developing at OpenAI passed a certain level of intelligence, he reasoned, it would mark the end of one era on the Internet and the beginning of another, in which AI became so advanced, so human-like, that you would no longer be able to tell whether what you read, saw, or heard online came from a real person. When that happened, Altman imagined, we would need a new kind of online infrastructure: a human-verification layer for the Internet, to distinguish real people from the proliferating number of bots and AI “agents.”

And so Tools for Humanity set out to build a global “proof-of-humanity” network. It aims to verify 50 million people by the end of 2025; ultimately its goal is to sign up every single human being on the planet.
The free crypto serves as both an incentive for users to sign up and an entry point into what the company hopes will become the world’s largest financial network, through which it believes “double-digit percentages of the global economy” will eventually flow. Even for Altman, these missions are audacious. “If this really works, it’s like a fundamental piece of infrastructure for the world,” Altman tells TIME in a video interview from the passenger seat of a car a few days before his April 30 keynote address.

Internal hardware of the Orb in mid-assembly in March. Davide Monteleone for TIME

The project’s goal is to solve a problem partly of Altman’s own making. In the near future, he and other tech leaders say, advanced AIs will be imbued with agency: the ability to not just respond to human prompting, but to take actions independently in the world. This will enable the creation of AI coworkers that can drop into your company and begin solving problems; AI tutors that can adapt their teaching style to students’ preferences; even AI doctors that can diagnose routine cases and handle scheduling or logistics. The arrival of these virtual agents, their venture capitalist backers predict, will turbocharge our productivity and unleash an age of material abundance.

But AI agents will also have cascading consequences for the human experience online. “As AI systems become harder to distinguish from people, websites may face difficult trade-offs,” says a recent paper by researchers from 25 different universities, nonprofits, and tech companies, including OpenAI. “There is a significant risk that digital institutions will be unprepared for a time when AI-powered agents, including those leveraged by malicious actors, overwhelm other activity online.” On social-media platforms like X and Facebook, bot-driven accounts are amassing billions of views on AI-generated content.
In April, the foundation that runs Wikipedia disclosed that AI bots scraping its site were making the encyclopedia too costly to sustainably run. Later the same month, researchers from the University of Zurich found that AI-generated comments on the subreddit /r/ChangeMyView were up to six times more successful than human-written ones at persuading unknowing users to change their minds.

Photograph by Davide Monteleone for TIME

The arrival of agents won’t only threaten our ability to distinguish between authentic and AI content online. It will also challenge the Internet’s core business model, online advertising, which relies on the assumption that ads are being viewed by humans. “The Internet will change very drastically sometime in the next 12 to 24 months,” says Tools for Humanity CEO Alex Blania. “So we have to succeed, or I’m not sure what else would happen.”

For four years, Blania’s team has been testing the Orb’s hardware abroad. Now the U.S. rollout has arrived. Over the next 12 months, 7,500 Orbs will be arriving in dozens of American cities, in locations like gas stations, bodegas, and flagship stores in Los Angeles, Austin, and Miami. The project’s founders and fans hope the Orb’s U.S. debut will kickstart a new phase of growth. The San Francisco keynote was titled: “At Last.” It’s not clear the public appetite matches the exultant branding. Tools for Humanity has “verified” just 12 million humans since mid-2023, a pace Blania concedes is well behind schedule. Few online platforms currently support the so-called “World ID” that the Orb bestows upon its visitors, leaving little to entice users to give up their biometrics beyond the lure of free crypto. Even Altman isn’t sure whether the whole thing can work. “I can see [how] this becomes a fairly mainstream thing in a few years,” he says.
“Or I can see that it’s still only used by a small subset of people who think about the world in a certain way.”

Blania (left) and Altman debut the Orb at World’s U.S. launch in San Francisco on April 30, 2025. Jason Henry—The New York Times/Redux

Yet as the Internet becomes overrun with AI, the creators of this strange new piece of hardware are betting that everybody in the world will soon want—or need—to visit an Orb. The biometric code it creates, they predict, will become a new type of digital passport, without which you might be denied passage to the Internet of the future, from dating apps to government services. In a best-case scenario, World ID could be a privacy-preserving way to fortify the Internet against an AI-driven deluge of fake or deceptive content. It could also enable the distribution of universal basic income (UBI)—a policy that Altman has previously touted—as AI automation transforms the global economy. To examine what this new technology might mean, I reported from three continents, interviewed 10 Tools for Humanity executives and investors, reviewed hundreds of pages of company documents, and “verified” my own humanity. The Internet will inevitably need some kind of proof-of-humanity system in the near future, says Divya Siddarth, founder of the nonprofit Collective Intelligence Project. The real question, she argues, is whether such a system will be centralized—“a big security nightmare that enables a lot of surveillance”—or privacy-preserving, as the Orb claims to be. Questions remain about Tools for Humanity’s corporate structure, its yoking to an unstable cryptocurrency, and what power it would concentrate in the hands of its owners if successful. Yet it’s also one of the only attempts to solve what many see as an increasingly urgent problem. “There are some issues with it,” Siddarth says of World ID. “But you can’t preserve the Internet in amber.
Something in this direction is necessary.”

In March, I met Blania at Tools for Humanity’s San Francisco headquarters, where a large screen displays the number of weekly “Orb verifications” by country. A few days earlier, the CEO had attended a $1 million-per-head dinner at Mar-a-Lago with President Donald Trump, whom he credits with clearing the way for the company’s U.S. launch by relaxing crypto regulations. “Given Sam is a very high profile target,” Blania says, “we just decided that we would let other companies fight that fight, and enter the U.S. once the air is clear.” As a kid growing up in Germany, Blania was a little different than his peers. “Other kids were, like, drinking a lot, or doing a lot of parties, and I was just building a lot of things that could potentially blow up,” he recalls. At the California Institute of Technology, where he was pursuing research for a master’s degree, he spent many evenings reading the blogs of startup gurus like Paul Graham and Altman. Then, in 2019, Blania received an email from Max Novendstern, an entrepreneur who had been kicking around a concept with Altman to build a global cryptocurrency network. They were looking for technical minds to help with the project. Over cappuccinos, Altman told Blania he was certain about three things. First, smarter-than-human AI was not only possible, but inevitable—and it would soon mean you could no longer assume that anything you read, saw, or heard on the Internet was human-created. Second, cryptocurrency and other decentralized technologies would be a massive force for change in the world. And third, scale was essential to any crypto network’s value.

The Orb is tested on a calibration rig, surrounded by checkerboard targets to ensure precision in iris detection. Davide Monteleone for TIME

The goal of Worldcoin, as the project was initially called, was to combine those three insights. Altman took a lesson from PayPal, the company co-founded by his mentor Peter Thiel.
Of its initial funding, PayPal spent less than $10 million actually building its app—but pumped an additional $70 million or so into a referral program, whereby new users and the person who invited them would each receive $10 in credit. The referral program helped make PayPal a leading payment platform. Altman thought a version of that strategy would propel Worldcoin to similar heights. He wanted to create a new cryptocurrency and give it to users as a reward for signing up. The more people who joined the system, the higher the token’s value would theoretically rise. Since 2019, the project has raised $244 million from investors like Coinbase and the venture capital firm Andreessen Horowitz. That money paid for the $50 million cost of designing the Orb, plus maintaining the software it runs on. The total market value of all Worldcoins in existence, however, is far higher—around $12 billion. That number is a bit misleading: most of those coins are not in circulation and Worldcoin’s price has fluctuated wildly. Still, it allows the company to reward users for signing up at no cost to itself. The main lure for investors is the crypto upside. Some 75% of all Worldcoins are set aside for humans to claim when they sign up, or as referral bonuses. The remaining 25% are split between Tools for Humanity’s backers and staff, including Blania and Altman. “I’m really excited to make a lot of money,” Blania says.

From the beginning, Altman was thinking about the consequences of the AI revolution he intended to unleash. (On May 21, he announced plans to team up with famed former Apple designer Jony Ive on a new AI personal device.) A future in which advanced AI could perform most tasks more effectively than humans would bring a wave of unemployment and economic dislocation, he reasoned. Some kind of wealth redistribution might be necessary. In 2016, he partially funded a study of basic income, which gave $1,000 per-month handouts to low-income individuals in Illinois and Texas.
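To make the economics of a two-sided referral bonus concrete, here is a toy calculation. The $10-per-side figure comes from the PayPal anecdote above; the signup numbers and referral share are invented purely for illustration.

```python
def referral_spend(new_users: int, referred_fraction: float, bonus_per_side: float = 10.0) -> float:
    """Cost of a two-sided referral bonus: each referred signup pays the
    bonus out twice -- once to the new user, once to the inviter."""
    return new_users * referred_fraction * 2 * bonus_per_side

# Invented example: 5 million signups, 70% of them arriving via referral.
print(referral_spend(5_000_000, 0.7))  # 70000000.0 -> a $70M referral budget
```

The point of the model is how quickly the two-sided payout dwarfs a fixed engineering budget as signups scale, which is the same asymmetry the PayPal story describes.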
But there was no single financial system that would allow money to be sent to everybody in the world. Nor was there a way to stop an individual human from claiming their share twice—or to identify a sophisticated AI pretending to be human and pocketing some cash of its own. In 2023, Tools for Humanity raised the possibility of using the network to redistribute the profits of AI labs that were able to automate human labor. “As AI advances,” it said, “fairly distributing access and some of the created value through UBI will play an increasingly vital role in counteracting the concentration of economic power.”

Blania was taken by the pitch, and agreed to join the project as a co-founder. “Most people told us we were very stupid or crazy or insane, including Silicon Valley investors,” Blania says. At least until ChatGPT came out in 2022, transforming OpenAI into one of the world’s most famous tech companies and kickstarting a market bull-run. “Things suddenly started to make more and more sense to the external world,” Blania says of the vision to develop a global “proof-of-humanity” network. “You have to imagine a world in which you will have very smart and competent systems somehow flying through the Internet with different goals and ideas of what they want to do, and us having no idea anymore what we’re dealing with.”

After our interview, Blania’s head of communications ushers me over to a circular wooden structure where eight Orbs face one another. The scene feels like a cross between an Apple Store and a ceremonial altar. “Do you want to get verified?” she asks. Putting aside my reservations for the purposes of research, I download the World App and follow its prompts. I flash a QR code at the Orb, then gaze into it.
A minute or so later, my phone buzzes with confirmation: I’ve been issued my own personal World ID and some Worldcoin.

The first thing the Orb does is check if you’re human, using a neural network that takes input from various sensors, including an infrared camera and a thermometer. Davide Monteleone for TIME

While I stared into the Orb, several complex procedures had taken place at once. A neural network took inputs from multiple sensors—an infrared camera, a thermometer—to confirm I was a living human. Simultaneously, a telephoto lens zoomed in on my iris, capturing the physical traits within that distinguish me from every other human on Earth. It then converted that image into an iris code: a numerical abstraction of my unique biometric data. Then the Orb checked to see if my iris code matched any it had seen before, using a technique allowing encrypted data to be compared without revealing the underlying information. Before the Orb deleted my data, it turned my iris code into several derivative codes—none of which on its own can be linked back to the original—encrypted them, deleted the only copies of the decryption keys, and sent each one to a different secure server, so that future users’ iris codes can be checked for uniqueness against mine. If I were to use my World ID to access a website, that site would learn nothing about me except that I’m human. The Orb is open-source, so outside experts can examine its code and verify the company’s privacy claims. “I did a colonoscopy on this company and these technologies before I agreed to join,” says Trevor Traina, a Trump donor and former U.S. ambassador to Austria who now serves as Tools for Humanity’s chief business officer. “It is the most privacy-preserving technology on the planet.”

Only weeks later, when researching what would happen if I wanted to delete my data, do I discover that Tools for Humanity’s privacy claims rest on what feels like a sleight of hand.
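The uniqueness check at the heart of that pipeline can be illustrated with a toy sketch: two scans of the same iris produce codes that differ in a small fraction of their 12,800 bits, while codes from different irises differ in roughly half. The threshold, the noise model, and the plain-integer representation below are illustrative stand-ins only; the actual Orb performs this comparison on encrypted derivative codes, not raw ones.

```python
import random

CODE_BITS = 12_800  # the iris code is a 12,800-digit binary number

def hamming_fraction(a: int, b: int) -> float:
    """Fraction of bits on which two iris codes disagree."""
    return bin(a ^ b).count("1") / CODE_BITS

def is_new_human(candidate: int, enrolled: list[int], threshold: float = 0.35) -> bool:
    """A candidate counts as 'new' only if it is far from every enrolled code."""
    return all(hamming_fraction(candidate, seen) > threshold for seen in enrolled)

random.seed(0)
alice = random.getrandbits(CODE_BITS)

# Simulate a re-scan of the same eye: flip at most 500 of the 12,800 bits (~4% noise).
noise = 0
for _ in range(500):
    noise ^= 1 << random.randrange(CODE_BITS)
alice_rescan = alice ^ noise

bob = random.getrandbits(CODE_BITS)  # an unrelated iris differs in ~50% of bits

print(is_new_human(alice_rescan, [alice]))  # False: the re-scan matches the enrolled code
print(is_new_human(bob, [alice]))           # True: a different iris is far from it
```

The design choice this illustrates is why a fuzzy distance threshold, rather than an exact match, is needed: no two scans of the same eye are bit-for-bit identical, yet they remain far closer to each other than to any other person's code.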
The company argues that in modifying your iris code, it has “effectively anonymized” your biometric data. If you ask Tools for Humanity to delete your iris codes, they will delete the one stored on your phone, but not the derivatives. Those, they argue, are no longer your personal data at all. But if I were to return to an Orb after deleting my data, it would still recognize those codes as uniquely mine. Once you look into the Orb, a piece of your identity remains in the system forever. If users could truly delete that data, the premise of one ID per human would collapse, Tools for Humanity’s chief privacy officer Damien Kieran tells me when I call seeking an explanation. People could delete and sign up for new World IDs after being suspended from a platform. Or claim their Worldcoin tokens, sell them, delete their data, and cash in again. This argument fell flat with European Union regulators in Germany, who recently declared that the Orb posed “fundamental data protection issues” and ordered the company to allow European users to fully delete even their anonymized data. (Tools for Humanity has appealed; the regulator is now reassessing the decision.) “Just like any other technology service, users cannot delete data that is not personal data,” Kieran said in a statement. “If a person could delete anonymized data that can’t be linked to them by World or any third party, it would allow bad actors to circumvent the security and safety that World ID is working to bring to every human.”

On a balmy afternoon this spring, I climb a flight of stairs up to a room above a restaurant in an outer suburb of Seoul. Five elderly South Koreans tap on their phones as they wait to be “verified” by the two Orbs in the center of the room. “We don’t really know how to distinguish between AI and humans anymore,” an attendant in a company t-shirt explains in Korean, gesturing toward the spheres. “We need a way to verify that we’re human and not AI. So how do we do that?
Well, humans have irises, but AI doesn’t.”

The attendant ushers an elderly woman over to an Orb. It bleeps. “Open your eyes,” a disembodied voice says in English. The woman stares into the camera. Seconds later, she checks her phone and sees that a packet of Worldcoin worth 75,000 Korean won (about $54) has landed in her digital wallet. Congratulations, the app tells her. You are now a verified human.

A visitor views the Orbs in Seoul on April 14, 2025. Taemin Ha for TIME

Tools for Humanity aims to “verify” 1 million Koreans over the next year. Taemin Ha for TIME

A couple dozen Orbs have been available in South Korea since 2023, verifying roughly 55,000 people. Now Tools for Humanity is redoubling its efforts there. At an event in a traditional wooden hanok house in central Seoul, an executive announces that 250 Orbs will soon be dispersed around the country—with the aim of verifying 1 million Koreans in the next 12 months. South Korea has high levels of smartphone usage, crypto and AI adoption, and Internet access, while average wages are modest enough for the free Worldcoin on offer to still be an enticing draw—all of which makes it fertile testing ground for the company’s ambitious global expansion. Yet things seem off to a slow start. In a retail space I visited in central Seoul, Tools for Humanity had constructed a wooden structure with eight Orbs facing each other. Locals and tourists wander past looking bemused; few volunteer themselves up. Most who do tell me they are crypto enthusiasts who came intentionally, driven more by the spirit of early adoption than the free coins. The next day, I visit a coffee shop in central Seoul where a chrome Orb sits unassumingly in one corner. Wu Ruijun, a 20-year-old student from China, strikes up a conversation with the barista, who doubles as the Orb’s operator. Wu was invited here by a friend who said both could claim free cryptocurrency if he signed up. The barista speeds him through the process.
Wu accepts the privacy disclosure without reading it, and widens his eyes for the Orb. Soon he’s verified. “I wasn’t told anything about the privacy policy,” he says on his way out. “I just came for the money.”

As Altman’s car winds through San Francisco, I ask about the vision he laid out in 2019: that AI would make it harder for us to trust each other online. To my surprise, he rejects the framing. “I’m much more [about] like: what is the good we can create, rather than the bad we can stop?” he says. “It’s not like, ‘Oh, we’ve got to avoid the bot overrun’ or whatever. It’s just that we can do a lot of special things for humans.” It’s an answer that may reflect how his role has changed over the years. Altman is now the chief public cheerleader of a $300 billion company that’s touting the transformative utility of AI agents. The rise of agents, he and others say, will be a boon for our quality of life—like having an assistant on hand who can answer your most pressing questions, carry out mundane tasks, and help you develop new skills. It’s an optimistic vision that may well pan out. But it doesn’t quite fit with the prophecies of AI-enabled infopocalypse that Tools for Humanity was founded upon.

Altman waves away a question about the influence he and other investors stand to gain if their vision is realized. Most holders, he assumes, will have already started selling their tokens—too early, he adds. “What I think would be bad is if an early crew had a lot of control over the protocol,” he says, “and that’s where I think the commitment to decentralization is so cool.” Altman is referring to the World Protocol, the underlying technology upon which the Orb, Worldcoin, and World ID all rely. Tools for Humanity is developing it, but has committed to giving control to its users over time—a process they say will prevent power from being concentrated in the hands of a few executives or investors.
Tools for Humanity would remain a for-profit company, and could levy fees on platforms that use World ID, but other companies would be able to compete for customers by building alternative apps—or even alternative Orbs. The plan draws on ideas that animated the crypto ecosystem in the late 2010s and early 2020s, when evangelists for emerging blockchain technologies argued that the centralization of power—especially in large so-called “Web 2.0” tech companies—was responsible for many of the problems plaguing the modern Internet. Just as decentralized cryptocurrencies could reform a financial system controlled by economic elites, so too would it be possible to create decentralized organizations, run by their members instead of CEOs. How such a system might work in practice remains unclear. “Building a community-based governance system,” Tools for Humanity says in a 2023 white paper, “represents perhaps the most formidable challenge of the entire project.”

Altman has a pattern of making idealistic promises that shift over time. He founded OpenAI as a nonprofit in 2015, with a mission to develop AGI safely and for the benefit of all humanity. To raise money, OpenAI restructured itself as a for-profit company in 2019, but with overall control still in the hands of its nonprofit board. Last year, Altman proposed yet another restructure—one which would dilute the board’s control and allow more profits to flow to shareholders. Why, I ask, should the public trust Tools for Humanity’s commitment to freely surrender influence and power? “I think you will just see the continued decentralization via the protocol,” he says. “The value here is going to live in the network, and the network will be owned and governed by a lot of people.” Altman talks less about universal basic income these days.
He recently mused about an alternative, which he called “universal basic compute.” Instead of AI companies redistributing their profits, he seemed to suggest, they could instead give everyone in the world fair access to super-powerful AI. Blania tells me he recently “made the decision to stop talking” about UBI at Tools for Humanity. “UBI is one potential answer,” he says. “Just giving [people] access to the latest [AI] models and having them learn faster and better is another.” Says Altman: “I still don’t know what the right answer is. I believe we should do a better job of distribution of resources than we currently do.” When I probe the question of why people should trust him, Altman gets irritated. “I understand that you hate AI, and that’s fine,” he says. “If you want to frame it as the downside of AI is that there’s going to be a proliferation of very convincing AI systems that are pretending to be human, and we need ways to know what is really human-authorized versus not, then yeah, I think you can call that a downside of AI. It’s not how I would naturally frame it.” The phrase human-authorized hints at a tension between World ID and OpenAI’s plans for AI agents. An Internet where a World ID is required to access most services might impede the usefulness of the agents that OpenAI and others are developing. So Tools for Humanity is building a system that would allow users to delegate their World ID to an agent, allowing the bot to take actions online on their behalf, according to Tiago Sada, the company’s chief product officer. “We’ve built everything in a way that can be very easily delegatable to an agent,” Sada says. It’s a measure that would allow humans to be held accountable for the actions of their AIs. But it suggests that Tools for Humanity’s mission may be shifting beyond simply proving humanity, and toward becoming the infrastructure that enables AI agents to proliferate with human authorization. 
World ID doesn’t tell you whether a piece of content is AI-generated or human-generated; all it tells you is whether the account that posted it is a human or a bot. Even in a world where everybody had a World ID, our online spaces might still be filled with AI-generated text, images, and videos.

As I say goodbye to Altman, I’m left feeling conflicted about his project. If the Internet is going to be transformed by AI agents, then some kind of proof-of-humanity system will almost certainly be necessary. Yet if the Orb becomes a piece of Internet infrastructure, it could give Altman—a beneficiary of the proliferation of AI content—significant influence over a leading defense mechanism against it. People might have no choice but to participate in the network in order to access social media or online services.

I thought of an encounter I witnessed in Seoul. In the room above the restaurant, Cho Jeong-yeon, 75, watched her friend get verified by an Orb. Cho had been invited to do the same, but demurred. The reward wasn’t enough for her to surrender a part of her identity. “Your iris is uniquely yours, and we don’t really know how it might be used,” she says. “Seeing the machine made me think: are we becoming machines instead of humans now? Everything is changing, and we don’t know how it’ll all turn out.”

—With reporting by Stephen Kim/Seoul. This story was supported by Tarbell Grants.

Correction, May 30: The original version of this story misstated the market capitalization of Worldcoin if all coins were in circulation. It is $12 billion, not $1.2 billion.
  • Engadget Podcast: The AI and XR of Google I/O 2025

    Would you believe Google really wants to sell you on its AI? This week, we dive into the news from Google I/O 2025 with Engadget's Karissa Bell. We discuss how Gemini is headed to even more places, as well as Karissa's brief hands-on with Google's prototype XR glasses. It seems like Google is trying a bit harder now than it did with Google Glass and its defunct Daydream VR platform. But will the company end up giving up again, or does it really have a shot against Meta and Apple?

    Subscribe!

    iTunes
    Spotify
    Pocket Casts
    Stitcher
    Google Podcasts

    Topics

    Lots of AI and a little XR: Highlights from Google I/O 2025 – 1:15
    OpenAI buys Jony Ive’s design company for B, in an all equity deal – 29:27
    Fujifilm’s X Half could be the perfect retro camera for the social media age – 39:42
    Sesame Street is moving from HBO to Netflix – 44:09
    Cuts to IMLS will lead to headaches accessing content on apps like Libby and Hoopla – 45:49
    Listener Mail: Should I replace my Chromebook with a Mac or PC Laptop? – 48:33
    Pop culture picks – 52:22

    Credits 
    Hosts: Devindra Hardawar and Karissa Bell
    Producer: Ben Ellman
    Music: Dale North and Terrence O'Brien
    Transcript
    Devindra: What's up, internet, and welcome back to the Engadget Podcast. I'm Senior Editor Devindra Hardawar. I'm joined this morning by Senior Writer Karissa Bell. Hello, Karissa.
    Karissa: Hello. Good morning.
    Devindra: Good morning. And also podcast producer Ben Ellman. Hey Ben, I muted my dang self. Hello. Hello, Ben. Good morning. It's been a busy week, like it's one of those weeks where three major conferences happened all at once, of varying relevance to us. Google I/O is the big one. We'll be talking about that with Karissa, who was there and got to demo Google's XR glasses. But Computex was also happening, that's over in Taipei, and we got a lot of news from that too; we'll mention some of those things.
    Also, Microsoft Build happened, and I feel like this was the least relevant Build to us ever. I got one bit of news I can mention there. That's pretty much it. It's been a crazy hectic week for us over at Engadget. As always, if you're enjoying the show, please feel free to subscribe to us on iTunes or your podcast catcher of choice.
    Leave us a review on iTunes, drop us an email at podcast@engadget.com. Those emails, by the way, if you ask a good question, it could end up being part of our Ask Engadget section, so that's something we're starting out. I have another good one I'll be throwing to Ask Engadget soon. So send us your emails: podcast@engadget.com.
    Google I/O. It's all about AI, isn't it? I feel like, Karissa, we were watching the keynote for this thing, and it felt like it went on and on. We all pretty much expected more about Gemini AI, more about their newer models, a bit about XR. Can you give me, what's your overall impression of I/O at this point?
    Karissa: Yeah, it's interesting, because I've been covering I/O long enough that I remember back when it used to be Android. And then there'd be like that little section at the end about AI and some of the other stuff. And now it's completely reversed, where it's entirely AI and basically no Android, to the point where they had a whole separate event with their typical Android stuff the week before.
    So they didn't have to go through and talk about any of, yeah, the mobile things.
    Devindra: That was just a live stream, just like a chill live stream, no real effort put into it. Whereas this is the whole show. They had, who was it? They had Toro y Moi, yeah. They had actual music, which is something a lot of these folks do at keynotes.
    It's actually really disconcerting to see cool musicians taking the corporate gig and performing at one of these things. I think it was like 2013, 2014, maybe the Intel one, IDF or something, but The Weeknd was there, just trying to jam for all these nerds, and it was sad. But yeah, how was the experience, Karissa, like actually going there?
    Karissa: Yeah, it was good. That keynote is always kind of a slog. Just live blogging for almost two hours straight, just constant, it's a lot. I did like the music. Toro y Moi was very chill. It was a nice way to start. I much preferred it over the crazy Loop Daddy set we got last year.
    If anyone remembers that.
    Devindra: Yeah.
    Ben: Yeah. Oh, I remember that. Marc Rebillet was at I/O. That was so weird.
    Devindra: Yeah. Yeah, it was a little intense. Cool. So what are some of the highlights? There's a bunch of stuff. If you go look on the site on Engadget, we have rounded up all the major news, and that includes a couple of things like, hey, an AI Mode chatbot coming to Search.
    That's cool. I think the thing a lot of people were looking at was Project Astra and where that's gonna be going. And that is the sort of universal AI assistant where you could hold your phone up and just ask it questions about the world. We got another demo video about that.
    Which, again, the actual utility of it, I'm weirded out by. There was also one video where they were just like, I'm gonna be dumb, I'm gonna pretend I'm very stupid and ask Astra, what is this tall building in front of me? And it was like a fire hydrant or something. It was like some piece of street thing.
    It was not a really well done demo. Do you have any thoughts about that, Karissa? Does that seem more compelling to you now, or is it the same as what we saw last year?
    Karissa: I think what was interesting to me about it was that we saw Astra last year, and I think there was a lot of excitement around that, but it wasn't really entirely clear where that project is going. They've said it's like an experimental research thing. And then, I feel like this year they really laid out that they want to bring all that stuff to Gemini. Astra is sort of their place to tinker with this and get all this stuff working.
    But their end game is putting this into Gemini. You can already see it a little bit in Gemini Live, which is like their multimodal feature where you can do some version of what Astra can do. And so that was interesting. They're saying, we want Gemini to be this universal AI assistant.
    They didn't use the word AGI or anything like that. But I think it's pretty clear where they're going and what their ambition is. They want this to be an all-seeing, all-knowing AI assistant that can help you with anything, is what they're trying to sell it as.
    Devindra: It is weird. We're watching the demo video, and it's a guy trying to fix his bike, and he's pointing his phone at the bike and asking questions like which particular nut do I need for this tightening thing, and it's giving him good advice.
    It's pointing to things on YouTube. I don't know how useful this will actually be. This kind of goes to part of the stuff we're seeing with AI too, of just offloading some of the grunt work of human intelligence, because you can do this right now; people have been YouTubing to fix things forever.
    YouTube has become this information repository of just fix-it stuff or home plumbing or whatever. And now you'll be able to talk to your phone, and it'll direct you right to those videos or extract the actual instructions from those. That's cool. I feel like that's among the more useful things, more useful than putting Gemini right into Chrome, which is another thing they're talking about, and I don't know how useful that is, other than they wanna push AI in front of us, just like Microsoft wants to push Copilot in front of us at all times.
    Ben: What is a situation where you would have a question about your Chrome tabs? I'm not one of those people that has 15 Chrome tabs open at any given time, and I know that I am. Yeah, I know.
    Wait, you're saying that like it's high. Yeah, no, I know. So I have an abnormally low number of Chrome tabs open. But can you still come up with an idea of why you would ask Gemini anything about your own tabs? Hopefully you have them organized, at least.
    Karissa: They showed a few examples, like online shopping. Maybe you have two tabs of two different products open, and you can say
    Devindra: Exactly.
    Karissa: ask Gemini to, like, compare the reviews. Or they used the example of a recipe video, a recipe blog, and maybe you wanna make some kind of modification, make the recipe gluten-free. And you could ask Gemini, hey, how would I make this gluten-free?
    But I think you're right, it's not exactly clear. You can already just open a new tab and go to Gemini and ask it something. So they're just trying to reduce
    Devindra: friction. I think that's the main thing. The less you have to think about it, the more it's in your face; you can always just jump right to it.
    It's like, hey, you can Google search from your URL bar, your location bar, in any browser. We've just grown to use that, but that didn't used to be the case. I remember there used to be a separate Google field in some browsers, and it wasn't always there in every browser, too. They did announce some new models.
    We saw there's Gemini 2.5 Pro. There's a Deep Think reasoning model. There's also a Flash model that they announced for smaller devices. Did they show any good demos of the reasoning stuff? Because that's essentially slower AI processing to hopefully get you better answers with fewer flaws.
    Did they actually show how that worked, Karissa?
    Karissa: I only saw what we all saw during the keynote and I think it's, we've seen a few other AI companies do something similar where you can see it think like its reasoning process. Yeah. And see it do that in real time.
    But I think it's a bit unclear exactly what that's gonna look like.
    Devindra: Watching a video: oh, Gemini can simulate nature, simulate light, simulate puzzles, turn images into code.
    Ben: I feel like the big thing, yeah, a lot of this stuff is from DeepMind, right? This is DeepMind, an Alphabet company.
    Devindra: DeepMind, an Alphabet company. There is DeepMind. This is Deep Think, and don't confuse this with DeepSeek, which is the Chinese AI company, and they clearly knew what they were doing when they called it that thing, Deep Think. But no, yeah, this is partially stuff coming out of DeepMind.
    DeepMind, a company which Google has been doing stuff with for a while, and we just have not really seen much out of it. So I guess Gemini and all their AI processes are a way to do that. We also saw something that got a lot of people, we saw
    Ben: Nobel Prize from them. Come on.
    Devindra: Hey, we did see that.
    What does that mean? What is that even worth anymore? That's an open question. They also showed off a new video tool called Flow, which I think got a lot of people intrigued, because it's using a new Veo 3 model, so an updated version of what they've had for video for a while.
    And the results look good. The video looks higher quality; humans look more realistic. The interesting thing about Veo 3 is it can also do synchronized audio, to actually produce audio and dialogue for people too. So people have been uploading videos around this stuff online at this point, and you have to subscribe to the crazy high-end version of Google's subscription to even test out this thing at this point, that is the AI Ultra plan that costs a month. But I saw something of, yeah, here's a pretend tour of a make-believe car show. And it was just people spouting random facts. So yeah, I like EVs. I would like an EV.
    And then it looks realistic, they sound synchronized, like you could think this is a normal person. Then they just kinda start laughing at the end for no reason. Weird little things. It's like if you see a sociopath try to pretend to be a human for a little bit. There's real Patrick Bateman vibes from a lot of those things, so I don't know.
    It's fun. It's cool. I think there's, so didn't we
    Ben: announce that they also had a tool to help you figure out whether or not a video was generated by Flow? They did announce that
    Devindra: too.
    Ben: I've, yeah, go ahead. Go
    Karissa: ahead. Yeah. SynthID, they've been working on that for a while. They talked about it last year at I/O.
    That's their digital watermarking technology. And the funny thing about this is, the whole concept of AI watermarking is that you put these invisible watermarks into AI-generated content. You couldn't see them just watching this content.
    But you can go to this website now and basically double-check if it has one of these watermarks. On one hand, I think it's important that they do this work, but I also just wonder how many people are gonna see a video and think, I wonder what kind of AI is in this.
    Let me go to this other website and double-check it like that. Just,
    Ben: yeah. The people who are most likely to immediately believe it are also the least likely to go to the website and be like, I would like to double check
    Devindra: this. It doesn't matter, because most people will not do it, and the damage will be done.
    Just having super hyper-realistic AI video, you can essentially make anything happen. It's funny that the big bad AI guy in the new Mission: Impossible movies, the Entity, one of the main things it does is, oh, we don't know what's true anymore, because the Entity can just fabricate reality at whim.
    We're just doing that. We're just doing that for, I don't know, for fun. I feel like this is a thing we should see in all AI video tools. This doesn't really answer the question that everyone's having, though: what is the point of these tools? Because it does devalue filmmaking; it devalues people using actual actors, or going out and actually shooting something.
    Did Google make a better pitch for why you would use Flow, Karissa, or how it would fit into actual filmmaking?
    Karissa: I'm not sure they did. They showed that goofy Darren Aronofsky trailer for some woman who was trying to make a movie about her own birth, and it seemed like it was trying to be in the style of some sort of psychological thriller, but it just, I don't know, it just felt really weird to me.
    I was just like, what are we watching?
    Ben: Was there any good backstory about why she was doing that, or was it just, hey, we're doing something really weird?
    Karissa: No, she was just, oh, you know what? I wanna tell the story of my own birth. And, okay.
    Ben: Okay, but why is your birth more relatable? Listen, I need more details. Why is your birth more important? Everybody wants to write a memoir, like one of three ways, or something.
    Devindra: Yeah, it's like everybody who wants to write a memoir. It's kinda the same thing, kinda that same navel-gazing thing.
    The project's just called Ancestral. I'm gonna play a bit of the trailer here. I remember seeing this; it reminds me of that footage, I dunno if you guys remember seeing Look Who's Talking for the very first time or something, or those movies where they showed a lot of things about how babies are made.
    And as a kid I was like, how'd they make that? How'd that get done? They're doing that now with AI video and Ancestral, this whole project. It is kinda sad, because Aronofsky is one of my favorite directors when he is on; he has made some of my favorite films. But also he's a guy who has admittedly stolen ideas and concepts from people like Satoshi Kon, specific framing of scenes and things like that.
    Shots in Requiem for a Dream are in some of Kon's movies as well. So I guess it's to be expected, but it is sad, because Hollywood as a whole, the unions certainly, do not like AI video. There was a story about James Earl Jones' voice being used as Darth Vader in Fortnite. In Fortnite, yeah.
    Which is something we knew was gonna happen, because Disney licensed the rights to his voice from his estate; he went in and recorded lines before he died to at least create a better simulation of his voice. But people are going out there making that Darth Vader swear and say bad things in Fortnite, and the WGA, or is it SAG?
    It's probably SAG. But the unions are pissed off about this, because they did not know this was happening ahead of time, and they're worried about what this could mean for the future of AI talent. Flow looks interesting. I keep seeing people play with it. I made a couple videos, asked it to make, hey, show me three cats living in Brooklyn with a view of the Manhattan skyline or something.
    And it did that, but the apartment it rendered didn't look fully real. It had weird heating things all around. And also, apparently, if you just subscribe to the basic plan to get access to Flow, you can use Flow, but that's using the Veo 2 model, so an older AI model. To get Veo 3, again, you have to pay a month.
    So maybe that'll come down in price eventually, but we shall see. The thing I really want to talk with you about, Karissa, is what the heck is happening with Android XR? And that is a weird project for them, because I was writing up the news and they announced a few things.
    They were like, hey, we have a new developer release to help you build Android XR apps. But it wasn't until the actual I/O show that they showed off more of what they were actually thinking about. And you got to test out a pair of prototype Google XR glasses powered by Android XR. Can you tell me about that experience, and just how does it differ from the other XR things you've seen? Who is it from? You've seen Meta's, you saw one from Snap, right?
    Karissa: I've seen Snap, yeah. I've seen the Xreal ones, some of the other smaller companies I got to see at CES. Yeah, that was a bit of a surprise. I know that they've been talking about Android XR for a while; I feel like it's been a little more in the background. So they brought out these glasses, and the first thing that I noticed about them was that they were actually pretty small and normal looking compared to Meta Orion or the Snap Spectacles.
    Like, these were very thin, which was cool. But the display was only on one side, only on one lens. They called it a monocular display. So there's one lens on one side, so it's basically just a little window, very small field of view.
    Devindra: We could see it, if you go to the picture on top of Karissa's hands-on piece, you can see the frame out of what that lens would be. Yeah.
    Karissa: Yeah. And I noticed, even when we were watching that demo video that they did on stage, that the field of view looked very small. It was even smaller than Snap's, which is 35 degrees. If I had to guess, I'd say it's maybe around 20.
    They wouldn't say what it was. They said, this is a prototype, we don't wanna say. The way I thought about it, the way I compared it in my piece, was like the front screen on a foldable phone. You can get notifications and you can glance at things, but it's not fully immersive AR. It's not surrounding your space and really changing your reality, in the way that Snap and Meta are trying to do. Later, when I was driving home, I realized a better comparison might be the heads-up display in your car.
    Speaker: Yeah. Yeah.
    Karissa: If you have a car that has that little HUD where you can see how fast you're going, and directions, and stuff like that.
    Devindra: That's what Google Glass was doing too, right? Because that was a little thing off to the side of your vision; it was never a full take-over-your-vision type of thing.
    Karissa: Yeah. It's funny, that's what our editor Aaron said when he was editing my piece. He was like, oh, this sounds like Google Glass.
    And I'm like, no, it actually is better than that. These are normal looking glasses. I tried Google Glass many years ago; the fidelity here was better. Actually, I was thinking it feels like a happy medium almost between the Meta Ray-Bans and full AR. I've had Meta Ray-Ban glasses for a long time, and people always ask me, when I show them to someone, they're like, oh, that's so cool.
    And then they go, but you can see stuff, right? There's a display? And I'm like, no, these are just glasses with a speaker. And I feel like this might be a good kind of in-between thing, because you have a little bit of display, but they still look like glasses. They're not bulky, 'cause they're not trying to do too much. One thing I really liked is that when you take a photo, you actually get a little preview of that image that floats onto the screen, which was really cool, because it's hard to figure out how to frame pictures when you're taking them with the camera on your smart glasses.
    So I think there's some interesting ideas, but it's very early. Obviously they want Gemini to be a big part of it. The Gemini stuff was busted in my demo.
    Devindra: You also said they don't plan on selling these; they're purely, hey, this is what could be a thing. But they're not selling these specific glasses, right?
    Karissa: Yeah, these specific ones are like, this is a research prototype. But they did also announce a partnership with Warby Parker and another glasses company. So I think you can see them trying to take a Meta approach here, which actually would be pretty smart, to say, let's partner with a known company that makes glasses, they're already popular. We can give them our tech expertise, they can make the glasses look good, and maybe we'll get something down the line. I actually heard a rumor that the prototype was manufactured by Samsung.
    They wouldn't say.
    Devindra: Of course, Samsung wants to be all over this. Samsung is the one building the full-on Android XR headset, which is a sort of Vision Pro copycat. It is Moohan, yeah, Project Moohan. It has displays with the passthrough camera. That should be coming later this year.
    Go ahead, Ben.
    Ben: Yeah. Question for Karissa. When Sergey Brin was talking about Google Glass, did that happen before or after the big demo for the Google XR glasses?
    Karissa: That was after. That was at the end of the day. He was a surprise guest in this fireside chat with the DeepMind CEO. And yeah, we were all wondering about that,
    'cause we all, Devindra probably remembers this very well, when Google Glass came out, and Sergey had skydivers wearing them into I/O.
    Speaker: Yep.
    Karissa: And then, now, for him to come back and say, we made a lot of mistakes with that product, and.
    Ben: But was it mistakes, or was it just the fact that the technology was not there yet? Because he was talking about the consumer electronics supply chain, blah, blah, blah.
    Devindra: He's right that the tech has caught up with the vision of what they wanted to do. But also, I think he fundamentally misread that people will see you looking like the goddamn Borg and want to destroy you. You will turn into Captain Picard and be like, I must destroy whoever is wearing Google Glass, because this looks like an alien trying to take over my civilization.
    And the thing that Meta did right, that you've seen, Karissa, is they make 'em look like normal glasses. Yeah, but nobody knows,
    Ben: Karissa does not look entirely human in this picture either.
    Karissa: Yes. But listen, if you see 'em straight on, they look transparent. I used that photo because I was trying to.
    Devindra: Get the angle, show the display.
    Karissa: Yeah.
    Devindra: Yeah. There's another one where, like, this looks normal. This looks totally normal. The glasses themselves look like typical hipster glasses; there's not a super big frame around them. The arms seem big, the arms seem wider than on a typical pair of glasses, but you wouldn't know that, 'cause it's covered by your hair.
    A lot of people won't notice glasses arms as much.
    Ben: Yeah,
    Devindra: that is cool. The issue
    Ben: still is that all of these frames are so chunky. And it's because you need to hide all of the internals and everything, but you're not gonna get the beautiful, thin Japanese titanium frames anytime soon. No, because this stuff needs to shrink way more.
    Devindra: That's not the kind of frames they are. I will say, I had a meeting with, I believe, the CEO of Xreal. I did talk to them at CES, so they had a lot of ideas about that. I talked to the head of Spacetop, which is the company that was doing the sort of AR laptop thing.
    And then they gave up on that idea, because AI PCs have the NPUs that they need to do that stuff. And they're all in on the idea that more people will want to use these sorts of glasses, maybe not all the time, but for specific use cases. Something that covers your field of vision more could be a great thing when you sit down at your desk. I could see people doing this. I could see people getting these glasses. I don't know if it's gonna be good for society, right? It feels like when Bluetooth headsets were first popping up, and everybody hated those people, and you're like, oh, we must shun this person from society.
    This one, you can't quite see the screen, so you can pretend to be a normal human and then have this augmented ability next to you. If they can actually hide the fact that you have a display on your glasses, that would help people like me who are face blind, and I walk around like, I know this person, I've seen them before, what is their name? What is their name? I could see that being useful.
Ben: On the other side of it, though, if you have one standard look for glasses like this, then you know, oh, this person is also interacting with information and stuff that's popping up in front of their eyes. It's a universal signifier, just like having a big pair of headphones is.
Devindra: I think you will see people looking off into the distance. Karissa, did you notice that your eye line was moving away from people you were talking to while you were wearing these?
Karissa: Yeah, and that was also one of the issues I had: the actual display didn't quite render right. I'm not a farsighted person, but I had to look farther off in the distance to get my eyes to focus on it. I asked them about that, and they're like, oh, it's a prototype, it's not quite dialed in. They weren't calibrating these things to your eyeballs, the way that when I did the Meta Orion demo, they had to take these specific measurements because there's eye tracking and all these things; this didn't have any of that. So yeah, there definitely was that thing where somebody's talking to you, but you're looking over here.
Devindra: That's not great. That's not great for society. You're having a conversation with people. I like how they're framing this: oh yes, you can be more connected with reality 'cause you don't have a phone in front of your face, except you always have another display in front of your face, which nobody else can see, and you're gonna look like an alien walking around.
They showed some videos of people using it for street navigation, which I kinda like: you're in a new city, you'll see the arrows and where to turn and stuff. That's useful. But there was one that was really overwrought: a couple dancing at sunset, and the guy is like, take a picture of this beautiful moment of the sun peeking through behind my lady friend.
And it just felt like, is that what you wanna do in that moment? You wanna talk to your virtual assistant when you should be enjoying the fact that you're having this beautiful dancing evening, which nobody will ever actually have? So that's the whole thing. I will say, my overall thoughts on this stuff, just looking at what they showed before they actually showed us the glasses: it doesn't feel like Google is actually that far along in terms of making this a reality.
Karissa, like, I'm comparing it to where Meta is right now, and even where Apple is right now. When Apple showed us the Vision Pro, we were able to sit down, and I had a 30-minute demo of that thing working. I saw the vision of what they were doing, and they had thought a lot about how this would work.
How long was your demo with this thing?
Karissa: I was in the room with them for about five minutes, and I had them on for about three minutes myself.
Devindra: That's not a demo. That's not a demo.
    Ben: Oh, goodness. So all of these pictures were taken in the same 90 seconds? Yes. Yeah. God. That's amazing.
    Devindra: It's amazing you were able to capture these impressions, Karissa.
    Yeah,
Karissa: I will say that they did apparently have a press event in December where people got to see these things for a lot longer, but they could not shoot them at all. A lot of us were wondering if that was why it was so constrained. They only had one room, and there were hundreds of people basically lining up to try these out.
And they were very strict: you got five minutes, and somebody's in there after a couple of minutes rushing you out, and we're like, okay.
Devindra: They clearly only have a handful of these; that's the main reason this is happening. This is the company that did Google Glass, and that was too early and also maybe too ambitious.
But also don't forget Google Cardboard, which was a fun little project for getting phone-based VR happening, and Daydream VR, their own headset platform, which was cool. That was when Samsung was doing the thing with Oculus at the time as well. And they gave up on those things completely. Google's not a company I trust with consumer hardware in general. So I don't think there is a huge future in Android XR, but they wanna be there. They wanna be where Meta is and where Apple is, and we shall see. Anything else you wanna add about I/O, Karissa?
    Karissa: No, just that AI.
A-I, A-I, A-I.
Devindra: AI, IO, A-I-O... Oh, Starline. The thing that was that weird 3D-rendered teleconferencing video is becoming a real thing; that's turning into Google Beam. But it's gonna be an enterprise thing. They're teaming up with HP to bring a scaled-down version of that to businesses.
I don't think we'll ever really see it. That's one of those things where it's like, oh, this exists in some corporate offices that will pay for it, but normal people will never interact with this thing, so it practically just does not exist. So we shall see. Anyway, stay tuned, because we're gonna have more demos of the Gemini stuff.
We'll be looking at the new models, and certainly Karissa and I will be looking hard at Android XR and wherever the heck that's going.
Let's quickly move on to other news. I just wanna say there were other events. At Computex, we wrote up a whole bunch of laptops, and AMD announced a cheaper Radeon graphics card; go check out our stories on that stuff. And Build: I wrote one story, and I got a 70-page book of news from Microsoft about Build, and 99% of that news just does not apply to us, because Build is so fully a developer coding conference. Hey, there's more Copilot stuff. There's a Copilot app coming to Microsoft 365 subscribers, and that's cool, but not super interesting. I would say the big thing that happened this week, and that surprised a lot of us, is the news that OpenAI has bought
Jony Ive's design startup for six and a half billion dollars. This is a wild story, which is also paired with a weird picture. It looks like they're getting married; it looks like they're announcing their engagement over here, because Jony Ive is just leaning into him, and their heads are touching a little bit.
It's so adorable. You're not showing
Ben: the full website, though. The full website has a script font. It literally looks like something from The Knot.
Devindra: Does it? Yeah, let's look here. "Sam and Jony introduce io." "This is an extraordinary moment. Computers are now seeing, thinking, understanding." Please come to our ceremony at this coffee shop.
For some reason, they also produced this coffee shop video to really show this thing off, and it is wild to me. Let me pull this up over here.
Ben: While we're doing that: Karissa, what do you have to say about this?
Karissa: I'm trying to remember... so I know this is Jony Ive's AI thing, because he also has LoveFrom, which is still...
Devindra: This is LoveFrom, this is... so, let me get the specifics of the deal out here.
Yeah. As part of the deal, Ive and his design studio LoveFrom (is it LoveFrom or love form? LoveFrom, yeah) are gonna work independently of OpenAI. But Scott Cannon, Evans Hankey, and Tang Tan, who co-founded io (this is another "io," I hate these)... yeah, so io is his AI-
    Karissa: Focused design thing.
And then LoveFrom is like his design
    Devindra: studio thing.
    Karissa: Sure. Yeah. I'm just, he
    Devindra: has two design things.
Karissa: I'm trying to remember what they've done. I remember there was a story about how they made a really expensive jacket with some weird buttons or something.
    Devindra: Yep. I do remember that.
Karissa: I was just trying to rack my brain about what Jony Ive has really done in his post-Apple life. I feel like we haven't... he's made
Devindra: billions of dollars, of course. That's what's happened. Yes, because he is still an independent man; clearly he's an independent contractor. But the other side of io,
which includes those folks: they will become OpenAI employees, alongside 50 other engineers, designers, and researchers. They're gonna be working on AI hardware. It seems like Jony Ive will come in with ideas, but this is not quite a marriage; he's not quite committing. He's just taking the money and saying, here, you can have part of my AI startup for six and a half billion dollars.
Ben: Minus taxes. It's all equity though, so this is all paper money. Six and a half billion dollars of OpenAI's crazy valuation; who knows how much it's actually going to be worth. But all these people are going to sell a huge chunk of stock as soon as OpenAI goes public anyway.
    So it's still gonna be an enormous amount of money.
Devindra: Let me see here... the latest: OpenAI has raised $57.9 billion in funding over 11 rounds. Good Lord. So anyway, a big chunk of that is going to this thing, because I think what happened is that Sam Altman clearly just wants to be Steve Jobs.
I think that's what's happening here. All of you, go look at the announcement video for this thing, because it is one of the weirdest things I've seen. It is Jony Ive walking through San Francisco, Sam Altman walking through San Francisco with his hands in his pockets. There's a whole lot of setup to these guys meeting in a coffee shop, and then they sit there at the coffee shop like normal human beings and deliver an announcement video talking to nobody.
They're just talking to the middle of the coffee bar. I don't know who they're addressing; sometimes they refer to each other and sometimes they refer to the camera, but they're never looking at the camera. This is just a really wild thing. Also, it's yet another thing that makes me believe Sam Altman is not a real human boy.
I think there is actually something robotic about this man, because I can't see him actually performing like this in real life. And what are they gonna do? They reference vagaries, that's all; we don't know what exactly is happening. There is a quote from Jony Ive: "The responsibility that Sam shares is honestly beyond my comprehension."
Responsibility for what? Just building this giant AI thing? For humanity? Yeah, for humanity, just unlocking expertise everywhere. Sam Altman says he has some sort of AI device and it's changed his life. We don't know what it is; we don't know what they're actually working on. They announced nothing here.
But Jony Ive is very happy, because he has just made billions of dollars. He's not getting all of that money, but I think he's very pleased with this arrangement. And Sam Altman seems pleased that, oh, the guy who designed the iPhone and the MacBook can now work for me. And Jony Ive also says the work here at OpenAI is the best work he's ever done.
    Sure. You'd say that. Sure. By the way.
    Karissa: Sure. What do you think Apple thinks about all this?
    Devindra: Yeah,
Karissa: Their AI program is flailing, and their star designer, who, granted, separated from Apple a while ago, is now teaming up with Sam Altman on some future computing AI hardware, while they can't even get AI Siri to work.
That must be like a gut punch for folks there.
Ben: Maybe on the other side of it, though, I don't think it's sour grapes to ask: are they going into the Friend territory? Friend isn't even out yet, but like the Humane Pin? Or any of the other AI-sidekick sort of things? That whole idea has already crashed and burned spectacularly twice.
Devindra: I think Apple maybe dodged a bullet here, because the only reason Jony Ive is working on this thing is that OpenAI had put some money into io years ago, so they already had some sort of collaboration. And he's just like, okay, people are interested in AI; what sort of beautiful AI device can I make? The thing is, Jony Ive unchecked as a designer leads to maddening things, like the Magic Mouse that charges from the bottom, the butterfly
    Karissa: keyboard,
Devindra: The butterfly keyboard, yeah. Beautiful, but not exactly functional. Jony Ive always worked best when he had the opposing force of somebody like a Steve Jobs, who could say, no, this idea is crazy, or rein it in, or push for something more functional. Steve Jobs was not a great dude in many respects, but at the very least he was able to hone in on product ideas and think a lot about how humans use products. I don't think Jony Ive on his own can do that.
I don't think Sam Altman can do that either, because this man can barely sit and have a cup of coffee with someone like a human being. So, whatever this is... honestly, Karissa, I feel like Apple has dodged a bullet, because this is jumping into the AI gadget trend. Apple just needs to get the software right, because they have the devices, right?
We're wearing Apple Watches, people have iPhones, people have MacBooks. What they need to do is solidify the infrastructure, the AI smarts, between all those devices. They don't need to go out and sell a whole new device. OpenAI is a new company, though, so they can try to make an AI device a thing.
I don't think it's super compelling, but let us know, listeners, if any of this lands for you. Go listen to this chat of them talking about nothing: unlocking human greatness, unlocking expertise just through AI, through some AI gadget. I don't quite buy it. I think it's kind of garbage, but yeah.
    Ben: Anything else you guys wanna say about this?
This is coming from the same guy who, when asked in an interview what college students should study, said "resilience."
Karissa: Yeah. I just think all these companies want to make the thing that's the next iPhone, so they can all stop relying on Apple. It's the thing Mark Zuckerberg has with all of their hardware projects. And by the way, one of the stories said that the Jony Ive thing has maybe been working on some kind of earbuds with cameras on them, which sounded very similar to a thing that has been rumored about Meta for a long time. And also Apple...
Devindra: Yeah, there were rumors about AirPods with cameras.
Karissa: Yeah. And everyone's just trying, I think, to make the thing that's not an iPhone but will replace our iPhones. Good luck to them.
Devindra: Good luck with that, because I think that is coming from a fundamentally broken purpose; the whole reason for doing it is just to try to outdo the iPhone. I was thinking about this: how many companies are like Apple, which was printing money with iPods and still said, hey, we actually have a new thing, and this will entirely kill our iPod business?
This new thing will destroy the existing business that is working so well for us. Not many companies do that. That's the innovator's dilemma that comes back and bites companies in the butt. That's why Sony held off so long on jumping into flat-screen TVs: they were the world's leader in CRTs, in Trinitron, and they're like, we're good, we're good, into the nineties. And then they completely lost the TV business. That's why Toyota was so slow to EVs: they're like, hybrids are good to us, hybrids are great, we don't need an EV for a very long time. And then they released an EV where the wheels fell off. So it comes for everybody.
I dunno. I don't believe in these devices. Let's talk about something that could be cool, something that is a little unrealistic, I think, but for a certain aesthetic it is cool. Fujifilm announced the X half today. It is a digital camera with an analog film aesthetic, and it shoots in a three-by-four portrait aspect ratio.
That's the Instax Mini ratio. It looks like an old-school Fuji camera. This thing is pretty wild because of the screen; it's only making those portrait photos and videos. One of the key selling points is that it can replicate some things you get from film: there's a light leak simulation, for when you overexpose film a little bit, and halation, and that's something
    Ben: that Fujifilm is known for.
Devindra: Yes. They love that. They love these film simulation modes. This is such a social media kid camera, especially for the people who cannot afford Fujifilm's compact cameras.
Ben: Wow, even the screen is vertical. Do you wanna take some vertical photographs for your social media? Because vertical video has completely won.
Devindra: It can take video, but it is just a simple little device. It has that latch that you hit to wind film, the film advance lever. You can put it into a film photography mode where you don't see anything on the screen; you have to use the viewfinder to take pictures, and it starts a countdown. You can tell it to shoot a film roll's worth of pictures, and you have to wind the lever to take your next picture. You can combine two portrait photos together. It's really cool. It's really cute.
It's really unrealistic for a lot of folks, I think. But hey, social media kids, influencers, the people who love to shoot stuff for social media and vertical video: this could be a really cool little device. What do you guys think about this?
Karissa: You know what this reminds me of? Do you remember in the early Instagram days, when there were all these apps, like Hipstamatic, that tried to emulate film aesthetics? Some of them would do these same things, where you would take the picture but you couldn't see it right away, 'cause it had to "develop." They even had a light leak thing. And now we've come full circle, where the camera companies are basically doing their own spin on that.
Devindra: It only took them 15 years to really jump on this trend. But yes, everybody was trying to emulate classic cameras, and Fuji was like, oh, you want things that cost more but do less? Got it. That's the Fujifilm X half, and I think this thing will be a huge success. What you're talking about, Karissa: there is a mode where you won't see the picture immediately. It has to "develop" in their app, and then you'll see it eventually.
That's cool, honestly. I love this. I would not want it to be my main camera, but I would love to have something like this to play around with, when you can just be a little creative and pretend to be a street photographer for a little bit.
Oh man, this would be huge in Brooklyn.
Ben: Tom Rogers says "cute, but stupid tech." I think that's the perfect summary.
Devindra: And I would say, compared to the AI thing, which is like, what is this device, what are you gonna do with it, and feels like a lot of nothing, this is a thing you hold, it takes cool pictures, and you share them with your friends. It is such a precise thing, even though it's very expensive for what it is. I would say, if you're intrigued by this, you can get cheap compact cameras, get used cameras; I only ever buy refurbished cameras. You don't necessarily need this, but, oh man...
Karissa: Having a Fujifilm camera is a status symbol anyway. So I don't know. $850 still seems a little steep for what's basically a little toy camera. But also, I see it and I'm like, ooh, that looks nice.
Devindra: Yeah. It's funny, the PowerShots that kids are into now, from the two thousands, used to cost like 200 to 300 bucks, and I thought, oh, that is a big investment in a camera. Then I stepped up to Sony mirrorless cameras, which were like 500 to 600 or so, and I'm like, okay, this is a bigger step up than even that.
Most people would be better off with a mirrorless camera, but those things are also bigger than this tiny little pocket camera. I dunno, I'm enamored with this whole thing. Also, briefly, in other news: we saw that apparently Netflix is the one that's jumping in to save Sesame Street, and Sesame Street will air on Netflix and PBS simultaneously.
That's a good thing, because there was previously a delay when HBO was in charge.
Ben: Oh, really?
Devindra: Yeah. They would get the new episodes, and I forget how long the delay actually was, but it would be a while before new stuff hit PBS. I don't love that so much of our entertainment and pop culture now relies on streamers for everything while the big media companies are just disappointing us, but this is a good move. I think Sesame Street should stick around, especially with federal funding being killed left and right for public media like this. Sesame Street is still good. My kids love it. When my son starts leaning into his Blippi era, I just kinda slowly tune that out: here's some Sesame Street. I got him into Pee-wee's Playhouse, which is the original Blippi. I'm like, yes, let's go back to the source, because Pee-wee was a good dude. And that show still holds up; that show is so much fun. A great introduction to camp for kids, and a great introduction to diverse neighborhoods, just like Sesame Street.
Ben: Mr. Rogers was doing it before, I think.
Devindra: Mr. Rogers was doing it really well too. But Pee-wee was always something special, because Pee-wee's wild. Laurence Fishburne was on Pee-wee. There's just a lot of cool stuff happening there.
Looking back at it now as an adult, it is a strange thing to watch. But anyway, great to hear that Sesame Street is back. Another thing, not so quick...
Ben: Yeah, let me do this one, if I may. So, if you have any trouble getting audiobooks on Libby or Hoopla or any of the other interlibrary loan systems that you can access on your phone, iPad, or any tablet, that's because of the US government. A while ago, the Trump administration signed yet another executive order saying it wanted to cut a bunch of funding to the Institute of Museum and Library Services, the IMLS, and they're the ones who help "circulate" (big quotation marks there, since it's digital files) all of these things through interlibrary loans, so you can get the audiobooks that you want. The crazy thing about this is that the IMLS was created in 1996 by a Republican-controlled Congress. What's the deal here, guys? There's no waste, fraud, and abuse here. But if you have problems getting audiobooks, you can tell a friend, and if anybody's complaining about why their library selection on Libby went down by a lot recently, now you have the answer.
Devindra: It is truly sad. A lot of what's happening is just meant to reduce access to information, because, hey, a well-informed population is dangerous to anybody in charge, right? Terrible news. Let's move on to stuff that's happening around Engadget.
I wanna quickly shout out that Sam Rutherford has reviewed the Asus ROG Flow Z13, the sort of Surface-like gaming tablet with the Ryzen AI Max chip. That's cool. Sam seems to like it, though it's not exactly stealthy. He gave it a 79, which is right below the threshold we have for recommending new products, because this thing is expensive.
You're paying a lot of money to essentially get a gaming tablet. I tested it out at CES, and it is cool that it actually works. For a certain type of person, with too much money, who just needs the lightest gaming thing possible, I could see it being compelling. As for the starting price: Sam says it costs the same or more as a comparable ROG Zephyrus G14 with a real RTX 5070. That is a great laptop; we have praised the ROG Zephyrus G14 so much. So this is not really meant for everybody. Asus loves to do these experiments, and they're getting there in terms of creating a gaming tablet, but it's not quite something I'd recommend for everybody at this point.
All right. We have a quick email from a listener too. Thank you for sending this in, Jake Thompson. If you wanna send us an email, it's podcast@engadget.com, and your emails may head into our Ask Engadget section. Jake says he's a real estate agent in need of a new laptop. He uses a Chromebook right now and it meets every need he has; everything he does is web-based. But should he consider alternatives to a premium Chromebook for his next computer? He says he doesn't mind spending more if he can get something lightweight and trustworthy with solid battery life. What would we consider in the search? I would immediately point Jake to our laptop guides, because literally everything we mention there, the MacBook Air,
the Asus Zenbook S 14, even the Dell XPS 13, would be not much more than that price, and I think more useful than a premium Chromebook, because I think the idea of a premium Chromebook is insanity. I don't know why you'd spend so much money on a thing that can only do web apps. Cheap Chromebooks and mid-range Chromebooks? Fine, even great. But if you're spending that much money and you want something more reliable that you can do more with, even if everything you're doing is web-based, there may be other things you wanna do. With a MacBook or a Windows laptop, there is so much more you can unlock: a little bit of gaming, a little bit of media creation.
I don't know. Karissa, Ben, do you have any thoughts on this? What would you recommend, or would you guys be fine with the Chromebook?
Karissa: I like Chromebooks. But my first thought, and maybe this is too out there: would an iPad Pro fit those requirements? 'Cause you can do a lot with an iPad Pro.
Devindra: You can do a lot.
Karissa: Great battery, lightweight, lots of apps. If most everything he's doing is web-based, he can probably use iPad apps.
Devindra: That's actually a good point, Karissa. You can do a lot with an iPad, and the iPad Pro does start at around this price too. It would be much lighter and thinner than a laptop.
Especially if you're doing a lot of web stuff. Though I feel like there are some web things that don't always run well on an iPad; Safari on iPad doesn't support everything you'd expect from a web-based site. There are things we use, like VDO.Ninja to record podcasts, which uses WebRTC, and sometimes for things like Zencastr you have to use dedicated apps, because iPadOS is so locked down, and multitasking isn't great on iPadOS. But yeah, if you're not actually doing that much and you just want a nice media device, an iPad is a good option too. Alright, thank you so much, Jake Thompson.
That's a good one too, because I wanna hear about people moving on from Chromebooks. Send us more emails at podcast@engadget.com. Let's skip right past what we're working on, 'cause we're all busy with stuff, unless you wanna mention anything. Karissa, anything you're working on at the moment?
Karissa: The only thing I wanna flag is that we are rapidly approaching another TikTok sale-or-ban deadline next month.
    Speaker: Sure.
Karissa: It's been a while since we heard anything about that, but I'm sure they're hard at work trying to hammer out this deal.
Ben: Okay, but that's actually more relevant now, because they just maybe figured out the tariff situation, and the tariff was the thing that spoiled the first deal. So we'll see what happens at the beginning of July...
Karissa: I think the deadline's the 19th of June.
Ben: Oh, at the beginning of June. Sorry.
Karissa: Yeah, so it's pretty close. And there has not been much that I've heard on that front, so...
Devindra: This is where we are; we're just walking from one broken negotiation to another for the next couple of years.
Anything you wanna mention that's pop culture related, Karissa, that is taking your mind off our broken world?
Karissa: So this is a weird one, but my husband loves Stargate, and we have been working through it for years: the movie, the TV shows, Stargate SG-1...
Devindra: Oh God.
Karissa: We're just on the last few episodes now, in the endgame portion of that show.
I spent years making fun of this, and making fun of him for watching it, but that show's...
Devindra: Ridiculously bad, but yeah. Yeah.
Karissa: Everything is so bad now that it's actually just a nice distraction to watch something so silly.
Devindra: That's heartwarming, actually, because it is a throwback to when things were simpler. You could just make dumb TV shows and they would last for 24 episodes per season.
Ben: And for how many seasons, Karissa?
Karissa: 10 seasons.
Devindra: They just go on forever. Yeah. My local lamb-and-rice place, the spot that does essentially New York street-cart style food, plays Stargate SG-1 every time I'm in there. And I'm sitting there watching it like, how did we survive with this? How did we watch this show? It's because we just didn't have that much; we were desperate for genre fiction. But okay, that's heartwarming, Karissa. Have you guys done Farscape? No? Have you seen Farscape?
'Cause Farscape is a very similar type of show, but it has Jim Henson puppets and better writing. I love Jim Henson. It's very cool. And unlike Stargate, it also dares to be, I don't know, sexy and violent too. Stargate always felt too campy to me. But Farscape was great.
I bought it on iTunes, so that was a deal; I dunno if that deal is still there, but the entire series plus the post-series stuff is all out there. Shout out to Farscape, shout out to Stargate SG-1, and simpler times. I'll just really briefly run down a few things: Andor season two finished over the last week.
Incredible stuff. As I said in my initial review, it is really cool to see people watching this thing and just being blown away by it. And I will say the show brought me to tears at the end, and I did not expect that, because we know this guy's gonna die. We know his fate, and yet it still means so much, and it's so well written. The show is a phenomenon.
Karissa, I'd recommend it to you when you guys are recovering from the Stargate SG-1 loss; Andor is fantastic. I also checked out a bit of Murderbot, the Apple TV+ adaptation of the Martha Wells books. It's fine. It is funny and entertaining, because Alexander Skarsgård is a fun person to watch in genre fiction.
But it also feels like this could be funnier, this could be better produced; you could be doing more with this material, and it feels lazy at times too. Still, it's a fine distraction if you are into half-baked sci-fi. Another recommendation for you Stargate SG-1 lovers, Karissa: Final Destination Bloodlines.
    I reviewed over at the film Cast and I love this franchise. It is so cool to see it coming back after 15 years. This movie is incredible. Like this movie is great. If you understand the final destination formula, it's even better because it plays with your expectations of the franchise. I love a horror franchise where there's no, no definable villain.
    You're just trying to escape death. There's some great setups here. This is a great time at the movies. Get your popcorn. Just go enjoy the wonderfully creative kills. And shout out to Zach Lipovsky and Adam B. Stein, who apparently were listening to my other podcast and are now making good movies.
    So that's always a fun thing to see. Final Destination Bloodlines: a much better film than Mission: Impossible - The Final Reckoning. My review of that is on the website now too. You can read that on Engadget.
    Ben: Thanks everybody for listening. Our theme music is by game composer Dale North. Our outro music is by our former managing editor, Terrence O'Brien. The podcast is produced by me, Ben Ellman. You can find Karissa online at
    Karissa: karissab on Threads, Bluesky, and sometimes still X.
    Ben: Unfortunately, you can find Devindra online
    Devindra: At devindra on Bluesky, and I also podcast about movies and TV at The Filmcast, at thefilmcast.com.
    Ben: If you really want to, you can find me at heybellman on Bluesky. Email us at podcast@engadget.com. Leave us a review on iTunes and subscribe on anything that gets podcasts. That includes Spotify.

    Engadget Podcast: The AI and XR of Google I/O 2025
    Would you believe Google really wants to sell you on its AI? This week, we dive into the news from Google I/O 2025 with Engadget's Karissa Bell. We discuss how Gemini is headed to even more places, as well as Karissa's brief hands-on with Google's prototype XR glasses. It seems like Google is trying a bit harder now than it did with Google Glass and its defunct Daydream VR platform. But will the company end up giving up again, or does it really have a shot against Meta and Apple?

    Subscribe! iTunes | Spotify | Pocket Casts | Stitcher | Google Podcasts

    Topics

    Lots of AI and a little XR: Highlights from Google I/O 2025 – 1:15
    OpenAI buys Jony Ive's design company for $6.5B, in an all-equity deal – 29:27
    Fujifilm's X Half could be the perfect retro camera for the social media age – 39:42
    Sesame Street is moving from HBO to Netflix – 44:09
    Cuts to IMLS will lead to headaches accessing content on apps like Libby and Hoopla – 45:49
    Listener Mail: Should I replace my Chromebook with a Mac or PC laptop? – 48:33
    Pop culture picks – 52:22

    Credits
    Hosts: Devindra Hardawar and Karissa Bell
    Producer: Ben Ellman
    Music: Dale North and Terrence O'Brien

    Transcript

    Devindra: What's up, internet, and welcome back to the Engadget Podcast. I'm Senior Editor Devindra Hardawar. I'm joined this morning by Senior Writer Karissa Bell. Hello, Karissa.

    Karissa: Hello. Good morning.

    Devindra: Good morning. And also podcast producer Ben Elman. Hey Ben. I muted my dang self. Hello. Hello, Ben. Good morning. It's been a busy week. It's one of those weeks where three major conferences happened all at once, of varying relevance to us. Google I/O is the big one. We'll be talking about that with Karissa, who was there and got to demo Google's XR glasses. But also Computex was happening over in Taipei, and we got a lot of news from that too. We'll mention some of those things. Also, Microsoft Build happened, and I feel like this was the least relevant Build to us ever.
I got one bit of news I can mention there. That's pretty much it. It's been a crazy hectic week for us over at Engadget. As always, if you're enjoying the show, please feel free to subscribe to us on iTunes or your podcast catcher of choice. Leave us a review on iTunes, drop us an email at podcast@engadget.com. Those emails, by the way, if you ask a good question, could end up being part of our Ask Engadget section, so that's something we're starting up. I have another good one I'll be throwing to Ask Engadget soon. So send us your emails: podcast@engadget.com. Google I/O. It's all about AI, isn't it? I feel like, Karissa, we were watching the keynote for this thing and it felt like it went on and on. We all pretty much expected more about Gemini AI, more about their newer models, a bit about XR. Can you give me your overall impression of I/O at this point? Karissa: Yeah, it's interesting, because I've been covering I/O long enough that I remember back when it used to be Android. And then there'd be that little section at the end about AI and some of the other stuff. And now it's completely reversed, where it's entirely AI and basically no Android, to the point where they had a whole separate event with their typical Android stuff the week before, so they didn't have to go through and talk about any of the mobile things. Devindra: That was just a live stream, a chill live stream. No real effort put into it. Whereas this is the whole show. They had, who was it? They had Toro y Moi. They had actual music, which is something a lot of these folks do at keynotes. It's actually really disconcerting to see cool musicians taking the corporate gig and performing at one of these things. I think it was like 2013, 2014, maybe the Intel one, IDF or something, but The Weeknd was there, just trying to jam for all these nerds, and it was sad. But yeah.
How was the experience, Karissa, like actually going there? Karissa: Yeah, it was good. That keynote is always kind of a slog, just live blogging for almost two hours straight. It's a lot. I did like the music. Toro y Moi was very chill. It was a nice way to start. I much preferred it over the crazy Loop Daddy set we got last year, if anyone remembers that. Devindra: Yeah. Ben: Yeah. Oh, I remember that. Marc Rebillet was at I/O. That was so weird. Devindra: Yeah. Yeah, it was a little intense. Cool. So what are some of the highlights? There's a bunch of stuff. If you go look on the site on Engadget, we have rounded up all the major news, and that includes a couple of things like, hey, an AI Mode chatbot coming to Search. That's cool. I think the thing a lot of people were looking at was Project Astra and where that's gonna be going. And that is the sort of universal AI assistant where you could hold your phone up and just ask it questions about the world. We got another demo video about that, which, again, the actual utility of it, I'm weirded out by. There was also one video where they were just like, I'm gonna pretend I'm very stupid and ask Astra, what is this tall building in front of me? And it was like a fire hydrant or something. It was some piece of street furniture. It was not a really well done demo. Do you have any thoughts about that, Karissa? Does that seem more compelling to you now, or is it the same as what we saw last year? Karissa: I think what was interesting to me about it was that we saw Astra last year, and I think there was a lot of excitement around that, but it wasn't really entirely clear where that project is going. They've said it's an experimental research thing. And then I feel like this year they really laid out that they want to bring all that stuff to Gemini. Astra is sort of their place to tinker with this and get all this stuff working.
But their end game is putting this into Gemini. You can already see it a little bit in Gemini Live, which is their multimodal feature where you can do some version of what Astra can do. And so that was interesting. They're saying, we want Gemini to be this universal AI assistant. They didn't use the word AGI or anything like that, but I think it's pretty clear where they're going and what their ambition is. They want this to be an all-seeing, all-knowing AI assistant that can help you with anything, is what they're trying to sell it as. Devindra: It is weird. We're watching the demo video and it's a guy trying to fix his bike, and he's pointing his phone at the bike and asking questions like, which particular nut do I need for this tightening thing? And it's giving him good advice. It's pointing to things on YouTube. I don't know how useful this will actually be. This kind of goes to part of the stuff we're seeing with AI too: offloading some of the grunt work of human intelligence. Because you can do this right now; people have been YouTubing to fix things forever. YouTube has become this information repository of just fix-it stuff, or home plumbing, or whatever. And now you'll be able to talk to your phone, and it'll direct you right to those videos or extract the actual instructions from them. That's cool. I feel like that's among the more useful things, more useful than putting Gemini right into Chrome, which is another thing they're talking about, and I don't know how useful that is other than they wanna push AI in front of us, just like Microsoft wants to push Copilot in front of us at all times. Ben: What is a situation where you would have a question about your Chrome tabs? I'm not one of those people that has 15 Chrome tabs open at any given time, and I know that I am... yeah, I know. Wait, you're saying that like it's high? Like it's high.
Yeah, no, I know. So I have an abnormally low number of Chrome tabs open, but can you still come up with an idea of why you would ask Gemini anything about your own tabs? Hopefully you have them organized, at least. Karissa: They showed a few examples, like online shopping. Maybe you have two tabs of two different products open, and you can say... Devindra: Exactly. Karissa: ...ask Gemini to compare the reviews. Or they used the example of a recipe blog, and maybe you wanna make some kind of modification, make the recipe gluten-free. And you could ask Gemini, hey, how would I make this gluten-free? But I think you're right, it's not exactly clear. You can already just open a new tab and go to Gemini and ask it something. So they're just trying to reduce... Devindra: Friction. I think that's the main thing. The less you have to think about it, the more it's in your face. You can always just jump right to it. It's like how you can Google search from your URL bar, your location bar, in any browser. We've just grown to use that, but that didn't used to be the case. I remember there used to be a separate Google field in some browsers, and it wasn't always there in every browser either. They did announce some new models. We saw there's Gemini 2.5 Pro. There's a Deep Think reasoning model. There's also a Flash model that they announced for smaller devices. Did they show any good demos of the reasoning stuff? Because that's essentially slower AI processing to hopefully get you better answers with fewer flaws. Did they actually show how that worked, Karissa? Karissa: I only saw what we all saw during the keynote, and I think we've seen a few other AI companies do something similar, where you can see it think, like its reasoning process, and see it do that in real time. But I think it's a bit unclear exactly what that's gonna look like.
Devindra: Watching a video: oh, Gemini can simulate nature, simulate light, simulate puzzles, turn images into code. Ben: I feel like the big thing, yeah, a lot of this stuff is from DeepMind, right? This is DeepMind, an Alphabet company. Devindra: DeepMind, an Alphabet company. This is Deep Think, and don't confuse this with DeepSeek, which is the Chinese AI company, and they clearly knew what they were doing when they called it that. But no, yeah, this is partially stuff coming out of DeepMind, a company which Google has been doing stuff with for a while, and we just have not really seen much out of it. So I guess Gemini and all their AI processes are a way to do that. We also saw something that got a lot of people... Ben: We saw a Nobel Prize from them. Come on. Devindra: Hey, we did see that. What does that mean? What is that even worth anymore? That's an open question. They also showed off a new video tool called Flow, which I think got a lot of people intrigued, because it's using a new Veo 3 model, an updated version of what they've had for video generation for a while. And the results look good. The video looks higher quality. Humans look more realistic. The interesting thing about Veo 3 is it can also do synchronized audio, actually produce audio and dialogue for people too. So people have been uploading videos around this stuff online at this point, and you have to subscribe to the crazy high-end version of Google's subscription to even test out this thing. That is the AI Ultra plan that costs $249.99 a month. But I saw something like, yeah, here's a pretend tour of a make-believe car show. And it was just people spouting random facts: yeah, I like EVs, I would like an EV. And it looks realistic, they sound synchronized, like you could think, this is a normal person. Then they just kinda start laughing at the end for no reason. Weird little things.
It's like if you see a sociopath try to pretend to be a human for a little bit. There's real Patrick Bateman vibes from a lot of those things. So I don't know. It's fun. It's cool. Ben: Didn't they announce that they also had a tool to help you figure out whether or not a video was generated by Flow? Devindra: They did announce that too. Karissa: Yeah, SynthID. They've been working on that for a while. They talked about it last year at I/O. That's their digital watermarking technology. And the funny thing about this is, their whole concept of AI watermarking is you put these invisible watermarks into AI-generated content. You couldn't just see it watching this content, but you can go to this website now and basically double-check if it has one of these watermarks. On one hand, I think it's important that they do this work, but I also just wonder how many people are gonna see a video and think, I wonder what kind of AI is in this? Let me go to this other website and double-check it. Ben: Yeah. The people who are most likely to immediately believe it are also the least likely to go to the website and be like, I would like to double-check this. Devindra: It doesn't matter, because most people will not do it and the damage will be done. Just having super hyper-realistic AI video, you can essentially make anything happen. It's funny that the big bad AI villain in the new Mission Impossible movies, the Entity, one of the main things it does is, oh, we don't know what's true anymore, because the Entity can just fabricate reality at whim. We're just doing that for, I don't know, for fun. I feel like this is a thing we should see in all AI video tools. This doesn't really answer the question that everyone's having, though: what is the point of these tools?
Because it does devalue filmmaking. It devalues people using actual actors, or going out and actually shooting something. Did Google make a better pitch for why you would use Flow, Karissa, or how it would fit into actual filmmaking? Karissa: I'm not sure they did. They showed that goofy Darren Aronofsky trailer for some woman who was trying to make a movie about her own birth, and it seemed like it was trying to be in the style of some sort of psychological thriller, but it just felt really weird to me. I was just like, what are we watching? Yeah. Ben: Was there any good backstory about why she was doing that, or was it just, hey, we're doing something really weird? Karissa: No, she was just, oh, I wanna tell the story of my own birth. And okay. Ben: Okay, but why is your birth more important? I need more details. Everybody wants to write a memoir, like, one of three ways or something. Devindra: Yeah, it's like everybody who wants to write a memoir. It's kinda the same thing, kinda that same navel-gazing thing. The project's just called Ancestral. I'm gonna play a bit of the trailer here. I remember seeing this. It reminds me of seeing Look Who's Talking for the very first time, or those movies where they showed a lot of things about how babies are made. And as a kid I was like, how'd they make that? How'd that get done? They're doing that now with AI video in Ancestral, this whole project. It is kinda sad, because Aronofsky is one of my favorite directors. When he is on, he has made some of my favorite films. But also he's a guy who has admittedly stolen ideas and concepts from people like Satoshi Kon, specific framing of scenes and things like that.
Shots in Requiem for a Dream are in some Kon movies as well. So I guess it's to be expected, but it is sad, because Hollywood as a whole, the unions certainly, do not like AI video. There was a story about James Earl Jones' voice being used as Darth Vader in Fortnite. In Fortnite, yeah. Which is something we knew was gonna happen, because Disney licensed the rights to his voice from his estate before he died. He went in and recorded lines to at least create a better simulation of his voice. But people are going out there making that Darth Vader swear and say bad things in Fortnite, and the WGA, or is it SAG? It's probably SAG. The unions are pissed off about this, because they did not know this was happening ahead of time, and they're worried about what this could mean for the future of AI talent. Flow looks interesting. I keep seeing people play with it. I made a couple videos, asked it to, hey, show me three cats living in Brooklyn with a view of the Manhattan skyline, or something. And it did that, but the apartment it rendered didn't look fully real. It had weird heating things all around. And also, apparently, if you just subscribe to the basic plan to get access to Flow, you can use Flow, but that's using the Veo 2 model, the older AI model. To get Veo 3, again, you have to pay $249.99 a month. So maybe that'll come down in price eventually, but we shall see. The thing I really want to talk with you about, Karissa, is what the heck is happening with Android XR? That is a weird project for them, because I was writing up the news and they announced a few things. They were like, hey, we have a new developer release to help you build Android XR apps. But it wasn't until the actual I/O show that they showed off more of what they were actually thinking about. And you got to test out a pair of prototype Google XR glasses powered by Android XR.
Can you tell me about that experience, and how does it differ from the other XR things you've seen? You've seen Meta's, you saw one from Snap, right? Karissa: I've seen Snap's, yeah. I've seen the Xreal ones, some of the other smaller companies I got to see at CES. Yeah, that was a bit of a surprise. I know that they've been talking about Android XR for a while. I feel like it's been a little more in the background. So they brought out these glasses, and the first thing that I noticed about them was that they were actually pretty small and normal looking compared to Meta Orion or the Snap Spectacles. These were very thin, which was cool. But the display was only on one side. It was only on one lens. They called it a monocular display. So there's one lens on one side, so it's basically just a little window, very small field of view. Devindra: We can see it if you go to the picture on top of Karissa's hands-on piece. You can see the frame out of what that lens would be. Yeah. Karissa: Yeah. And I noticed, even when we were watching that demo video that they did on stage, that the field of view looked very small. It was even smaller than Snap's, which is 35 degrees. If I had to guess, I'd say it's maybe around 20. They wouldn't say what it was. They said, this is a prototype, we don't wanna say. The way I compared it in my piece was like the front screen on a foldable phone. You can get notifications and you can glance at things, but it's not fully immersive AR. It's not surrounding your space and really changing your reality in the way that Snap and Meta are trying to do. Later, when I was driving home, I realized a better comparison might be the heads-up display in your car. Speaker: Yeah. Yeah.
Karissa: If you have a car that has that little HUD where you can see how fast you're going, and directions, and stuff like that. Devindra: That's what Google Glass was doing too, right? Because that was a little thing off to the side of your vision. It was never a full takeover-your-vision type of thing. Karissa: Yeah. It's funny, that's what our editor Aaron said when he was editing my piece. He was like, oh, this sounds like Google Glass. And I'm like, no, it actually is better than that. These are normal looking glasses. I tried Google Glass many years ago. The fidelity was better here. Actually, I was thinking it feels like a happy medium almost between Meta Ray-Bans and full AR. I've had Meta Ray-Ban glasses for a long time, and people always ask me, when I show them to someone, they're like, oh, that's so cool. And then they go, but you can see stuff, right? There's a display? And I'm like, no, these are just glasses with a speaker. And I feel like this might be a good kind of in-between thing, because you have a little bit of display, but they still look like glasses. They're not bulky, 'cause they're not trying to do too much. One thing I really liked is that when you take a photo, you actually get a little preview of that image that floats onto the screen, which was really cool, because it's hard to figure out how to frame pictures when you're using the camera on your smart glasses. So I think there's some interesting ideas, but it's very early. Obviously they want Gemini to be a big part of it. The Gemini stuff was busted in my demo. Devindra: You also said they don't plan on selling these. It's purely, hey, this is what could be a thing. But they're not selling these specific glasses, right? Karissa: Yeah, these specific ones are a research prototype. But they did also announce a partnership with Warby Parker and another glasses company.
So I think you can see them trying to take a Meta approach here, which actually would be pretty smart: let's partner with a known company that makes glasses. They're already popular. We can give them our tech expertise, they can make the glasses look good, and maybe we'll get something down the line. I actually heard a rumor that the prototype was manufactured by Samsung. They wouldn't say. Devindra: Of course. Samsung wants to be all over this. Samsung is the one building the full-on Android XR headset, which is a sort of Vision Pro copycat. It is Moohan. Yeah, Moohan. It is displays with the pass-through camera. That should be coming later this year. Go ahead, Ben. Ben: Yeah, question for Karissa. When Sergey Brin was talking about Google Glass, did that happen before or after the big demo for the Google XR glasses? Karissa: That was after. That was at the end of the day. He was a surprise guest in this fireside chat with the DeepMind CEO. And yeah, we were all wondering about that, 'cause we all, Dev probably remembers this very well, when Google Glass came out and Sergey skydived wearing them into I/O. Speaker: Yep. Karissa: And then, now, for him to come back and say, we made a lot of mistakes with that product. Ben: But was it mistakes, or was it just the fact that the technology was not there yet? Because he was talking about the consumer electronics supply chain, blah, blah, blah. Devindra: He's right that the tech has caught up with the vision of what they wanted to do, but also I think he fundamentally misread that people will see you looking like the goddamn Borg and want to destroy you. You will turn into Captain Picard and be like, I must destroy whoever is wearing Google Glass, because this looks like an alien trying to take over my civilization.
And the thing that Meta did right, that you've seen, Karissa, is make 'em look like normal glasses. Yeah, but nobody knows... Ben: Karissa does not look entirely human in this picture either. Karissa: Yes. But listen, if you see 'em straight on, they look transparent. I used that photo because I was trying to... Devindra: You get the angle, show the display. Karissa: Yeah. Devindra: Yeah. There's another one where this looks normal. This looks totally normal. The glasses themselves look like typical hipster glasses. There's not a super big frame around them. The arms seem big, the arms seem wider than a typical pair of glasses, but you wouldn't know that, 'cause it's covered by your hair. A lot of people won't notice glasses arms as much. Ben: Yeah. Devindra: That is cool. Ben: The issue still is that all of these frames are so chunky. And it's because you need to hide all of the internals and everything, but you're not gonna get the beautiful, thin, Japanese titanium look anytime soon. No, because this stuff needs to shrink way more. Devindra: That's not the kind of frames they are. I will say, I had a meeting with, I believe, the CEO of Xreal. I did talk to them at CES, so they had a lot of ideas about that. I talked to the head of Spacetop, which is the company that was doing the sort of AR laptop thing. And then they gave up on that idea, because AI PCs have the NPUs that they need to do that stuff. And they're all in on the idea that more people will want to use these sorts of glasses. Maybe not all the time, but for specific use cases. Something that covers your field of vision more could be a great thing when you sit down at your desk. I could see people doing this. I could see people getting these glasses. I don't know if it's gonna be good for society, right?
It feels like when Bluetooth headsets were first popping up and everybody hated those people, and you're like, oh, we must shun this person from society. With this one, you can't quite see the screen, so you can pretend to be a normal human and then have this augmented ability next to you. If they can actually hide the fact that you have a display on your glasses, that would help people like me, who are face blind, and I walk around like, I know this person, I've seen them before, what is their name? What is their name? I could see that being useful. Ben: On the other side of it, though, if you have one standard look for glasses like this, then you know, oh, this person is also interacting with information and stuff that's popping up in front of their eyes. It's a universal signifier, just like having a big pair of headphones is. Devindra: I think you will see people looking off into the distance. Karissa, did you notice that your eye line was moving away from people you were talking to while you were wearing these? Karissa: Yeah, and that was also one of the issues that I had: the actual display didn't quite render right. I'm not a farsighted person, but I actually had to look farther off in the distance to get my eyes to focus on it. And I asked them about that, and they're like, oh, it's a prototype, it's not quite dialed in. They weren't calibrating these things to your eyeballs, the way when I did the Meta Orion demo they had to take these specific measurements, because there's eye tracking and all these things, and this didn't have any of that. So yeah, there definitely was that thing where somebody's talking to you, but you're looking over here. Devindra: That's not great. That's not great for society. You're having a conversation with people. I like how they're framing this: oh yes, you can be more connected with reality,
'cause you don't have a phone in front of your face, except you always have another display in front of your face, which nobody else can see, and you're gonna look like an alien walking around. They showed some videos of people using it for street navigation, which I kinda like. You're in a new city, you'll see the arrows and where to turn and stuff. That's useful. But there was one that was really overwrought. It was a couple dancing at sunset, and the guy is like, take a picture of this beautiful moment of the sun peeking through behind my lady friend. And it just felt like, is that what you wanna do in that moment? You wanna talk to your virtual assistant while you should be enjoying the fact that you are having this beautiful dancing evening, which nobody will ever actually have. So that's the whole thing. I will say, my overall thoughts on this stuff, just looking at what they showed before they actually showed us the glasses: it doesn't feel like Google is actually that far along in terms of making this a reality, Karissa. I'm comparing it to where Meta is right now, and even where Apple is right now. When Apple showed us the Vision Pro, we were able to sit down, and I had a 30-minute demo of that thing working, and I saw the vision of what they were doing. How long was your demo with this thing? Karissa: I was in the room with them for about five minutes, and I had them on for about three minutes myself. Devindra: That's not a demo. That's not a demo. Ben: Oh, goodness. So all of these pictures were taken in the same 90 seconds? Karissa: Yes. Yeah. Ben: God. That's amazing. Devindra: It's amazing you were able to capture these impressions, Karissa. Karissa: Yeah, I will say that they did apparently have a press event in December where people got to see these things for a lot longer, but they could not shoot them at all.
A lot of us were wondering if that was why it was so constrained. They only had one room, there's hundreds of people basically lining up to try these out, and they're very strict. You got five minutes; somebody's in there after a couple minutes rushing you out, and we're like, okay. Devindra: They clearly only have a handful of these. That's the main reason this is happening. This is the company that did Google Glass, and that was too early and also maybe too ambitious. But also don't forget Google Cardboard, which was a fun little project of getting phone-based VR happening. Daydream VR, which was their self-contained headset, which was cool. That was when Samsung was doing the thing with Oculus at the time. And they gave up on those things completely. And Google's not a company I trust with consumer hardware in general. So I don't think there is a huge future in Android XR, but they wanna be there. They wanna be where Meta is and where Apple is, and we shall see. Anything else you wanna add about I/O, Karissa? Karissa: No, just that AI. AI, AI, AI. Devindra: AI I/O. Starline, the thing that was a weird 3D-rendered teleconferencing booth, is becoming a real thing. That's turning into Google Beam. But it's gonna be an enterprise thing. They're teaming up with HP to bring a scaled-down version of that to businesses. That's one of those things where it's like, oh, this exists in some corporate offices who will pay for this thing, but normal people will never interact with it, so it practically just does not exist. So we shall see. Anyway, stay tuned. We're gonna have more demos of the Gemini stuff. We'll be looking at the new models, and certainly Karissa and I will be looking hard at Android XR and wherever the heck that's going. Let's quickly move on to other news.
And I just wanna say there were other events. Computex: we wrote up a whole bunch of laptops, AMD announced a cheaper Radeon graphics card; go check out our stories on that stuff. Build: I wrote one story. I got a 70-page book of news from Microsoft about Build, and 99 percent of that news just does not apply to us, because Build is so fully a developer coding conference. Hey, there's more Copilot stuff. There's a Copilot app coming to Microsoft 365 subscribers, and that's cool, but not super interesting. I would say the big thing that happened this week, and what surprised a lot of us, is the news that OpenAI has bought Jony Ive's design startup for six and a half billion dollars. This is a wild story, which is also paired with a weird picture. It looks like they're getting married. It looks like they're announcing their engagement over here, because Jony Ive is just leaning into him. Their heads are touching a little bit. It's so adorable. Ben: You're not showing the full website, though. The full website has, like, a script font. It literally looks like something from The Knot. Devindra: It does, yeah. Let's look here. "Sam and Jony introduce io." "This is an extraordinary moment. Computers are now seeing, thinking, understanding." Please come to our ceremony at this coffee shop. For some reason, they also produced this coffee shop video to really show this thing off, and it is wild to me. Let me pull this up over here. Ben: While we're doing that: Karissa, what do you have to say about this? Karissa: I'm trying to remember. So I know this is Jony Ive's AI thing, because he also has LoveFrom, which is still... Devindra: This is LoveFrom. So, let me get the specifics of the deal out here. As part of the deal, Ive and his design studio LoveFrom, is it LoveFrom or Love From? LoveFrom, yeah, are gonna work independently of OpenAI.
But Scott Cannon, Evans Hankey, and Tang Tan, who co-founded io. This is another io. I hate these. Karissa: Yeah, so io is his AI-focused design thing, and then LoveFrom is like his design... Devindra: Studio thing. Karissa: Sure. Yeah. Devindra: He has two design things. Karissa: I'm trying to remember what they've done. I remember there was a story about how they made a really expensive jacket with some weird buttons or something. Devindra: Yep, I do remember that. Karissa: I was just trying to rack my brain for what Jony Ive has really done in his post-Apple life. I feel like we haven't... Devindra: He's made billions of dollars, of course. That's what's happened. Because he is still an independent man, clearly; he's an independent contractor. But the other side of io, which includes those folks, they will become OpenAI employees alongside 50 other engineers, designers, and researchers. They're gonna be working on AI hardware. It seems like Jony Ive will come in with ideas, but this is not quite a marriage. He's not quite committing. He's just taking the money and being like, eh, you can have part of my AI startup for six and a half billion dollars. Ben: Less, you know, taxes. It's all equity, though, so this is all paper money. Six and a half billion dollars of OpenAI's crazy valuation, and who knows how much it's actually going to be worth. But all these people are going to sell a huge chunk of stock as soon as OpenAI goes public anyway. So it's still gonna be an enormous amount of money. Devindra: Let me see here. OpenAI has raised $57.9 billion of funding over 11 rounds. Good lord. Yeah. So anyway, a big chunk of that is going to this thing, because I think what happened is that Sam Altman clearly just wants to be Steve Jobs. I think that's what's happening here.
And all of you, go look at the announcement video for this thing, because it is one of the weirdest things I've seen. It is Jony Ive walking through San Francisco, Sam Altman walking through San Francisco with his hands in his pockets. There's a whole lot of setup to these guys meeting in a coffee shop, and then they sit there at the coffee shop like normal human beings, and then have an announcement video talking to nobody. They're just talking to the middle of the coffee bar. I don't know who they're addressing. Sometimes they refer to each other, and sometimes they refer to camera, but they're never looking at the camera. This is just a really wild thing. Also, yet another thing that makes me believe I don't think Sam Altman is a real human boy. I think there is actually something robotic about this man, because I can't see him actually performing in real life. What they're gonna do, they reference in vagaries. That's all it is. We don't know what exactly is happening. There is a quote from Jony Ive, and he says, quote, "the responsibility that Sam shares is honestly beyond my comprehension," end quote. Responsibility of what? Just building this giant AI thing? Sam Altman, for humanity. Yeah, for humanity. Just unlocking expertise everywhere. Sam Altman says he has some sort of AI device and it's changed his life. We don't know what it is. We dunno what they're actually working on. They announced nothing here. But Jony Ive is very happy, because he has just made billions of dollars. He's not getting all of that money, but I think he's very pleased with this arrangement. And Sam Altman seems pleased that, oh, the guy who designed the iPhone and the MacBook can now work for me. And Jony Ive also says the work here at OpenAI is the best work he's ever done. Sure, you'd say that. Sure. Karissa: Sure. What do you think Apple thinks about all this?
Devindra: Yeah... Karissa: Their AI program is flailing, and their star designer, who, granted, separated from Apple a while ago, is now teaming up with Sam Altman for some future computing AI hardware, when they can't even get AI Siri to work. That must be a gut punch for folks. Maybe on the other side of it, though... Ben: Yeah, I don't think it's sour grapes to say: are they going into the, like... Friend, and Friend isn't even out yet, but like the Humane Pin? Or any of the other AI-sidekick sort of things? That has already crashed and burned spectacularly twice. Devindra: I think Apple maybe dodged a bullet here, because I think the only reason Jony Ive is working on this thing is because OpenAI had put some money into LoveFrom or io years ago too. So they already had some sort of collaboration, and he's just like, okay, people are interested in AI; what sort of beautiful AI device can I build? The thing is, Jony Ive unchecked as a designer leads to maddening things like the Magic Mouse that charges from the bottom... Karissa: The butterfly keyboard. Devindra: Yeah, the butterfly keyboard. Beautiful, but not exactly functional. Jony Ive always worked best when he had the opposing force of somebody like a Steve Jobs who could be like, no, this idea is crazy, or rein it in, or be more functional. Steve Jobs: not a great dude in many respects, but at the very least he was able to hone in on product ideas and think a lot about how humans use products. I don't think Jony Ive on his own can do that. I don't think Sam Altman can do that, because this man can barely sit and have a cup of coffee like a human being. So whatever this is... honestly, Karissa, I feel like Apple has dodged a bullet, because this is jumping into the AI gadget trend. Apple just needs to get the software right, because they have the devices, right? We're wearing Apple Watches.
People have iPhones, people have MacBooks. What they need to do is solidify the infrastructure, the AI smarts, between all those devices. They don't need to go out and sell a whole new device. This just feels like, OpenAI is a new company, and they can try to make an AI device a thing. I don't think it's super compelling, but let us know, listeners, if any of this lands. Listen to this chat of them talking about nothing. Unlocking human greatness, unlocking expertise, just through AI, through some AI gadget. I don't quite buy it. I think it's kind of garbage, but yeah. Ben: Anything else you guys wanna say about this? This is coming from the same guy who, when he was asked in an interview what college students should study, said "resilience." Karissa: Yeah. I just think all these companies want to make the thing that's the next iPhone. Devindra: Yes. Karissa: So they can all just stop relying on Apple. It's the thing that Mark Zuckerberg has with all of their hardware projects. Which, by the way, one of the stories said that the Jony Ive thing has maybe been working on some kind of earbuds with cameras on them, which sounded very similar to a thing that Meta has been rumored to be working on for a long time. And also Apple... Devindra: There were rumors about AirPods with... Karissa: Cameras. Yeah. And everyone's just, I think, trying to make the thing that's not an iPhone but will replace our iPhones. Good luck to them. Devindra: Good luck to that, because I think that is coming from a fundamentally broken purpose. The whole reason for doing that is just to try to outdo the iPhone. I was thinking about this: how many companies are like Apple, which was printing money with iPods and would just be like, hey, we actually have a new thing, and this will entirely kill our iPod business. This new thing will destroy the existing business that is working so well for us. Not many companies do that.
That's the innovator's dilemma that comes back and bites companies in the butt. That's why Sony held off so long on jumping into flat-screen TVs: because they were the world's leader in CRTs, in Trinitron, and they're like, we're good. We're good into the nineties. And then they completely lost the TV business. That's why Toyota was so slow to EVs, because they're like, hybrids are good to us. Hybrids are great. We don't need an EV for a very long time. And then they released an EV where the wheels fell off. So it comes for everybody. I dunno, I don't believe in these devices. Let's talk about something that could be cool. Something that is a little unrealistic, I think, but for a certain aesthetic it is cool. Fujifilm announced the X Half today. It is a digital camera with an analog film aesthetic. It shoots in a three-by-four portrait aspect ratio, that's the Instax Mini ratio, and it looks like an old-school Fuji camera. This thing is pretty wild because of the screen; it's only making those portrait videos. One of the key selling points is that it can replicate some things you get from film: there's a light-leak simulation for when you overexpose film a little bit, halation... Ben: And that's something Fujifilm is known for. Devindra: Yes. They love these film simulation modes. This is such a social media kid camera, especially for the people who cannot afford the Fujifilm compact cameras. Wow, even the screen is vertical. Ben: Do you wanna take some vertical photographs for your social media? Because vertical video has completely won. Devindra: And it can take video, but it is just a simplistic little device. It has that, what do you call that? That latch that you hit to wind film. So you can put it into a film photography mode where you don't see anything on the screen. You have to use the viewfinder to take pictures, and it starts a countdown.
You could tell it to do, like, a film roll's number of pictures, and you have to click through to take your next picture. It's the winder: you can wind to the next picture. You can combine two portrait photos together. It's really cool. It's really cute. It's really unrealistic, I think, for a lot of folks. But hey, social media kids, influencers, the people who love to shoot stuff for social media and vertical video: this could be a really cool little device. What do you guys think about this? Karissa: You know what this reminds me of? Do you remember in the early Instagram days, when there were all these apps, like Hipstamatic, that tried to emulate film aesthetics? And some of them would do these same things, where you would take the picture but you couldn't see it right away, 'cause it had to develop. And they even had a light-leak thing. And I'm like, now we've come full circle, where the camera companies are basically taking that, or just doing their own spin on it. Devindra: It only took them 15 years to really jump on this trend. But yes, everybody was trying to emulate classic cameras, and Fuji was like, oh, you want things that cost more but do less? Got it. That's the Fujifilm X Half. And I think this thing will be a huge success. What you're talking about, Karissa: there is a mode where, yeah, you won't see the picture immediately. It has to "develop" in their app, and then you will see it eventually. That's cool, honestly. I love this. I would not want it to be my main camera, but I would love to have something like this to play around with, when you could just be a little creative and pretend to be a street photographer for a little bit. Oh man, this would be huge in Brooklyn. Ben: Tom Rogers says "cute, but stupid tech." I think that's the perfect summary. Devindra: And I would say, compare this to the AI thing, which is just like...
What is this device? What are you gonna do with it? It feels like a lot of nothing. Whereas this is a thing you hold, it takes cool pictures, and you share it with your friends. It is such a precise thing, even though it's very expensive for what it is. I would say, if you're intrigued by this, you can get cheap compact cameras, get used cameras; I only ever buy refurbished cameras. You don't necessarily need this. Karissa: But having a Fujifilm camera is a status symbol anyway. So I don't know. It's $850, which still seems a little steep for a little toy camera, basically. But also, I see it and I'm like, ooh, that looks nice. Devindra: Yeah. It's funny, the PowerShots that kids are into now from the two thousands, those used to cost like 200 to 300 bucks, and I thought, oh, that is a big investment in a camera. Then I stepped up to the Sony mirrorless cameras, which were like 500 to 600 or so. I'm like, okay, this is a bigger step up than even that. Most people would be better off with a mirrorless, but also those things are bigger than this tiny little pocket camera. I dunno, I think I'm enamored with this whole thing. Also, briefly, in other news: we saw that apparently Netflix is the one that is jumping in to save Sesame Street, and Sesame Street will air on Netflix and PBS simultaneously. That's a good thing, because there was previously a delay when HBO was in charge. Ben: Oh really? Devindra: Yeah. They would get the new episodes, and, I forget how long the delay actually was, but it would be a while before new stuff hit PBS. I don't love that so much of our entertainment and pop culture now relies on streamers for everything, and the big media companies are just disappointing us. But this is a good move. I think Sesame Street should stick around, especially with federal funding being killed left and right for public media like this. This is a good thing.
Sesame Street is still good. My kids love it. When my son starts leaning into his Blippi era, I just kinda slowly tune that out. Here's some Sesame Street. I got him into Pee-wee's Playhouse, which is the original Blippi. I'm like, yes, let's go back to the source, because Pee-wee was a good dude. And that show still holds up. That show is so much fun. A great introduction to camp for kids. A great introduction to diverse neighborhoods, too, just like Sesame Street. Ben: Pee-wee, or Mr. Rogers, was doing it before, I think. Devindra: Mr. Rogers was doing it really well too. But Pee-wee was always something special, because Pee-wee's wild. Laurence Fishburne was on Pee-wee. There's just a lot of cool stuff happening there. Looking back at it now as an adult, it is a strange thing to watch, but anyway, great to hear that Sesame Street is back. Another thing, not so quick... Ben: Yeah, let me do this one, if I may. So if you have any trouble getting audiobooks on Libby or Hoopla or any of the other interlibrary loan systems that you can access on your phone or iPad or any tablet, that's because of the US government. A while ago, the Trump administration passed yet another executive order saying that they wanted to cut a bunch of funding to the Institute of Museum and Library Services, the IMLS, and they're the ones who help "circulate," big quotation marks there, just because it's digital files, all of these things from interlibrary loans, so you can get the audiobooks that you want. The crazy thing about this is that the IMLS was created in 1996 by a Republican-controlled Congress. What's the deal here, guys? There's no waste, fraud, and abuse here. But if you have problems getting audiobooks, you can tell a friend, or if anybody's complaining about why their library selection on Libby went down by a lot recently, now you have the answer. Devindra: It is truly sad.
A lot of what's happening is just to reduce access to information, because, hey, a well-informed population is dangerous to anybody in charge, right? Terrible news. Let's move on to stuff that's happening around Engadget. I wanna quickly shout out that Sam Rutherford has reviewed the Asus ROG Flow Z13. This is the sort of Surface-like device, and this one has the Ryzen AI Max chip. Sam seems to like it. He gave it a 79, which is right below the threshold we have for recommending new products, because this thing is expensive. You're paying a lot of money to essentially get a gaming tablet. But I tested it out at CES, and it is cool that it actually worked. For a certain type of person with too much money who just needs the lightest gaming thing possible, I could see it being compelling. Let's see, what is the starting price for a gaming tablet? Sam says it costs the same or more as a comparable ROG Zephyrus G14 with a real RTX 5070. That is a great laptop, the ROG Zephyrus G14; we have praised that laptop so much. So this is not really meant for everybody. Asus likes to do these experiments. They're getting there in terms of creating a gaming tablet, but it's not quite something I'd recommend for everybody at this point. All right, we have a quick email from a listener too. Thank you for sending this in, Jake Thompson. If you wanna send us an email, it's podcast@engadget.com, and again, your emails may head into our Ask Engadget section. Jake asks: he's a real estate agent in need of a new laptop. He uses a Chromebook right now, and it meets every need he has. Everything he does is web-based, but should he consider alternatives to a premium Chromebook for his next computer? He says he doesn't mind spending that much or more if he can get something lightweight and trustworthy with solid battery life. What would we consider in the search?
Devindra: I would immediately point Jake to our laptop guides, because literally everything we mention there, the MacBook Air, the Asus Zenbook S 14, even the Dell XPS 13, would be not much more than that price, and I think more useful than a premium Chromebook, because I think the idea of a premium Chromebook is insanity. I don't know why you're spending so much money for a thing that can only do web apps. Cheap Chromebooks, mid-range Chromebooks: fine, even great. But if you're spending that much money, you want something that's more reliable, that you could do more with. Even if everything you're doing is web-based, there may be other things you wanna do. A MacBook or a Windows laptop, there is so much more you can unlock there. A little bit of gaming, a little bit of media creation. I don't know, Karissa, Ben, do you have any thoughts on this? What would you recommend, or would you guys be fine with the Chromebook? Karissa: I like Chromebooks. My first thought, and maybe this is too out there, but would an iPad Pro fit those requirements? 'Cause you can do a lot with an iPad Pro. Devindra: You can do a lot. That's actually... Karissa: Great battery, lightweight, lots of apps. If most everything he's doing is web-based, you can probably use iPad apps. Devindra: That's actually a good point, Karissa. You can do a lot with an iPad, and the iPad Pro does start at around this price too. It would be much lighter and thinner than a laptop, especially if you're doing a lot of web stuff. I feel like there are some web things that don't always run well on an iPad, though; Safari on iPad doesn't support everything you'd expect from a web-based site. There are things we use, like VDO.Ninja to record podcasts, and that's using WebRTC. Sometimes for things like Zencastr you have to go use apps, because iPadOS is so locked down.
Multitasking isn't great on iPadOS. But yeah, if you're not actually doing that much and you just want a nice media device, an iPad is a good option too. All right, thank you so much, Jake Thompson. That's a good one, because I wanna hear about people moving on from Chromebooks. Send us more emails at podcast@engadget.com, for sure. Let's just skip right past what we're working on, 'cause we're all busy with stuff, unless you wanna mention anything. Karissa, anything you're working on at the moment? Karissa: The only thing I wanna flag is that we are rapidly approaching another TikTok sale-or-ban deadline. Devindra: Yes. Next month. Karissa: It's been a while since we heard anything about that, but I'm sure they're hard at work on trying to hammer out this deal. Ben: Okay, but that's actually more relevant, because they just figured out maybe the tariff situation, and the tariff was the thing that spoiled the first deal. So we'll see what happens at the beginning of July. Karissa: I think the deadline's the 19th of June. Ben: Oh, at the beginning of June. Sorry. Karissa: Yeah, so it's pretty close. And there has been not much that I've heard on that front. Devindra: So this is where we are. We're just walking from one broken negotiation to another for the next couple years. Anything you wanna mention, pop-culture related, Karissa, that is taking your mind off of our broken world? Karissa: So this is a weird one, but my husband loves Stargate, and we have been going for years through the movie, the TV shows, Stargate SG-1... Devindra: Oh God. Karissa: And we're just on the last few episodes now, in the endgame portion of that show. I spent years making fun of this, making fun of him for watching it... Devindra: That show's ridiculously bad, but yeah. Karissa: Everything is so bad now that it's actually just a nice...
Distraction, to just watch something so silly. Devindra: That's heartwarming, actually, because it is a throwback to when things were simpler. You could just make dumb TV shows, and they would last for 24 episodes per season. Ben: For how many seasons, too, Karissa? Karissa: 10 seasons. Devindra: You just go on forever. Yeah. My local lamb-and-rice place, my local place that does essentially New York street-cart-style food, they play Stargate SG-1. Every time I'm in there, I'm sitting there watching it, like, how did we survive with this? How did we watch this show? It's because we just didn't have that much. We were desperate for genre fiction. But okay, that's heartwarming, Karissa. Have you guys done Farscape? No? Have you seen Farscape? 'Cause Farscape is a very similar type of show, but it has Jim Henson puppets and it has better writing. Karissa: I love Jim Henson. Devindra: It's very cool. Unlike Stargate, it also dares to be, I don't know, sexy and violent too. Stargate always felt too campy to me. But Farscape was great. I bought that on iTunes, so that was a deal. I dunno if that deal is still there, but the entire series plus the post-series stuff is all out there. Shout out to Farscape. Shout out to Stargate SG-1. Simpler times. I'll just really briefly run down a few things. Andor season two finished over the last week. Incredible stuff. As I said in my initial review, it is really cool to see people watching this thing and just being blown away by it. And I will say, the show brought me to tears at the end, and I did not expect that, because we know this guy's gonna die. We know his fate, and yet it still means so much, and it's so well written, and the show is a phenomenon. Karissa, I'd recommend it to you when you guys are recovering from your Stargate SG-1 loss. Andor is fantastic.
I also checked out a bit of Murderbot, the Apple TV+ adaptation of the Martha Wells books. It's fine. It is funny and entertaining, because Alexander Skarsgård is a fun person to watch in genre fiction. But it also feels like this could be funnier, this could be better produced; you could be doing more with this material, and it feels just lazy at times too. But it's a fine distraction if you are into half-baked sci-fi. So, another recommendation for you Stargate SG-1 lovers. Karissa: Final Destination Bloodlines? Devindra: Final Destination Bloodlines, which I reviewed over at the Filmcast, and I love this franchise. It is so cool to see it coming back after 15 years. This movie is incredible. This movie is great. If you understand the Final Destination formula, it's even better, because it plays with your expectations of the franchise. I love a horror franchise where there's no definable villain. You're just trying to escape death. There are some great setups here. This is a great time at the movies. Get your popcorn. Just go enjoy the wonderfully creative kills. And shout out to Zach Lipovsky and Adam B. Stein, who apparently were listening to my other podcast and are now making good movies. That's always a fun thing to see. Final Destination Bloodlines: a much better film than Mission: Impossible, The Final Reckoning. My review of that is on the website now too. You can read that on Engadget. Ben: Thanks, everybody, for listening. Our theme music is by game composer Dale North. Our outro music is by our former managing editor, Terrence O'Brien. The podcast is produced by me, Ben Ellman. You can find Karissa online at... Karissa: karissab on Threads, Bluesky, and sometimes still X. Ben: Unfortunately, you can find Devindra online... Devindra: At devindra on Bluesky, and also podcasting about movies and TV at the Filmcast, at thefilmcast.com. Ben: If you really want to, you can find me at heybellman on Bluesky. Email us at podcast@engadget.com.
Leave us a review on iTunes and subscribe on anything that gets podcasts. That includes Spotify. This article originally appeared on Engadget.
    Engadget Podcast: The AI and XR of Google I/O 2025
Would you believe Google really wants to sell you on its AI? This week, we dive into the news from Google I/O 2025 with Engadget's Karissa Bell. We discuss how Gemini is headed to even more places, as well as Karissa's brief hands-on with Google's prototype XR glasses. It seems like Google is trying a bit harder now than it did with Google Glass and its defunct Daydream VR platform. But will the company end up giving up again, or does it really have a shot against Meta and Apple? Subscribe! iTunes | Spotify | Pocket Casts | Stitcher | Google Podcasts. Topics: Lots of AI and a little XR: Highlights from Google I/O 2025 – 1:15; OpenAI buys Jony Ive's design company for $6.6B, in an all-equity deal – 29:27; Fujifilm's $850 X Half could be the perfect retro camera for the social media age – 39:42; Sesame Street is moving from HBO to Netflix – 44:09; Cuts to IMLS will lead to headaches accessing content on apps like Libby and Hoopla – 45:49; Listener Mail: Should I replace my Chromebook with a Mac or PC laptop? – 48:33; Pop culture picks – 52:22. Credits: Hosts: Devindra Hardawar and Karissa Bell; Producer: Ben Ellman; Music: Dale North and Terrence O'Brien. Transcript: Devindra: [00:00:00] What's up, internet, and welcome back to the Engadget Podcast. I'm Senior Editor Devindra Hardawar. I'm joined this morning by Senior Writer Karissa Bell. Hello, Karissa. Karissa: Hello. Good morning. Devindra: Good morning. And also podcast producer Ben Ellman. Hey Ben. Ben: I muted my dang self. Hello. Devindra: Hello, Ben. Good morning. It's been a busy week. It's one of those weeks where three major conferences happened all at once, with varying relevance to us. Google I/O is the big one. We'll be talking about that with Karissa, who was there and got to demo Google's XR glasses. But also, Computex was happening over in Taipei, and we got a lot of news from that too; we'll mention some of those things. Also, Microsoft Build happened, and I feel like this was the least relevant Build to us ever.
I got one bit of news I can mention there, and that's pretty much it. It's been a crazy hectic week for us over at Engadget. As always, if you're enjoying the show, please feel free to subscribe to us on iTunes or your podcast catcher of choice, leave us a review on iTunes, and drop us an email at podcast@engadget.com. [00:01:00] Those emails, by the way: if you ask a good question, it could end up being part of our Ask Engadget section, so that's something we're starting up. I have another good one I'll be throwing to Ask Engadget soon. So send us your emails, podcast@engadget.com. Google I/O: it's all about AI, isn't it? Karissa, we were watching the keynote for this thing, and it felt like it went on and on. We all pretty much expected more about Gemini AI, more about their newer models, a bit about XR. Can you give me your overall impression of I/O at this point? Karissa: Yeah, it's interesting, because I've been covering I/O long enough that I remember back when it used to be Android, and then there'd be that little section at the end about AI and some of the other stuff. And now it's completely reversed, where it's entirely AI and basically no Android, to the point where they had a whole separate event with their typical Android stuff the week before, so they didn't have to go through and talk about any of the mobile things. Devindra: That was just a live stream, a chill live stream. No real [00:02:00] effort put into it. Whereas this is the whole show. They had, who was it? They had Toro y Moi. Toro y Moi, yeah. They had actual music, which is something a lot of these folks do at keynotes. It's actually really disconcerting to see cool musicians taking the corporate gig and performing at one of these things. I think it was like 2013, 2014, maybe the Intel one, IDF or something, but The Weeknd was there, just trying to jam for all these nerds, and it was sad. But yeah.
How was the experience, Karissa, like actually going there? Karissa: Yeah, it was good. That keynote is always kind of a slog; just live blogging for almost two hours straight, just constant, is a lot. I did like the music. Toro y Moi was very chill. It was a nice way to start. I much preferred it over the crazy Loop Daddy set we got last year, if anyone remembers that. Devindra: Yeah. Ben: Yeah, oh, I remember that. Marc Rebillet was at I/O. That was so weird. Devindra: Yeah, it was a little intense. Cool. So what are some of the highlights? There's a bunch of stuff. If you go look on the site on Engadget, we [00:03:00] have rounded up all the major news, and that includes a couple of things like, hey, an AI Mode chatbot coming to Search. That's cool. I think the thing a lot of people were looking at was Project Astra and where that's gonna be going. That is the sort of universal AI assistant where you could hold your phone up and just ask it questions about the world. We got another demo video about that, and again, I'm weirded out by the actual utility of it. There was also one video where they were just like, I'm gonna be dumb, I'm gonna pretend I'm very stupid and ask Astra, what is this tall building in front of me? And it was like a fire hydrant or something. It was some piece of street thing. It was not a really well done demo. Do you have any thoughts about that, Karissa? Does that seem more compelling to you now, or is it the same as what we saw last year? Karissa: I think what was interesting to me about it was that we saw Astra last year, and I think there was a lot of excitement around that, but it wasn't really entirely clear where that project was going. They've said it's like an experimental research thing. And then I feel like this year they really laid out that they want to [00:04:00] bring all that stuff to Gemini.
Astra is sort of their place to tinker with this and get all this stuff working, but their end game is putting this into Gemini. You can already see it a little bit in Gemini Live, which is their multimodal feature where you can do some version of what Astra can do. And so that was interesting. They're saying, we want Gemini to be this universal AI assistant. They didn't use the word AGI or anything like that, but I think it's pretty clear where they're going and what their ambition is. They want this to be an all-seeing, all-knowing AI assistant that can help you with anything, is what they're trying to sell it as.
Devindra: It is weird. We're watching the demo video, and it's a guy trying to fix his bike, and he's pointing his phone at the bike and asking questions like, which particular nut do I need for this tightening thing? And it's giving him good advice. It's pointing to things on YouTube. I don't know how useful this will actually be. This kind of goes to part of the stuff we're seeing with AI too, of just offloading [00:05:00] some of the grunt work of human intelligence, because you can do this right now. People have been YouTubing to fix things forever. YouTube has become this information repository of just fix-it stuff or home plumbing or whatever. And now you'll be able to talk to your phone, and it'll direct you right to those videos, or extract the actual instructions from them. That's cool. I feel like that's among the more useful things, more useful than putting Gemini right into Chrome, which is another thing they're talking about, and I don't know how useful that is, other than they wanna push AI in front of us, just like Microsoft wants to push Copilot in front of us at all times.
Ben: What is a situation where you would have a question about your Chrome tabs?
Like, I'm not one of those people that has 15 Chrome tabs open at any given time, and I know that I am...
Devindra: Wait, you're saying that like it's a high number.
Ben: Yeah, no, I know. So I have an abnormally low number of Chrome tabs open, but can you still come up [00:06:00] with an idea of why you would ask Gemini anything about your own tabs? Hopefully you have them organized, at least.
Karissa: They showed a few examples, like online shopping. Maybe you have two tabs of two different products open, and you can say...
Devindra: Exactly.
Karissa: ...ask Gemini to compare the reviews. Or they used the example of a recipe video, a recipe blog, and maybe you wanna make some kind of modification, make the recipe gluten-free, and you could ask Gemini, hey, how would I make this gluten-free? But I think you're right, it's not exactly clear. You can already just open a new tab and go to Gemini and ask it something. So they're just trying to reduce friction.
Devindra: I think that's the main thing. The less you have to think about it, the more it's in your face, and you can always just jump right to it. It's like, hey, you can Google search from your URL bar, your location bar, in any browser. We've just grown to use that, but that didn't used to be the case. I remember there used to be a separate Google field in some browsers, and it wasn't always there in every browser, too. They did announce some new models. We [00:07:00] saw there's Gemini 2.5 Pro. There's a Deep Think reasoning model. There's also a Flash model that they announced for smaller devices. Did they show any good demos of the reasoning stuff? Because that's essentially slower AI processing to hopefully get you better answers with fewer flaws. Did they actually show how that worked, Karissa?
Karissa: I only saw what we all saw during the keynote, and I think, we've seen a few other AI companies do something similar, where you can see it think, see its reasoning process, and see it do that in real time. But I think it's a bit unclear exactly what that's gonna look like.
Devindra: Watching a video: oh, Gemini can simulate nature, simulate light, simulate puzzles, turn images into code.
Ben: I feel like the big thing, yeah, a lot of this stuff is from DeepMind, right? DeepMind, an Alphabet company.
Devindra: DeepMind, an Alphabet company. There is DeepMind, and this is Deep Think. Don't confuse this with DeepSeek, which is the Chinese AI company, and they [00:08:00] clearly knew what they were doing when they called it that. But no, yeah, this is partially stuff coming out of DeepMind, a company Google has been doing stuff with for a while, and we just have not really seen much out of it. So I guess Gemini and all their AI processes are a way to do that. We also saw something that got a lot of people...
Ben: We saw a Nobel Prize from them. Come on.
Devindra: Hey, we did see that. What does that mean? What is that even worth anymore? That's an open question. They also showed off a new video tool called Flow, which I think got a lot of people intrigued, because it's using a new Veo 3 model, an updated version of what they've had for video generation for a while. And the results look good. The video looks higher quality, humans look more realistic. The interesting thing about Veo 3 is it can also do synchronized audio, so it can actually produce audio and dialogue for people too. People have been uploading videos around this stuff online at this point, and you have to [00:09:00] subscribe to the crazy high-end version of Google's subscription to even test this thing out at this point. That is the AI Ultra plan that costs $250 a month.
But I saw something like, yeah, here's a pretend tour of a make-believe car show, and it was just people spouting random facts. So, yeah, I like EVs. I would like an EV. And it looks realistic, the audio is synchronized, you could think these are normal people. Then they just kind of start laughing at the end for no reason, weird little things. It's like watching a sociopath try to pretend to be a human for a little bit. There are real Patrick Bateman vibes from a lot of those things. So I don't know. It's fun. It's cool, I think.
Ben: Didn't they also announce a tool to help you figure out whether or not a video was generated by Flow?
Devindra: They did announce that too.
Karissa: Yeah, SynthID. They've been working on that for a while. They talked about it last year at I/O. That's their digital watermarking technology. And the funny thing about this is, [00:10:00] the whole concept of AI watermarking is you put these invisible watermarks into AI-generated content. You couldn't just see it watching the content, but you can go to this website now and basically double-check if it has one of these watermarks. On one hand, I think it's important that they do this work, but I also just wonder how many people are gonna see a video and think, I wonder what kind of AI is in this? Let me go to this other website and double-check it.
Ben: Yeah. The people who are most likely to immediately believe it are also the least likely to go to the website and be like, I would like to double-check this.
Devindra: It doesn't matter, because most people will not do it, and the damage will be done. Just having super hyper-realistic AI video, you can essentially make anything happen.
It's funny that the big bad AI guy in the new Mission: Impossible movies, the Entity, one of the main things it does is, oh, we don't know what's true anymore, because the Entity can just fabricate reality at whim. We're just doing that. [00:11:00] We're just doing that for, I don't know, for fun. I feel like this is a thing we should see in all AI video tools. This doesn't really answer the question that everyone's having, though, which is, what is the point of these tools? Because it does devalue filmmaking. It devalues using actual actors, or going out and actually shooting something. Did Google make a better pitch for why you would use Flow, Karissa, or how it would fit into actual filmmaking?
Karissa: I'm not sure they did. They showed that goofy Darren Aronofsky trailer for some woman who was trying to make a movie about her own birth, and it seemed like it was trying to be in the style of some sort of psychological thriller, but it just felt really weird to me. I was just like, what are we watching? What are we watching?
Ben: Was there any good backstory about why she was doing that, or was it just, hey, we're doing something really weird?
Karissa: No, she was just, oh, you know what? I wanna tell the story of my own birth. Okay.
Ben: [00:12:00] Okay, but why is your birth more important? I need more details. It's like everybody who wants to write a memoir or something.
Devindra: Yeah, it's like everybody who wants to write a memoir. It's kind of that same navel-gazing thing. The project's just called Ancestra. I'm gonna play a bit of the trailer here.
I remember seeing this. It reminds me of that footage, I dunno if you guys remember seeing Look Who's Talking for the very first time, or those movies where they showed a lot of things about how babies are made, and as a kid I was like, how'd they make that? How'd that get done? They're doing that now with AI video in this whole project. It is kind of sad, because Aronofsky is one of my favorite directors when he is on. He has made some of my favorite films, but he's also a guy who has admittedly stolen ideas and concepts from people like Satoshi Kon. Specific framings of scenes in Requiem for a Dream are in some of Kon's movies as well. So [00:13:00] I guess it's to be expected, but it is sad, because Hollywood as a whole, the unions certainly, do not like AI video. There was a story about James Earl Jones' voice being used as Darth Vader in Fortnite. In Fortnite. In Fortnite, yeah. Which is something we knew was gonna happen, because Disney licensed the rights to his voice from his estate before he died. He went in and recorded lines to at least create a better simulation of his voice. But people are going out there making that Darth Vader swear and say bad things in Fortnite, and the WGA, or is it SAG? It's probably SAG-AFTRA. The unions are pissed off about this, because they did not know this was happening ahead of time, and they're worried about what this could mean for the future of AI talent. Flow looks interesting. I keep seeing people play with it. I made a couple videos. I asked it, hey, show me three cats living in Brooklyn with a view of the Manhattan skyline, or something. And it did that, but the apartment it rendered didn't look fully real. [00:14:00] It had weird heating things all around. And also, apparently, if you just subscribe to the basic plan to get access to Flow, you can use Flow, but that's using the Veo 2 model, the older AI model.
To get Veo 3, again, you have to pay $250 a month. So maybe that'll come down in price eventually, but we shall see. The thing I really want to talk with you about, Karissa, is, what the heck is happening with Android XR? That is a weird project for them, because I was writing up the news and they announced a few things. They were like, hey, we have a new developer release to help you build Android XR apps. But it wasn't until the actual I/O show that they showed off more of what they were actually thinking about, and you got to test out a pair of prototype Google XR glasses powered by Android XR. Can you tell me about that experience? And how does it differ from the other XR things you've seen? You've seen Meta's, you saw one from Snap, right?
Karissa: I've seen Snap, yeah. I've seen the Xreal ones, some of the other smaller [00:15:00] companies I got to see at CES. Yeah, that was a bit of a surprise. I know they've been talking about Android XR for a while. I feel like it's been a little more in the background. So they brought out these glasses, and the first thing I noticed about them was that they were actually pretty small and normal-looking compared to Meta Orion or the Snap Spectacles. These were very thin, which was cool. But the display was only on one side, only on one lens. They called it a monocular display. So there's one lens on one side, and it's basically just a little window, a very small field of view.
Devindra: We can see it if you go to the picture on top of Karissa's hands-on piece. You can see the frame out of what that lens would be.
Karissa: Yeah. And I noticed, even when we were watching that demo video they did on stage, that the field of view looked very small. It was even smaller than Snap's, which is 35 degrees. If I had to guess, I'd say it's maybe around 20.
They wouldn't say what it was. They said, this is a prototype, we don't wanna say. The way I thought about it, the way [00:16:00] I compared it in my piece, was like the front screen on a foldable phone. You can get notifications and you can glance at things, but it's not fully immersive AR. It's not surrounding your space and really changing your reality in the way that Snap and Meta are trying to do. Later, when I was driving home, I realized a better comparison might be the heads-up display in your car.
Speaker: Yeah. Yeah.
Karissa: If you have a car that has that little HUD where you can see how fast you're going and directions and stuff like that.
Devindra: That's what Google Glass was doing too, right? Because that was a little thing off to the side of your vision. It was never a full takeover-your-vision type of thing.
Karissa: Yeah. It's funny, that's what our editor Aaron said when he was editing my piece. He was like, oh, this sounds like Google Glass. And I'm like, no, it actually is better than that. These are normal-looking glasses. I tried Google Glass many years ago. The fidelity was better. Actually, I was thinking it feels like a happy medium almost between Meta Ray-Bans and full AR. I've had Meta Ray-Ban glasses [00:17:00] for a long time, and people always ask me about them. When I show them to someone, they're like, oh, that's so cool. And then they go, but you can see stuff, right? There's a display? And I'm like, no, these are just glasses with a speaker. And I feel like this might be a good in-between thing, because you have a little bit of display, but they still look like glasses. They're not bulky, 'cause they're not trying to do too much.
One thing I really liked is that when you take a photo, you actually get a little preview of that image that floats onto the screen, which was really cool, because it's hard to figure out how to frame pictures when you're shooting with the camera on your smart glasses. So I think there are some interesting ideas, but it's very early. Obviously they want Gemini to be a big part of it. The Gemini stuff was busted in my demo.
Devindra: You also said they don't plan on selling these. These are purely a, hey, this is what could be a thing. They're not selling these specific glasses, right?
Karissa: Yeah, these specific ones are a research prototype. But they did also announce a partnership with Warby Parker and another glasses company. So I think you can see them trying to take a Meta approach here, which [00:18:00] actually would be pretty smart: let's partner with a known company that makes glasses that are already popular. We can give them our tech expertise, they can make the glasses look good, and maybe we'll get something down the line. I actually heard a rumor that the prototype was manufactured by Samsung. They wouldn't say.
Devindra: Of course, Samsung wants to be all over this. Samsung is the one building the full-on Android XR headset, which is sort of a Vision Pro copycat. That's Project Moohan. It's displays with pass-through cameras. That should be coming later this year. Go ahead, Ben.
Ben: Yeah, question for Karissa. When Sergey Brin was talking about Google Glass, did that happen before or after the big demo for the Google XR glasses?
Karissa: That was after. That was at the end of the day. He was a surprise guest in a fireside chat with the DeepMind CEO. And yeah, we were all wondering about that, 'cause, Dev probably remembers this very well, when Google Glass came out, Sergey skydived [00:19:00] wearing them into I/O.
Speaker: Yep.
Karissa: And now for him to come back and say, we made a lot of mistakes with that product...
Ben: But was it mistakes, or was it just the fact that the technology was not there yet? Because he was talking about the consumer electronics supply chain, blah, blah, blah.
Devindra: He's right that the tech has caught up with the vision of what they wanted to do. But also, I think he fundamentally misread that people will see you looking like the goddamn Borg and want to destroy you. You will turn into Captain Picard and be like, I must destroy whoever is wearing Google Glass, because this looks like an alien trying to take over my civilization. And the thing that Meta did right, as you've seen, Karissa, is make 'em look like normal glasses, and nobody knows.
Ben: Karissa does not look entirely human in this picture either.
Karissa: Yes. But listen, if you see 'em straight on, they look transparent. I used that photo because I was trying to...
Devindra: You get the angle, show the display.
Karissa: Yeah.
Devindra: [00:20:00] Yeah. There's another one where this looks normal, this looks totally normal. The glasses themselves look like typical hipster glasses. There's not a super big frame around them. The arms seem wider than a typical pair of glasses, but you wouldn't know that, 'cause it's covered by your hair. A lot of people won't notice glasses arms as much.
Ben: Yeah.
Devindra: That is cool.
Ben: The issue still is that all of these frames are so chunky. And it's because you need to hide all of the internals and everything, but you're not gonna get the beautiful, thin Japanese titanium anytime soon. No, because this stuff needs to shrink way more.
Devindra: That's just the kind of frames they are. I will say I had a meeting with, I believe, the CEO of Xreal, who...
well, I did talk to them at CES, so they had a lot of ideas about that. I talked to the head of Spacetop, which is [00:21:00] the company that was doing the sort of AR laptop thing. And then they gave up on that idea, because AI PCs have the NPUs that they need to do that stuff. And they're all in on the idea that more people will want to use these sorts of glasses, maybe not all the time, but for specific use cases. Something that covers your field of vision more could be a great thing when you sit down at your desk. I could see people doing this. I could see people getting these glasses. I don't know if it's gonna be good for society, right? It feels like when Bluetooth headsets were first popping up and everybody hated those people, and you're like, oh, we must shun this person from society. With this one, you can't quite see the screen, so you can pretend to be a normal human and then have this augmented ability next to you. If they can actually hide the fact that you have a display on your glasses, that would help people like me who are face-blind, and I walk around like, I know this person, I've seen them before, what is their name? I could see that being useful.
Ben: On the other side of it, [00:22:00] though, if you have one standard look for glasses like this, then you know, oh, this person is also interacting with information and stuff that's popping up in front of their eyes. It's a universal signifier, just like having a big pair of headphones is.
Devindra: I think you will see people looking off into the distance. Karissa, did you notice that your eye line was moving away from people you were talking to while you were wearing these?
Karissa: Yeah, and that was also one of the issues that I had: the actual display didn't quite render right.
I'm not a farsighted person, but I actually had to look farther off in the distance to get my eyes to focus on it. And I asked them about that, and they're like, oh, it's a prototype, it's not quite dialed in. They weren't calibrating these things to your eyeballs, the way that when I did the Meta Orion demo, they had to take these specific measurements, because there's eye tracking and all these things. This didn't have any of that. So yeah, there definitely was that thing where somebody's talking to you, but you're looking over here.
Devindra: That's not great. That's [00:23:00] not great for society. You're having a conversation with people. I like how they're framing this: oh yes, you can be more connected with reality, 'cause you don't have a phone in front of your face. Except you always have another display in front of your face, which nobody else can see, and you're gonna look like an alien walking around. They showed some videos of people using it for street navigation, which I kind of like. You're in a new city, you'll see the arrows and where to turn and stuff. That's useful. But there was one that was really overwrought. It was a couple dancing at sunset, and the guy is like, take a picture of this beautiful moment of the sun peeking through behind my lady friend. And it just felt like, is that what you wanna do in that moment? You wanna talk to your virtual assistant when you should be enjoying the fact that you're having this beautiful dancing evening, which nobody will ever actually have. So that's the whole thing. I will say, my overall thoughts on this stuff, just looking at the stuff they showed before they actually showed us the glasses: it doesn't feel like Google is actually that far along in terms of making this a reality, Karissa. I'm comparing it to where Meta [00:24:00] is right now, and even where Apple is right now. When Apple showed us the Vision Pro,
we were able to sit down, and I had a 30-minute demo of that thing working, and I saw the vision of what they were doing, and they had thought a lot about it. How long was your demo with this thing?
Karissa: I was in the room with them for about five minutes, and I had them on for about three minutes myself.
Devindra: That's not a demo. That's not a demo.
Ben: Oh, goodness. So all of these pictures were taken in the same 90 seconds?
Karissa: Yes. Yeah.
Ben: God. That's amazing.
Devindra: It's amazing you were able to capture these impressions, Karissa.
Karissa: Yeah, I will say that they did apparently have a press event in December where people got to see these things for a lot longer, but they could not shoot them at all. A lot of us were wondering if that was why it was so constrained. They only had one room, there were hundreds of people basically lining up to try these out, and they were very strict. You got five minutes, and somebody's in there after a couple minutes rushing you out, and we're like, okay.
Devindra: They clearly only have a handful of these. That's the main reason this is happening. This is the company [00:25:00] that did Google Glass, and that was too early and also maybe too ambitious. But also don't forget Google Cardboard, which was a fun little project of getting phone-based VR happening. Daydream VR, which was their self-contained headset, which was cool. That was when Samsung was doing the thing with Meta as well, or with Oculus at the time. And they gave up on those things completely. And Google's not a company I trust with consumer hardware in general. So I don't think there is a huge future in Android XR, but they wanna be there. They wanna be where Meta is and where Apple is, and we shall see. Anything else you wanna add about I/O, Karissa?
Karissa: No, just that AI. AI, AI, AI.
Devindra: AI, I/O. There was also Starline.
The thing that was a weird 3D-rendering teleconferencing video booth is becoming a real thing. That's turning into Google Beam, but it's gonna be an enterprise thing. They're teaming up with HP to bring a scaled-down version of that to businesses. I don't think we'll ever really see it. That's one of those things where it's, oh, this exists [00:26:00] in some corporate offices that will pay $50,000 for this thing, but normal people will never interact with it, so it practically just does not exist. So we shall see. Anyway, stay tuned. We're gonna have more demos of the Gemini stuff, we'll be looking at the new models, and certainly Karissa and I will be looking hard at Android XR and wherever the heck that's going. Let's quickly move on to other news. I just wanna say there were other events. Computex: we wrote up a whole bunch of laptops, and AMD announced a cheaper Radeon graphics card. Go check out our stories on that stuff. Build: I wrote one story. I got a 70-page book of news from Microsoft about Build, and 99% of that news just does not apply to us, because Build is so fully a developer coding conference. Hey, there's more Copilot stuff. There's a Copilot app coming to Microsoft 365 [00:27:00] subscribers, and that's cool, but not super interesting. I would say the big thing that happened this week, and that surprised a lot of us, is the news that OpenAI has bought Jony Ive's design startup for six and a half billion dollars. This is a wild story, which is also paired with a weird picture. It looks like they're getting married. It looks like they're announcing their engagement over here, because Jony Ive is just leaning into him. Their heads are touching a little bit. It's so adorable.
Ben: You're not showing the full website, though. The full website has a script font. It literally looks, yeah, like something from The Knot.
Devindra: It does, doesn't it? Yeah. Let's look at it here: Sam and Jony introduce io.
This is an extraordinary moment. Computers are now seeing, thinking, understanding. Please come to our ceremony at this coffee shop. For some reason, they also produced this coffee shop video to really show this thing off, and it is wild to me. Let me pull this up over here.
Ben: While we're doing that, Karissa, what do you [00:28:00] have to say about this?
Karissa: I'm trying to remember. So I know this is Jony Ive's AI thing, because he also has LoveFrom, which is still...
Devindra: This is LoveFrom. Well, let me get the specifics of the deal out here. As part of the deal, Ive and his design studio LoveFrom are gonna work independently of OpenAI. But Scott Cannon, Evans Hankey, and Tang Tan, who co-founded io, this is another io, I hate these names...
Karissa: Yeah, so io is his AI-focused design thing, and then LoveFrom is like his design studio thing.
Devindra: Sure. He has two design things.
Karissa: I'm trying to remember what they've done. I remember there was a story about them making a really expensive jacket with some weird buttons or something.
Devindra: Yep, I do remember that.
Karissa: I was just trying to rack my brain about what Jony Ive has really done in his post-Apple life. I feel like we haven't...
Devindra: He's made billions of dollars, of course. That's what's happened. [00:29:00] Because he is still an independent man, clearly, he's an independent contractor. But the other side of io, which includes those folks, will become OpenAI employees, alongside 50 other engineers, designers, and researchers. They're gonna be working on AI hardware. It seems like Jony Ive will come in with ideas, but this is not quite a marriage. He's not quite committing.
He's just taking the money and being like, here, you can have part of my AI startup for six and a half billion dollars.
Ben: Lord knows how the taxes work on that. It's all equity, though, so this is all paper money. Six and a half billion dollars of OpenAI's crazy valuation, and who knows how much it's actually going to be worth. But all these people are going to sell a huge chunk of stock as soon as OpenAI goes public anyway, so it's still gonna be an enormous amount of money.
Devindra: Let me see here. OpenAI has raised $57.9 billion of funding over 11 rounds. [00:30:00] Good Lord. Yeah. So anyway, a big chunk of that is going to this thing, because I think what happened is that Sam Altman clearly just wants to be Steve Jobs. I think that's what's happening here. And all of you, go look at the announcement video for this thing, because it is one of the weirdest things I've seen. It is Jony Ive walking through San Francisco, Sam Altman walking through San Francisco with his hands in his pockets. There's a whole lot of setup to these guys meeting in a coffee shop, and then they sit there at the coffee shop like normal human beings, and then have an announcement video talking to nobody. They're just talking to the middle of the coffee bar. I don't know who they're addressing. Sometimes they refer to each other, and sometimes they refer to camera, but they're never looking at the camera. This is just a really wild thing. Also, yet another thing that makes me believe Sam Altman is not a real human boy. I think there is actually something robotic about this man, because I can't see him actually performing in real life. [00:31:00] What they're gonna do, they reference in vagaries. That's all it is. We don't know what exactly is happening. There is a quote from Jony Ive, and he says, quote, the responsibility that Sam shares is honestly beyond my comprehension, end quote.
Responsibility for what? Just building this giant AI thing? Sam Altman, for humanity. Yeah, for humanity. Just unlocking expertise everywhere. Sam Altman says he has some sort of AI device and it's changed his life. We don't know what it is. We don't know what they're actually working on. They announced nothing here. But Jony Ive is very happy, because he has just made billions of dollars. He's not getting all of that money, but I think he's very pleased with this arrangement. And Sam Altman seems pleased that, oh, the guy who designed the iPhone and the MacBook can now work for me. And Jony Ive also says the work here at OpenAI is the best work he's ever done. Sure, you'd say that. Sure.
Karissa: Sure. What do you think Apple thinks about all this?
Devindra: Yeah.
Karissa: Their AI [00:32:00] program is flailing, and their star designer, who, granted, separated from Apple a while ago, is now teaming up with Sam Altman for some future-computing AI hardware, while they can't even get AI Siri to work. That must be a gut punch for folks there. Maybe on the other side of it, though...
Ben: Yeah, I don't think it's sour grapes to say: are they going into the same space as, like, Friend? Friend isn't even out yet, but the Humane Pin? Or any of the other AI-sidekick sort of things? That idea has already crashed and burned spectacularly twice.
Devindra: I think Apple maybe dodged a bullet here, because the only reason Jony Ive is working on this thing is that OpenAI had put some money into LoveFrom or io years ago too. So they already had some sort of collaboration, and he's just like, okay, people are interested in AI, what sort of [00:33:00] beautiful AI device can I make? The thing is, Jony Ive unchecked as a designer leads to maddening things like the Magic Mouse that charges from the bottom...
Karissa: The butterfly keyboard.
Devindra: Yeah, the butterfly keyboard.
Yeah, that's beautiful, but not exactly functional. Jony Ive has always worked best when he had the opposing force of somebody like a Steve Jobs, who could be like, no, this idea is crazy, or rein it in, or make it more functional. Steve Jobs, not a great dude in many respects, but at the very least he was able to hone in on product ideas and think a lot about how humans use products. I don't think Jony Ive on his own can do that. I don't think Sam Altman can do that, because these men can barely sit and have a cup of coffee together like human beings. So, whatever this is. Honestly, Karissa, I feel like Apple has dodged a bullet, because this is jumping into the AI gadget trend. Apple just needs to get the software right, because they have the devices, right? We're wearing Apple Watches, people have iPhones, people have MacBooks. What they need to do is solidify the infrastructure, the AI [00:34:00] smarts between all those devices. They don't need to go out and sell a whole new device. This just feels like, OpenAI is a new company and they can try to make an AI device a thing. I don't think it's super compelling, but let us know, listeners, if any of this lands. Listen to this chat of them talking about nothing. Unlocking human greatness, unlocking expertise just through AI, through some AI gadget. I don't quite buy it. I think it's kind of garbage. Ben: Anything else you guys wanna say about this? This is coming from the same guy who, when he was asked in an interview what college students should study, said resilience. Karissa: Yeah. I just think all these companies want to make the thing that's the next iPhone, so they can all just stop relying on Apple. It's the thing that Mark Zuckerberg has with all of their hardware projects. Which, by the way, one of the stories said that the Jony Ive thing has maybe been working on some kind of
headphones or earbuds with cameras on them, which sounded [00:35:00] very similar to a thing that has been rumored about Meta for a long time. And also Apple. Devindra: There were rumors about AirPods with Karissa: cameras. Yeah. And everyone's just, I think, trying to make the thing that's not an iPhone but will replace our iPhones. But good luck to them. Devindra: Good luck to that, because I think that is coming from a fundamentally broken purpose. The whole reason for doing that is just to try to outdo the iPhone. I was thinking about this: how many companies are like Apple, which was printing money with iPods and then said, hey, we actually have a new thing, and it will entirely kill our iPod business. This new thing will destroy the existing business that is working so well for us. Not many companies do that. That's the innovator's dilemma that comes back and bites companies in the butt. That's why Sony held off so long on jumping into flat-screen TVs: they were the world's leader in CRTs, in Trinitron, and they were like, we're good, we're good into the nineties. And then they completely lost the TV business. That's why Toyota was so slow to EVs, because they were like, hybrids are good to us, hybrids are great, we don't need an EV for a very long time. And then they released an EV [00:36:00] where the wheels fell off. So it comes for everybody. I dunno. I don't believe in these devices. Let's talk about something that could be cool. Something that is a little unrealistic, I think, but for a certain aesthetic it is cool. Fujifilm announced the X Half today. It is an $850 digital camera with an analog film aesthetic. It shoots in a three-by-four portrait aspect ratio, that's the Instax Mini ratio. It looks like an old-school Fuji camera. This thing is pretty wild, because the screen is vertical too; it's only making those portrait videos.
One of the key selling points is that it can replicate some things you get from film. There's a light leak simulation for when you overexpose film a little bit, a halation effect, and that's something Ben: that Fujifilm is known for. Devindra: Yes. They love these film simulation modes. This is such a social media kid camera, especially for the people who cannot afford the $2,000 Fujifilm compact cameras. [00:37:00] Wow. Ben: Even the screen is, do you wanna take some vertical photographs for your social media? Because vertical video has completely won. Devindra: It can take video, but it is just a simplistic little device. It has that, what do you call it, that latch that you hit to wind film. So you can put it into a film photograph mode where you don't see anything on the screen. You have to use the viewfinder to take pictures, and it starts a countdown. You can tell it to do a film-roll number of pictures, and you have to click through, hit the winder, to take your next picture. You can wind to the next picture. You can combine two portrait photos together. It's really cool. It's really cute. It's really unrealistic, I think, for a lot of folks, but hey, social media kids, influencers, the people who love to shoot stuff for social media and vertical video, this could be a really cool little device. What do you guys think about this? Karissa: You know what this reminds me of? Do you remember in the early Instagram days, when there were all these [00:38:00] apps, like Hipstamatic, that tried to emulate film aesthetics? And some of them would do these same things, where you would take the picture but you couldn't see it right away, 'cause it had to develop. And they even had a light leak thing. And I'm like, now we've come full circle, where the camera companies are basically, yeah, taking, or just doing their own
spin on that. Devindra: It only took them 15 years to really jump on this trend. But yes, everybody was trying to emulate classic cameras, and Fuji was like, oh, you want things that cost more but do less? Got it. That's the Fujifilm X Half. And I think this thing will be a huge success. What you're talking about, Karissa, there is a mode where, yeah, you won't see the picture immediately. It has to develop in their app, and then you will see it eventually. That's cool, honestly. I love this. I would not want it to be my main camera, but I would love to have something like this to play around with, where you could just be a little creative and pretend to be a street photographer for a little bit. Oh man, this would be huge in Brooklyn. Ben: Tom Rogers says, cute but stupid tech. I think that's [00:39:00] the perfect summary. Devindra: And I would say, compare this to the AI thing, which is just, what is this device? What are you gonna do with it? That feels like a big nothingburger. Whereas this is a thing you hold, it takes cool pictures, and you share them with your friends. It is such a precise thing, even though it's very expensive for what it is. I would say if you're intrigued by this, you can get cheap compact cameras, get used cameras. I only ever buy refurbished cameras. You don't necessarily need this. Karissa: But having a Fujifilm camera is a status symbol anyway. So I don't know. $850 still seems a little steep for what's basically a little toy camera. But also, I see it and I'm like, ooh, that looks nice. Devindra: Yeah. It's funny, the PowerShots that kids are into now from the two thousands, those used to cost like 200 to 300 bucks, and I thought, oh, that is a big investment in a camera. Then I stepped up to the Sony mirrorless cameras, which were like 500 to 600 or so. I'm like, okay, this is a bigger step up than even that.
Most people would be better off with a [00:40:00] mirrorless camera, but also those things are bigger than this tiny little pocket camera. I dunno. I'm enamored with this whole thing. Also, briefly, in other news: we saw that apparently Netflix is the one jumping in to save Sesame Street, and Sesame Street will air on Netflix and PBS simultaneously. That's a good thing, because there was previously a delay when HBO was in charge. Oh really? Yeah. They would get the new episodes, and I forget how long the delay actually was, but it would be a while before new stuff hit PBS. This is just, hey, I don't love that so much of our entertainment and pop culture now relies on streamers for everything, and the big media companies are just disappointing us. But this is a good move. I think Sesame Street should stick around, especially with federal funding being killed left and right for public media like this. Sesame Street is still good. My kids love it. When my son starts leaning into his Blippi era, I just [00:41:00] kinda slowly tune that out. Here's some Sesame Street. I got him into Pee-wee's Playhouse, which is the original Blippi. I'm like, yes, let's go back to the source, because Pee-wee was a good dude. And that show still holds up. That show is so much fun. A great introduction to camp for kids. A great introduction to diverse neighborhoods too, just like Sesame Street. Ben: Pee-wee, or Mr. Rogers, was doing it before everyone, I think. Devindra: Mr. Rogers was doing it really well too, but Pee-wee was always something special, because Pee-wee's wild. Laurence Fishburne was on Pee-wee. There's just a lot of cool stuff happening there. Looking back at it now as an adult, it is a strange thing to watch, but anyway, great to hear that Sesame Street is back. Another thing, not so quick. Ben: Yeah, let me do this one.
Devindra: Go ahead. Ben: If I may. So if you have any trouble getting audiobooks on Libby or Hoopla or any of the other interlibrary loan systems that you can access on your phone or iPad or any tablet, that's [00:42:00] because of the US government. A while ago, the Trump administration passed yet another executive order saying they wanted to cut a bunch of funding to the Institute of Museum and Library Services, the IMLS, and they're the ones who help "circulate," big quotation marks there because it's digital files, all of these things through interlibrary loans, so you can get the audiobooks you want. The crazy thing about this is that the IMLS was created in 1996 by a Republican-controlled Congress. What's the deal here, guys? There's no waste, fraud, and abuse. But if you have problems getting audiobooks, you can tell a friend, or if anybody's complaining about why their library selection on Libby went down by a lot recently, now you have the answer. Devindra: It is truly sad. A lot of what's happening is just to reduce access to information, because, hey, a well-informed population is [00:43:00] dangerous to anybody in charge, right? Terrible news. Let's move on to stuff that's happening around Engadget. I wanna quickly shout out that Sam Rutherford has reviewed the Asus ROG Flow Z13. This is the sort of Surface-like device. That's cool. This has the Ryzen AI Max chip. Sam seems to like it. It's a cool thing, not exactly stealthy. He gave it a 79, which is right below the threshold we have for recommending new products, because this thing is expensive. You're paying a lot of money to essentially get a gaming tablet. But I tested it out at CES, and it is cool that it actually works. For a certain type of person with too much money who just needs the lightest gaming thing possible, I could see it being compelling. Let's see, what is the starting price? $2,100. $2,100 for a gaming tablet.
Sam says it costs the same or more as a comparable ROG Zephyrus G14 with a real RTX 5070. That is a great laptop. The ROG Zephyrus G14, we have praised that laptop so much. So this is not [00:44:00] really meant for everybody. Asus loves to do these experiments. They're getting there, they're getting there in terms of creating a gaming tablet, but it's not quite something I'd recommend for everybody at this point. All right. We have a quick email from a listener too. Thank you for sending this in, Jake Thompson. If you wanna send us an email, it's podcast@engadget.com, and again, your emails may head into our Ask Engadget section. Jake asks: he's a real estate agent in need of a new laptop. He uses a Chromebook right now and it meets every need he has. Everything he does is web-based, but should he consider alternatives to a premium Chromebook for his next computer? He says he doesn't mind spending $750 or more if he can get something lightweight and trustworthy with solid battery life. What would we consider in the search? I would immediately point Jake to our laptop guides, because literally everything we mention, the MacBook Air, the Asus [00:45:00] Zenbook S 14, even the Dell XPS 13, would be not much more than that price, and I think more useful than a premium Chromebook, because I think the idea of a premium Chromebook is insanity. I don't know why you'd spend so much money on a thing that can only do web apps. Cheap Chromebooks, mid-range Chromebooks, fine, $500 or less, great. But if you're spending that much money, you want something more reliable that you can do more with. Even if everything you're doing is web-based, there may be other things you wanna do. A MacBook, a Windows laptop, there is so much more you can unlock there. A little bit of gaming, a little bit of media creation. I don't know, Karissa, Ben, do you have any thoughts on this? What would you recommend? Would you guys be fine with the Chromebook?
Karissa: I like Chromebooks. My first thought, and maybe this is too out there, but would an iPad Pro fit those requirements? 'Cause you can do a lot with an iPad Pro. Devindra: You can do a lot. Karissa: Great battery, lightweight, lots of apps. If most everything he's doing is web-based, you can probably use iPad apps. Devindra: That's actually a good point, Karissa. You can [00:46:00] do a lot with an iPad, and the iPad Pro does start at around this price too. It would be much lighter and thinner than a laptop, especially if you're doing a lot of web stuff. I feel like there are some web things that don't always run well on an iPad, though. Safari on iPad doesn't support everything you'd expect from a web-based site. Like, there are things we use, we use Video Ninja to record podcasts, and that's using WebRTC. Sometimes, for things like Zencastr, you have to use apps instead, because iOS, iPadOS, is so locked down. Multitasking isn't great on iPadOS. But yeah, if you're not actually doing that much and you just want a nice media device, an iPad is a good option too. All right, thank you so much, Jake Thompson. That's a good one, because I wanna hear about people moving on from Chromebooks. Send us more emails at podcast@engadget.com for sure. Let's just skip right past what we're working on, 'cause we're all busy with stuff, unless you wanna mention anything. Karissa, anything you're working on at the moment? Karissa: The only thing I wanna flag is that [00:47:00] we are rapidly approaching another TikTok sale-or-ban deadline. Devindra: Yes. Karissa: Next month. Speaker: Sure. Karissa: It's been a while since we heard anything about that, but I'm sure they're hard at work trying to hammer out this deal. Ben: Okay.
But that's actually more relevant, because they just maybe figured out the tariff situation, and the tariffs were the thing that spoiled the first deal. So we'll see what happens at the beginning of July. Karissa: I think the deadline's the 19th of June. Ben: Oh, the beginning of June. Sorry. Karissa: Yeah, so it's pretty close. And yeah, I haven't heard much on that front. Devindra: So this is where we are. We're just walking from one broken negotiation to another for the next couple of years. Anything you wanna mention, pop culture related, Karissa, that is taking your mind off of our broken world? Karissa: So this is a weird one, but my husband loves Stargate, and we have been working through it for years: the movie, the TV shows, Stargate [00:48:00] SG-1. Devindra: Oh God. Karissa: I'm just on the last few episodes now, in the endgame portion of that show. I spent years making fun of this, making fun of him for watching it. Devindra: That show's ridiculously bad, but yeah. Karissa: Everything is so bad now that it's actually just a nice distraction to watch something so silly. Devindra: That's heartwarming, actually, because it is a throwback to when things were simpler. You could just make dumb TV shows and they would last for 24 episodes per season. Ben: And for how many seasons, Karissa? Karissa: 10 seasons. Devindra: You just go on forever. Yeah. My local lamb and rice place, my local place that does essentially New York street-cart-style food, they play Stargate SG-1. Every time I'm in there, I'm sitting there watching, and I was like, how did we survive with this? How did we watch this show? It's because we just didn't have that much. We were desperate for genre fiction. But okay, that's heartwarming, Karissa. Have you guys done Farscape? No? Have you seen Farscape?
'Cause Farscape is a very similar type of [00:49:00] show, but it has Jim Henson puppets and it has better writing. Karissa: I love Jim Henson. Devindra: It's very cool. And unlike Stargate, it also dares to be, I don't know, sexy and violent too. Stargate always felt too campy to me. But Farscape was great. I bought that for $15 on iTunes, so that was a deal. I dunno if that deal is still there, but the entire series, plus the post-series stuff, is all out there. Shout out to Farscape. Shout out to Stargate SG-1. Simpler times. I'll just really briefly run down a few things. Andor season two finished over the last week. Incredible stuff. As I said in my initial review, it is really cool to see people watching this thing and just being blown away by it. And I will say the show brought me to tears at the end, and I did not expect that, because we know this guy's gonna die. We know his fate, and yet it still means so much, and it's so well written, and the show is a phenomenon. Karissa, I'd recommend it to you when you guys are recovering from your Stargate SG-1 loss. Andor is fantastic. I also checked out a bit of Murderbot, the [00:50:00] Apple TV+ adaptation of the Martha Wells books. It's fine. It is, I would say, funny and entertaining, because Alexander Skarsgård is a fun person to watch in genre fiction. But it also feels like this could be funnier, this could be better produced. Like, you could be doing more with this material, and it feels just lazy at times too. But it's a fine distraction if you are into half-baked sci-fi. So, another recommendation for Stargate SG-1 lovers. Karissa, Final Destination Bloodlines I reviewed over at The Filmcast, and I love this franchise. It is so cool to see it coming back after 15 years. This movie is incredible. This movie is great.
If you understand the Final Destination formula, it's even better, because it plays with your expectations of the franchise. I love a horror franchise where there's no definable villain. You're just trying to escape death. There are some great setups here. This is a great time at the movies. Get your popcorn. Just go enjoy the wonderfully creative kills. [00:51:00] And shout out to Zach Lipovsky and Adam B. Stein, who apparently were listening to my other podcast, and now are making good movies. That's always a fun thing to see. Final Destination Bloodlines is a much better film than Mission: Impossible, The Final Reckoning. My review of that is on the website now too. You can read that on Engadget. Ben: Thanks, everybody, for listening. Our theme music is by game composer Dale North. Our outro music is by our former managing editor, Terrence O'Brien. The podcast is produced by me, Ben Ellman. You can find Karissa online at Karissa: Karissa B on Threads, Bluesky, and sometimes still X. Ben: Unfortunately, you can find Devindra online Devindra: at devindra on Bluesky, and also podcasting about movies and TV at The Filmcast, at thefilmcast.com. Ben: If you really want to, you can find me at heybellman on Bluesky. Email us at podcast@engadget.com. Leave us a review on iTunes and subscribe on anything that gets podcasts. That includes [00:52:00] Spotify. This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/engadget-podcast-the-ai-and-xr-of-google-io-2025-131552868.html?src=rss
  • This AI-generated Fortnite video is a bleak glimpse at our future

    Earlier this week, Google unveiled Flow, a tool that can be used to generate AI video with ease. Users can submit text prompts or give Veo, the AI model that Flow uses, the digital equivalent of a mood board in exchange for eight second clips. From there, users can direct Flow to patch together different clips to form a longer stream of footage, potentially allowing for the creation of entire films. Immediately, people experimented with asking the AI to generate gameplay footage — and the tools are shockingly good at looking like games that you might recognize.

    Already, one video has amassed millions of views as onlookers are in awe over how easily the AI footage could be mistaken for actual Fortnite gameplay. According to Matt Shumer, who originally generated the footage, the prompt he entered to produce this content never mentioned Fortnite by name. What he apparently wrote was, “Streamer getting a victory royale with just his pickaxe.”

    Uhhh… I don't think Veo 3 is supposed to be generating Fortnite gameplay pic.twitter.com/bWKruQ5Nox— Matt Shumer (@mattshumer_) May 21, 2025

    Google did not respond to a request for comment over whether or not Veo should be generating footage that mimics copyrighted material. However, this does not appear to be an isolated incident. Another user got Veo to spit out something based on the idea of GTA 6. The result is probably a far cry from the realistic graphics GTA 6 has displayed in trailers thus far, but the gameplay still successfully replicates the aesthetic Rockstar is known for:

    We got Veo 3 playing GTA 6 before we got GTA 6! pic.twitter.com/OM63yf0CKK— Sherveen Mashayekhi (@Sherveen) May 20, 2025

    Though there are limitations — eight seconds is a short period of time, especially compared to the hours of material that human streamers generate — it’s undoubtedly an impressive piece of technology that augurs a specific pathway for the future of livestreams. We’ve already got AI-powered Twitch streamers like Neuro-sama, which hooks up a large language model to a text-to-speech program that allows the chibi influencer to speak to her viewers. Neuro-sama learns from other actual Twitch streamers, which makes her personality as malleable as it is chaotic.

    Imagine, for a moment, if an AI streamer didn’t need to rely on an actual game to endlessly entertain its viewers. Most games have a distinct beginning and end, and even live service games cannot endlessly produce new material. The combination of endless entertainment hosted by a personality who never needs to eat or sleep is a powerful if not terrifying combo, no? In January, Neuro-sama briefly became one of the top ten most subscribed Twitch channels, according to the stats website Twitch Tracker.

    That, and, an AI personality can sidestep many of the issues that are inherent to parasocial relationships. An AI cannot be harassed, swatted, or stalked by traditional means. An AI can still offend its viewers, but blame and responsibility in such instances are hazy concepts. AI-on-AI content — meaning, an AI streamer showing off AI footage — seems like the natural end point for the trends we’re seeing on platforms like Twitch.

    Twitch, for its part, already has a category for AI content. Its policies do not address the use of AI content beyond banning deepfake porn, but sexually explicit content of that nature wouldn’t be allowed regardless of source.

    “This topic is very much on our radar, and we are always monitoring emerging behaviors to ensure our policies remain relevant to what’s happening on our service,” a Twitch blog post from 2023 on deepfakes states. Ex-Twitch CEO Dan Clancy, who has a PhD in artificial intelligence, seemed confident about the opportunities that AI might afford Twitch streamers when Business Insider asked him about it in 2024. Clancy called AI a “boon” for Twitch that could potentially generate “endless” stimuli to react to.

    Would the general populace really be receptive to AI-on-AI content, though? Slurs aside, Fortnite’s AI Darth Vader seemed to be a hit. At the same time, nearly all generative models tend to spawn humans who have an unsettling aura. Everyone is laughing, yet no one looks happy. The cheer is forced in a way where you can practically imagine someone off-frame, menacingly holding a gun to the AI’s head. Like a dream where the more people smile, the closer things get to a nightmare. Everything is as perfect as it is hollow.

    Until the technology improves, any potential entertainer molded in the image of stock photography risks repulsing its viewers. Yet the internet is already slipping away from serving the needs of real human beings. Millions of bots roam about Twitch, dutifully inflating the views of streamers. Human beings will always crave the company of other people, sure. Much like mass production did for artisanal crafts, a future where our feeds are taken over by AI might just exponentially raise the value of authenticity and the human touch.

    But 2025 was the first year in history that traffic on the internet was determined to be frequented more by bots than people. It’s already a bot’s world out there. We’re just breathing in it.
    #this #aigenerated #fortnite #video #bleak
    This AI-generated Fortnite video is a bleak glimpse at our future
    Earlier this week, Google unveiled Flow, a tool that can be used to generate AI video with ease. Users can submit text prompts or give Veo, the AI model that Flow uses, the digital equivalent of a mood board in exchange for eight second clips. From there, users can direct Flow to patch together different clips to form a longer stream of footage, potentially allowing for the creation of entire films. Immediately, people experimented with asking the AI to generate gameplay footage — and the tools are shockingly good at looking like games that you might recognize. Already, one video has amassed millions of views as onlookers are in awe over how easily the AI footage could be mistaken for actual Fortnite gameplay. According to Matt Shumer, who originally generated the footage, the prompt he entered to produce this content never mentioned Fortnite by name. What he apparently wrote was, “Streamer getting a victory royale with just his pickaxe.” Uhhh… I don't think Veo 3 is supposed to be generating Fortnite gameplay pic.twitter.com/bWKruQ5Nox— Matt ShumerMay 21, 2025 Google did not respond to a request for comment over whether or not Veo should be generating footage that mimics copyrighted material. However, this does not appear to be an isolated incident. Another user got Veo to spit out something based on the idea of GTA 6. The result is probably a far cry from the realistic graphics GTA 6 has displayed in trailers thus far, but the gameplay still successfully replicates the aesthetic Rockstar is known for: We got Veo 3 playing GTA 6 before we got GTA 6!pic.twitter.com/OM63yf0CKK— Sherveen MashayekhiMay 20, 2025 Though there are limitations — eight seconds is a short period of time, especially compared to the hours of material that human streamers generate — it’s undoubtedly an impressive piece of technology that augurs a specific pathway for the future of livestreams. 
We’ve already got AI-powered Twitch streamers like Neuro-sama, which hooks up a large language model to a text-to-speech program that allows the chibi influencer to speak to her viewers. Neuro-sama learns from other actual Twitch streamers, which makes her personality as malleable as it is chaotic. Imagine, for a moment, if an AI streamer didn’t need to rely on an actual game to endlessly entertain its viewers. Most games have a distinct beginning and end, and even live service games cannot endlessly produce new material. The combination of endless entertainment hosted by a personality who never needs to eat or sleep is a powerful if not terrifying combo, no? In January, Neuro-sama briefly became one in the top ten most subscribed Twitch channels according to stats website Twitch Tracker. That, and, an AI personality can sidestep many of the issues that are inherent to parasocial relationships. An AI cannot be harassed, swatted, or stalked by traditional means. An AI can still offend its viewers, but blame and responsibility in such instances are hazy concepts. AI-on-AI content — meaning, an AI streamer showing off AI footage — seems like the natural end point for the trends we’re seeing on platforms like Twitch. Twitch, for its part, already has a category for AI content. Its policies do not address the use of AI content beyond banning deepfake porn, but sexually explicit content of that nature wouldn’t be allowed regardless of source. “This topic is very much on our radar, and we are always monitoring emerging behaviors to ensure our policies remain relevant to what’s happening on our service,” a Twitch blog post from 2023 on deepfakes states. In 2024, ex-Twitch CEO Dan Clancy — who has a PhD in artificial intelligence — seemed confident about the opportunities that AI might afford Twitch streamers when Business Insider asked him about it in 2024. Clancy called AI a “boon” for Twitch that could potentially generate “endless” stimuli to react to. 
Would the general populace really be receptive to AI-on-AI content, though? Slurs aside, Fortnite’s AI Darth Vader seemed to be a hit. At the same time, nearly all generative models tend to spawn humans who have an unsettling aura. Everyone is laughing, yet no one looks happy. The cheer is forced in a way where you can practically imagine someone off-frame, menacingly holding a gun to the AI’s head. Like a dream where the more people smile, the closer things get to a nightmare. Everything is as perfect as it is hollow. Until the technology improves, any potential entertainer molded in the image of stock photography risks repulsing its viewers. Yet the internet is already slipping away from serving the needs of real human beings. Millions of bots roam about Twitch, dutifully inflating the views of streamers. Human beings will always crave the company of other people, sure. Much like mass production did for artisanal crafts, a future where our feeds are taken over by AI might just exponentially raise the value of authenticity and the human touch. But 2025 was the first year in history that traffic on the internet was determined to be frequented more by bots than people. It’s already a bot’s world out there. We’re just breathing in it. #this #aigenerated #fortnite #video #bleak
    WWW.POLYGON.COM
    This AI-generated Fortnite video is a bleak glimpse at our future
    Earlier this week, Google unveiled Flow, a tool that can be used to generate AI video with ease. Users can submit text prompts or give Veo, the AI model that Flow uses, the digital equivalent of a mood board in exchange for eight second clips. From there, users can direct Flow to patch together different clips to form a longer stream of footage, potentially allowing for the creation of entire films. Immediately, people experimented with asking the AI to generate gameplay footage — and the tools are shockingly good at looking like games that you might recognize. Already, one video has amassed millions of views as onlookers are in awe over how easily the AI footage could be mistaken for actual Fortnite gameplay. According to Matt Shumer, who originally generated the footage, the prompt he entered to produce this content never mentioned Fortnite by name. What he apparently wrote was, “Streamer getting a victory royale with just his pickaxe.” Uhhh… I don't think Veo 3 is supposed to be generating Fortnite gameplay pic.twitter.com/bWKruQ5Nox— Matt Shumer (@mattshumer_) May 21, 2025 Google did not respond to a request for comment over whether or not Veo should be generating footage that mimics copyrighted material. However, this does not appear to be an isolated incident. Another user got Veo to spit out something based on the idea of GTA 6. 
The result is probably a far cry from the realistic graphics GTA 6 has displayed in trailers thus far, but the gameplay still successfully replicates the aesthetic Rockstar is known for:

    "We got Veo 3 playing GTA 6 before we got GTA 6! (what impresses me here is two distinct throughlines of audio: the guy, the game – prompt was 'a twitch streamer playing grand theft auto 6')" pic.twitter.com/OM63yf0CKK — Sherveen Mashayekhi (@Sherveen), May 20, 2025

    Though there are limitations — eight seconds is a short period of time, especially compared to the hours of material that human streamers generate — it's undoubtedly an impressive piece of technology that augurs a specific pathway for the future of livestreams. We've already got AI-powered Twitch streamers like Neuro-sama, which hooks up a large language model to a text-to-speech program that allows the chibi influencer to speak to her viewers. Neuro-sama learns from other actual Twitch streamers, which makes her personality as malleable as it is chaotic.

    Imagine, for a moment, if an AI streamer didn't need to rely on an actual game to endlessly entertain its viewers. Most games have a distinct beginning and end, and even live-service games cannot endlessly produce new material. The combination of endless entertainment hosted by a personality who never needs to eat or sleep is a powerful, if not terrifying, combo, no? In January, Neuro-sama briefly became one of the top ten most subscribed Twitch channels, according to stats website Twitch Tracker. On top of that, an AI personality can sidestep many of the issues that are inherent to parasocial relationships. An AI cannot be harassed, swatted, or stalked by traditional means. An AI can still offend its viewers, but blame and responsibility in such instances are hazy concepts. AI-on-AI content — meaning, an AI streamer showing off AI footage — seems like the natural end point for the trends we're seeing on platforms like Twitch.
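The Neuro-sama-style pipeline described above — chat messages fed into a large language model, whose replies are fed into text-to-speech — can be sketched minimally. Both functions below are stubs with hypothetical interfaces, not Neuro-sama's actual implementation; a real setup would swap in live model and TTS calls.

```python
def fake_llm_reply(chat_message: str) -> str:
    """Stub standing in for a large language model call (hypothetical)."""
    return f"Thanks for the message: {chat_message}!"

def fake_tts(text: str) -> bytes:
    """Stub standing in for a text-to-speech engine; returns placeholder audio bytes."""
    return text.encode("utf-8")

def streamer_turn(chat_message: str) -> bytes:
    """One turn of an AI-streamer loop: read a chat message, generate a reply, speak it."""
    reply = fake_llm_reply(chat_message)
    return fake_tts(reply)

# One viewer message in, one spoken reply out; a real stream would run this in a loop.
audio = streamer_turn("gg, nice run")
```

The point of the sketch is the shape of the loop, not the components: as long as something produces text and something else voices it, the "streamer" can run indefinitely.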
Twitch, for its part, already has a category for AI content. Its policies do not address the use of AI content beyond banning deepfake porn, but sexually explicit content of that nature wouldn't be allowed regardless of source. "This topic is very much on our radar, and we are always monitoring emerging behaviors to ensure our policies remain relevant to what's happening on our service," a Twitch blog post from 2023 on deepfakes states. In 2024, ex-Twitch CEO Dan Clancy — who has a PhD in artificial intelligence — seemed confident about the opportunities that AI might afford Twitch streamers when Business Insider asked him about it. Clancy called AI a "boon" for Twitch that could potentially generate "endless" stimuli to react to.

    Would the general populace really be receptive to AI-on-AI content, though? Slurs aside, Fortnite's AI Darth Vader seemed to be a hit. At the same time, nearly all generative models tend to spawn humans who have an unsettling aura. Everyone is laughing, yet no one looks happy. The cheer is forced in a way where you can practically imagine someone off-frame, menacingly holding a gun to the AI's head. Like a dream where the more people smile, the closer things get to a nightmare. Everything is as perfect as it is hollow. Until the technology improves, any potential entertainer molded in the image of stock photography risks repulsing its viewers.

    Yet the internet is already slipping away from serving the needs of real human beings. Millions of bots roam about Twitch, dutifully inflating the views of streamers. Human beings will always crave the company of other people, sure. Much like mass production did for artisanal crafts, a future where our feeds are taken over by AI might just exponentially raise the value of authenticity and the human touch. But 2025 was the first year in history that internet traffic was determined to be driven more by bots than by people. It's already a bot's world out there.
We’re just breathing in it.
  • These Memorial Day Sales Are the Perfect Excuse to Upgrade My Gaming Office Setup

    I have big plans over Memorial Day weekend to get some shopping in. I don't usually buy anything from Memorial Day sales, but with all of the uncertainty about tariffs and prices in the U.S., I've been feeling like I should make my purchases now rather than wait to see if prices go up. More specifically, I've been meaning to upgrade my gaming setup for years, and it's starting to feel like a now-or-never situation.

    Most of the Memorial Day sales have been live since last week, and I've primarily been browsing the Wayfair sale for various furniture items. I'm specifically looking for a new office chair and desk as well as a nice bookcase to store my games, and the Wayfair sale has overall been the easiest website to navigate so far. I've also been looking at IGN's picks for the best gaming chairs and gaming desks to help broaden my search to other popular brands. If you're looking to upgrade your own gaming setup or office during these sales, here are my suggestions for how to go about it.

    Start With the Wayfair Sale

    Although the Amazon Memorial Day sale has the most discounts on everything, Wayfair seems to have the most discounts on furniture. I'm suggesting this as a good starting point because of the sheer quantity of deals, but also because of how easy it is to filter out everything that isn't a Memorial Day deal. You can also use filters to narrow down every product category by dimensions, material, color, price, etc. Even if you don't end up buying anything from here, it's still a good place to narrow down what you actually want for your space. I've used Wayfair to buy things like bookshelves, rugs, lamps, and tables and would generally recommend them for items like that. That being said, Wayfair doesn't always carry the best brands. Their prices are usually pretty good for standard furniture, but it's not necessarily the best place to buy a really nice office chair or a gaming-specific desk.

    Splurge on Comfort Only

    If you're upgrading your whole setup, I'd recommend splurging on the chair only. I'm still deciding which chair I'm going to buy, but I already know it's the thing I'm spending the most money on. My current office chair is literally falling apart, and I have it held together with a pair of sweatpants I wrapped around it. Anything would be an upgrade at this point, but since I'm planning to both work and play games in this chair, I'm being picky. I still haven't decided whether to get a gaming chair or an office chair, but IGN has recommended the Secretlab Titan Evo, and I'm very tempted by that now that there's a Secretlab sale happening.

    Get Your Accessories at Amazon

    The final, and arguably most important, step is customizing your gaming setup. This means things like RGB lights, mouse pads, laptop stands, headset holders, Steam Deck docks, and everything else that actually attunes your battle station to you. For these, I've found that Amazon has generally had the best prices. Sure, Wayfair has a cardboard cutout of Darth Vader on sale, but for everything else Amazon has been where I've been shopping. Although I have no intention of buying a TV myself, it is worth noting that Amazon also has some of the best Memorial Day TV deals I've seen so far. Most notably, the LG C4 dropped to a new all-time low on Amazon and is an excellent gaming TV for both console and PC gamers.

    Upgrade Your Gaming PC While You Can

    I don't currently have a good gaming PC and have absolutely no room in my budget for one right now. I primarily use my Steam Deck and Nintendo Switch for gaming, which has been great for mobility but not for performance. I would love to upgrade to a new PC with an RTX 5090, but with GPU prices on the rise and a potential Switch 2 purchase in my future, it just doesn't seem like it's in the cards for me right now. That being said, if you are looking for a new gaming laptop or PC, Memorial Day sales are actually a really good time to shop. I'd recommend checking out the Dell Memorial Day sale first, but there are also discounts at HP, Lenovo, Best Buy, and more.

    Jacob Kienlen is a Senior SEO Strategist and Writer for IGN. Born and raised in Portland, Oregon, he has considered the Northwest his home for his entire life. With a bachelor's degree in communication and over 8 years of professional writing experience, his expertise ranges from books and games to technology and food. He has spent a good chunk of his career writing about deals and sales to help consumers find the best discounts on whatever they may be looking to buy.
    WWW.IGN.COM