• ## Introduction

    In recent months, the French government has implemented strict regulations that have led to the closure of popular adult websites such as PornHub, YouPorn, and RedTube. These measures, aimed at controlling the online adult content market, have left many users frustrated and seeking ways to regain access to their favorite sites. This article will explore the implications of these restrictions, the reasons behind them, and how individuals can circumvent these limitations.

    ## The...
    Unblock PornHub, YouPorn, RedTube in France: Regain Unlimited Access
  • In a world where 3D printing has become the new frontier of human achievement, it appears that our beloved gadgets are not just printing our wildest dreams, but also a symphony of snaps and crackles that would make even the most seasoned sound engineer weep. Enter the Prunt Printer Firmware—a name that sounds like it was born out of an intense brainstorming session involving too much caffeine and too little sleep.

    Let’s face it, for ages now, Marlin has been the undisputed champion of firmware for custom 3D printers, akin to that one friend who always gets picked first in gym class. But wait! Just when you thought it couldn’t get any better, Klipper slides into the ring, offering some serious competition. Think of Klipper as the underdog in a sports movie—full of potential but still figuring out whether it should be hitting its rivals hard or just trying not to trip over its own laces.

    Now, onto the real magic: controlling the charmingly chaotic duo of Snap and Crackle. It’s almost poetic, isn’t it? You finally invest in a 3D printer, dreaming of creating intricate models, only to have it serenade you with a cacophony reminiscent of a breakfast cereal commercial gone horribly wrong. But fear not! The Prunt Printer Firmware is here to save the day—because who wants their printer to sound like a caffeinated squirrel rather than a well-oiled machine?

    Embracing the Prunt Firmware is like adopting a pet rock. Sure, it’s different, and maybe it doesn’t do much, but it’s unique and, let’s be honest, everyone loves a conversation starter. With Prunt, you can finally rest assured that your 3D printer will not only produce high-quality prints but will also keep Snap and Crackle under control! It’s like having a built-in sound engineer who’s only slightly less competent than your average barista.

    And let’s not overlook the sheer genius of this firmware’s name. “Prunt”? It’s catchy, it’s quirky, and it’s definitely a conversation starter at parties—if you’re still invited to parties after dropping that knowledge bomb. “Oh, you’re using Marlin? How quaint. I’ve upgraded to Prunt. It’s the future!” Cue the blank stares and awkward silence.

    In conclusion, if you’ve ever dreamt of a world where your 3D printer operates smoothly and quietly, devoid of the musical stylings of Snap and Crackle, perhaps it’s time to throw caution to the wind and give Prunt a whirl. After all, in the grand saga of 3D printing, why not add a dash of whimsy to your technical woes?

    Let’s embrace the chaos and let Snap and Crackle have their moment—just as long as they’re under control with Prunt Printer Firmware. Because in the end, isn’t that what we all really want?

    #3DPrinting #PruntFirmware #SnapAndCrackle #MarlinVsKlipper #TechHumor
    Keeping Snap and Crackle under Control with Prunt Printer Firmware
    For quite some time now, Marlin has been the firmware of choice for any kind of custom 3D printer, with only Klipper offering some serious competition in the open-source world. …read more
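For the uninitiated, the joke has a technical core: in motion-control kinematics, "snap" and "crackle" are the conventional names for the fourth and fifth time-derivatives of position (after velocity, acceleration, and jerk), and keeping them bounded is what the firmware's title refers to. Below is a minimal sketch of estimating them from a sampled position profile by repeated finite differencing; the trajectory is invented for illustration and is not data from Prunt or any real printer.

```python
# Hypothetical illustration: "snap" and "crackle" are the standard names
# for the 4th and 5th time-derivatives of position. We estimate them from
# a uniformly sampled position profile by repeated finite differencing.

def derivative(samples, dt):
    """Forward finite-difference derivative of a uniformly sampled signal."""
    return [(b - a) / dt for a, b in zip(samples, samples[1:])]

def kinematic_profile(position, dt):
    """Successive derivatives of position, up to crackle."""
    names = ["velocity", "acceleration", "jerk", "snap", "crackle"]
    out, signal = {}, position
    for name in names:
        signal = derivative(signal, dt)
        out[name] = signal
    return out

dt = 0.01  # seconds per sample
# Constant-jerk ramp: position = t**3, so the jerk estimate is a constant 6
# and snap/crackle come out numerically ~0.
position = [(i * dt) ** 3 for i in range(100)]
profile = kinematic_profile(position, dt)
peak_snap = max(abs(x) for x in profile["snap"])
```

A planner that bounds these higher derivatives commands smoother motion, which is presumably the noise reduction the post is joking about.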
  • Hey there, amazing people!

    Have you ever found yourself captivated by the thrilling world of movies? Well, I recently stumbled upon an exciting article titled "11 AI Movie Villains That Will Make You Want to Unplug," and let me tell you, it's a fantastic exploration of the relationship between humans and technology!

    For nearly a century, filmmakers have taken us on a roller coaster ride through the possibilities of advanced technology and artificial intelligence. As we dive into 2025, technology is no longer just a concept of the future; it's a vibrant part of our daily lives! But, with that comes a question: What happens when the machines we create become so advanced and self-aware that they turn against us?

    The article highlights some of the most iconic AI movie villains that have kept us on the edge of our seats, making us ponder the path we are treading. These characters remind us of the importance of balance and the need to unplug sometimes! It’s a gentle nudge to reflect on our relationship with technology. Are we in control, or is it controlling us?

    But here’s the good news! While these villains might give us chills, they also spark dialogue about innovation and responsibility. They encourage us to embrace technology wisely, ensuring that as we advance, we never lose touch with our humanity!

    Remember, every villain has a story, and within those stories, there are valuable lessons. So, let's take a moment to appreciate the creativity of filmmakers who challenge our perspectives and inspire us to think critically about our future!

    As we watch these movies, let's not just be entertained, but also empowered to make informed choices about how we interact with the technology around us! What are some of your favorite AI villains from movies? How do they inspire you to engage with technology more mindfully? Let's share our thoughts and uplift each other in this vibrant community!

    Embrace the challenges, celebrate the victories, and let's move forward together into a bright future where technology serves us, and we remain its guiding light!

    #AIMovieVillains #TechnologyAndHumanity #UnplugAndReflect #Inspiration #FutureReady
    11 AI Movie Villains That Will Make You Want to Unplug
    For nearly a century, filmmakers have been questioning what happens when technology becomes so advanced and self-aware that the machines we invent turn against the humans who created them. Artificial intelligence is no longer just a science fiction c
  • Hungry Bacteria Hunt Their Neighbors With Tiny, Poison-Tipped Harpoons

    Starving bacteria (cyan) use a microscopic harpoon—called the Type VI secretion system (T6SS)—to stab and kill neighboring cells (magenta). The prey burst, turning spherical and leaking nutrients, which the killers then use to survive and grow. (Image credit: Glen D'Souza/ASU/screen shot from video)

    Bacteria are bad neighbors. And we’re not talking noisy, never-take-out-the-trash bad neighbors. We’re talking has-a-harpoon-gun-and-points-it-at-you bad neighbors. According to a new study in Science, some bacteria hunt nearby bacterial species when they’re hungry. Using a special weapon system called the Type VI Secretion System (T6SS), these bacteria shoot, spill, and then absorb the nutrients from the microbes they harpoon.

    “The punchline is: When things get tough, you eat your neighbors,” said Glen D’Souza, a study author and an assistant professor at Arizona State University, according to a press release. “We’ve known bacteria kill each other, that’s textbook. But what we’re seeing is that it’s not just important that the bacteria have weapons to kill, but they are controlling when they use those weapons specifically for situations to eat others where they can’t grow themselves.”

    According to the study authors, the research doesn’t just have implications for bacterial neighborhoods; it also has implications for human health and medicine. By harnessing these bacterial weapons, it may be possible to build better targeted antibiotics, designed to overcome antibiotic resistance.

    Ruthless Bacteria Use Harpoons

    Researchers have long known that some bacteria can be ruthless, using weapons like the T6SS to clear out their competition. A nasty tool, the T6SS is essentially a tiny harpoon gun with a poison-tipped needle. When a bacterium shoots the weapon into another bacterium from a separate species, the needle pierces the microbe without killing it. Then, it injects toxins into the microbe that cause its internal nutrients to spill out.

    Up until now, researchers thought that this weapon helped bacteria eliminate their competition for space and for food. But after watching bacteria use the T6SS to attack their neighbors when food was scarce, the study authors concluded that these tiny harpooners use the weapon not only to remove rivals, but also to consume their competitors’ leaked nutrients.

    “Watching these cells in action really drives home how resourceful bacteria can be,” said Astrid Stubbusch, another study author and a researcher who worked on the study while at ETH Zurich, according to the press release. “By slowly releasing nutrients from their neighbors, they maximize their nutrient harvesting when every molecule counts.”

    Absorbing Food From Neighbors

    To show that the bacteria used this system to eat when there was no food around, the study authors compared their attacks in both nutrient-rich and nutrient-poor environments. When supplied with ample resources, the bacteria used their harpoons to kill their neighbors quickly, with the released nutrients leaking out and dissolving immediately. But when resources were few and far between, they used their harpoons to kill their neighbors slowly, with the nutrients seeping out and sticking around.

    “This difference in dissolution time could mean that the killer cells load their spears with different toxins,” D’Souza said in another press release. While one toxin could eliminate the competition for space and for food when nutrients are available, another could create a food source, allowing bacteria to “absorb as many nutrients as possible” when sustenance is in short supply.

    Because of all this, this weapon system is more than ruthless; it’s also smart, and important to some species’ survival. When genetically unedited T6SS bacteria were put in an environment without food, they survived on spilled nutrients. But when genetically edited T6SS bacteria were placed in a similar environment, they died, because their ability to find food in their neighbors had been “turned off.”

    Harnessing Bacterial Harpoons

    According to the study authors, the T6SS is widely used by bacteria, both in and outside the lab. “It’s present in many different environments,” D’Souza said in one of the press releases. “It’s operational and happening in nature, from the oceans to the human gut.”

    The study authors add that their research could change the way we think about bacteria and could help in our fight against antibiotic resistance. In fact, the T6SS could one day serve as a foundation for targeted drug delivery systems, which could mitigate the development of broader bacterial resistance to antibiotics. But before that can happen, researchers have to learn more about bacterial harpoons, and about when and how bacteria use them, both to beat and eat their neighbors.

    Sam Walters is a journalist covering archaeology, paleontology, ecology, and evolution for Discover, along with an assortment of other topics. Before joining the Discover team as an assistant editor in 2022, Sam studied journalism at Northwestern University in Evanston, Illinois.
    WWW.DISCOVERMAGAZINE.COM
    Hungry Bacteria Hunt Their Neighbors With Tiny, Poison-Tipped Harpoons
  • Can Sonic Racing: CrossWorlds Outrun Mario Kart World?

    Mario Kart World is one of the year's hottest games, but its pivot to an open world setting, while peeling back kart customization options, opened a massive rift for Sonic Racing: CrossWorlds to drift into. And Sega is determined to do everything possible to make its kart racer the one to beat by including numerous guest characters and cross-platform multiplayer contests. I took Sonic Racing: CrossWorlds for a test drive at the Summer Game Fest, and it's a strong contender for racing game of the year.

    Sonic Racing: CrossWorlds' Deep Kart Customization

    The biggest difference between Sonic Racing: CrossWorlds and Mario Kart World is that Sega's title focuses on kart customization. I'm not just talking about colors and tires; CrossWorlds introduces Gadgets, add-ons that augment your car, giving your whip helpful abilities to bring into the race. Each ride has a license plate with six slots where you can fit your chosen Gadgets. A Gadget can take up one, two, or three slots, so the idea is to find a mix that pairs well with character traits. There's a surprising amount of depth for people who want to min/max their favorite anthropomorphic animal.

    I chose Sonic, a speed character, and added a Gadget that started him with two boosts, a Gadget that improved his speed while trailing an opponent, and a Gadget that improved acceleration. There were so many Gadgets that I could have easily spent my entire demo session building a car to match my playstyle. I envision people happily getting lost in the weeds before participating in their first race.

    Gameplay: This Ain't Mario Kart World

    Although it's not an open world like Mario Kart World, Sonic Racing: CrossWorlds injects a unique spin on traditional kart racing. The familiar trappings are all here, such as rings to boost your top speed. Each Grand Prix consists of three maps, but the gimmick at play is stage transitions. About a third of the way down a course, a giant ring-portal opens, presenting a new world and track. The shift in tone and terrain keeps the races fast-paced and unpredictable. I particularly liked how whoever is in first place can sometimes choose which CrossWorlds track to go down, controlling the tempo. With every race completion, you earn credits based on your performance that you can cash in for new car parts.

    In stark contrast to Mario Kart World, Sonic Racing: CrossWorlds is far more aggressive, even on lower difficulties. At the start of each grand prix, the game assigns you a rival—this is the character to beat, and the one who taunts you all match. Beat them all, and you can race high-powered Super variants.

    Just about everything causes you to lose rings: bumping into other players, the walls, and, of course, getting hit by items. The series' trademark rubberband AI is still in place, too. Even in the press demo, I wasn't safe from taking four items back to back and being knocked off the stage mere feet from the finish line.

    The demo didn't include the new characters that debuted at the Summer Game Fest, but I studied the character screen to see who else could be coming to the game. Including the 12 Sonic characters available in the demo, I counted a whopping 64 character slots. The announced guests include Hatsune Miku, Joker, Ichiban Kasuga, and Steve. However, I hope to see other classic Sega IPs, as in previous Sonic Racing titles.

    Platforms and Release Date

    Will Sega do what Nintendon't? I had an exhilarating time playing Sonic Racing: CrossWorlds, and I can't wait to see more wild track compositions. Sonic Racing: CrossWorlds will be available on Nintendo Switch, PC, PlayStation 4, PlayStation 5, Xbox One, and Xbox Series X/S on Sept. 25, 2025. A Nintendo Switch 2 version is planned for later in the year.
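The slot arithmetic described in the hands-on (six slots per license plate, Gadgets costing one to three slots) can be sketched as a simple validity check. The gadget names and costs below are invented placeholders for illustration; the game's actual data will differ.

```python
# Sketch of the six-slot Gadget loadout system described in the preview.
# Gadget names and slot costs here are hypothetical placeholders.

SLOT_CAPACITY = 6

GADGET_SLOT_COST = {
    "twin_start_boost": 2,    # start the race with two boosts
    "slipstream_speed": 2,    # extra speed while trailing an opponent
    "quick_acceleration": 1,  # improved acceleration
    "ring_magnet": 3,         # a pricier three-slot gadget
}

def loadout_fits(gadgets):
    """A loadout is legal if its gadgets' slot costs total at most six."""
    return sum(GADGET_SLOT_COST[g] for g in gadgets) <= SLOT_CAPACITY

# Roughly the build from the hands-on: two starting boosts, slipstream
# speed, and quicker acceleration — five of the six slots used.
speed_build = ["twin_start_boost", "slipstream_speed", "quick_acceleration"]
```

The min/maxing the preview mentions amounts to searching this cost-constrained space for the combination that best suits a character's traits.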
    #can #sonic #racing #crossworlds #outrun
    Can Sonic Racing: CrossWorlds Outrun Mario Kart World?
    ME.PCMAG.COM
Mario Kart World is one of the year's hottest games, but its pivot to an open-world setting, while paring back kart customization options, opened a massive rift for Sonic Racing: CrossWorlds to drift into. And Sega is determined to do everything possible to make its kart racer the one to beat by including numerous guest characters and cross-platform multiplayer contests. I took Sonic Racing: CrossWorlds for a test drive at Summer Game Fest, and it's a strong contender for racing game of the year.

Sonic Racing: CrossWorlds' Deep Kart Customization

The biggest difference between Sonic Racing: CrossWorlds and Mario Kart World is that Sega's title focuses on kart customization. I'm not just talking about colors and tires; CrossWorlds introduces Gadgets, add-ons that augment your car, giving your whip helpful abilities to bring into the race. (Credit: Sega)

Each ride has a license plate with six slots where you can fit your chosen Gadgets. A Gadget can take up one, two, or three slots, so the idea is to find a mix that pairs well with character traits. There's a surprising amount of depth for people who want to min/max their favorite anthropomorphic animal.

I chose Sonic, a speed character, and added a Gadget that started him with two boosts (one slot), a Gadget that improved his speed while trailing an opponent (two slots), and a Gadget that improved acceleration (three slots). There were so many Gadgets that I could have easily spent my entire demo session building a car to match my playstyle. I envision people happily getting lost in the weeds before participating in their first race. (Credit: Sega)

Gameplay: This Ain't Mario Kart World

Although it's not an open world like Mario Kart World, Sonic Racing: CrossWorlds puts a unique spin on traditional kart racing. The familiar trappings are all here, such as rings to boost your top speed. Each Grand Prix consists of three maps, but the gimmick at play is stage transitions. About a third of the way down a course, a giant ring-portal opens, presenting a new world and track (hence the name "CrossWorlds"). The shift in tone and terrain keeps the races fast-paced and unpredictable. I particularly liked how whoever is in first place can sometimes choose which CrossWorlds track to go down, controlling the tempo. With every race completion, you earn credits based on your performance that you can cash in for new car parts.

In stark contrast to Mario Kart World, Sonic Racing: CrossWorlds is far more aggressive, even on lower difficulties. At the start of each grand prix, the game assigns you a rival—this is the character to beat, and the one who taunts you all match. Beat them all, and you can race high-powered Super variants.

Just about everything caused me to lose rings: bumping into other players, hitting the walls, and, of course, getting hit by items. The series' trademark rubber-band AI is still in place, too. Even in the press demo, I wasn't safe from taking four items back to back and being knocked off the stage mere feet from the finish line. (Credit: Sega)

The demo didn't include the new characters that debuted at Summer Game Fest, but I studied the character screen to see who else could be coming to the game. Including the 12 Sonic characters available in the demo, I counted a whopping 64 character slots. They include Hatsune Miku (the ultra-popular Vocaloid), Joker (from Persona 5), Ichiban Kasuga (from Like a Dragon), and Steve (from Minecraft). However, I hope to see other classic Sega IPs, as in previous Sonic Racing titles.

Platforms and Release Date

Will Sega do what Nintendon't? I had an exhilarating time playing Sonic Racing: CrossWorlds, and I can't wait to see more wild track compositions. Sonic Racing: CrossWorlds will be available on Nintendo Switch, PC, PlayStation 4, PlayStation 5, Xbox One, and Xbox Series X/S on Sept. 25, 2025. A Nintendo Switch 2 version is planned for later in the year.
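The six-slot Gadget system described above amounts to a small knapsack-style constraint: each Gadget costs one to three slots, and a loadout is legal only if its total cost fits the license plate. A minimal sketch of that rule, with hypothetical Gadget names and costs based on the build described in the article (none of these identifiers come from the game itself):

```python
# Hypothetical sketch of the six-slot Gadget loadout rule.
# Gadget names and their slot costs are illustrative assumptions.

SLOT_CAPACITY = 6

GADGET_COSTS = {
    "starting_boosts": 1,    # start the race with two boosts (one slot)
    "slipstream_speed": 2,   # extra speed while trailing an opponent (two slots)
    "acceleration_up": 3,    # improved acceleration (three slots)
}

def valid_loadout(gadgets: list[str]) -> bool:
    """A loadout is legal if its Gadgets' slot costs sum to at most six."""
    return sum(GADGET_COSTS[g] for g in gadgets) <= SLOT_CAPACITY

# The three-Gadget Sonic build from the article uses exactly six slots:
sonic_build = ["starting_boosts", "slipstream_speed", "acceleration_up"]
print(valid_loadout(sonic_build))  # True
```

Min/maxing, in this framing, is searching the space of valid combinations for the mix that best complements a character's traits.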
  • MedTech AI, hardware, and clinical application programmes

    WWW.ARTIFICIALINTELLIGENCE-NEWS.COM
Modern healthcare innovations span AI, devices, software, imaging, and regulatory frameworks, all requiring stringent coordination. Generative AI arguably has the strongest transformative potential in healthcare technology programmes, and it is already being applied across domains such as R&D, commercial operations, and supply chain management.

Traditional models for medical appointments, like face-to-face visits and paper-based processes, may not be sufficient for today's fast-paced, data-driven medical landscape. Healthcare professionals and patients are therefore seeking more convenient and efficient ways to access and share information while meeting the complex standards of modern medical science.

According to McKinsey, Medtech companies are at the forefront of healthcare innovation, with estimates that they could capture between $14 billion and $55 billion annually in productivity gains. Through GenAI adoption, an additional $50 billion plus in revenue is estimated from product and service innovations. A 2024 McKinsey survey revealed that around two-thirds of Medtech executives have already implemented Gen AI, with approximately 20% scaling their solutions up and reporting substantial productivity benefits.

While advanced technology implementation is growing across the medical industry, challenges persist. Organisations face hurdles like data integration issues, decentralised strategies, and skill gaps. Together, these highlight the need for a more streamlined approach to Gen AI deployment.

Of all the Medtech domains, R&D is leading the way in Gen AI adoption. Being the most comfortable with new technologies, R&D departments use Gen AI tools to streamline work processes, such as summarising research papers or scientific articles, highlighting a grassroots adoption trend. Individual researchers are using AI to enhance productivity even when no formal company-wide strategies are in place. While AI tools automate and accelerate R&D tasks, human review is still required to ensure final submissions are correct and satisfactory. Gen AI is proving to reduce the time teams spend on administrative tasks and to improve research accuracy and depth, with some companies experiencing 20% to 30% gains in research productivity.

KPIs for success in healthcare product programmes

Measuring business performance is essential in the healthcare sector. The number one goal is, of course, to deliver high-quality care while maintaining efficient operations. By measuring and analysing KPIs, healthcare providers are better positioned to improve patient outcomes through data-based decisions. KPIs can also improve resource allocation and encourage continuous improvement in all areas of care.

Healthcare product programmes are structured initiatives that prioritise the development, delivery, and continual optimisation of medical products. To succeed, they require cross-functional coordination of clinical, technical, regulatory, and business teams. Time to market is critical, ensuring a product moves from concept to launch as quickly as possible. Particular emphasis needs to be placed on labelling and documentation: McKinsey notes that AI-assisted labelling has resulted in a 20%-30% improvement in operational efficiency. Resource utilisation rates are also important, showing how efficiently time, budget, and headcount are used during product development.

In the healthcare sector, KPIs ought to cover several factors, including operational efficiency, patient outcomes, the financial health of the business, and patient satisfaction. To achieve a comprehensive view of performance, these can be categorised into financial, operational, clinical quality, and patient experience.

Bridging user experience with technical precision – design awards

Innovation is no longer judged solely by technical performance, with user experience (UX) being equally important. Some of the latest innovations in healthcare are recognised at the UX Design Awards, which honour products that exemplify the best in user experience as well as technical precision. Top products prioritise the needs and experiences of both patients and healthcare professionals while ensuring each product meets the sector's rigorous clinical and regulatory standards.

One example is the CIARTIC Move by Siemens Healthineers, a self-driving 3D C-arm imaging system that lets surgeons operate while controlling the device wirelessly in a sterile field. Computer hardware company ASUS has also received accolades for its HealthConnect App and VivoWatch Series, showcasing the fusion of AIoT-driven smart healthcare solutions with user-friendly interfaces – sometimes in what are essentially consumer devices. This demonstrates how technical innovation is being made accessible and increasingly intuitive as patients gain technical fluency.

Navigating regulatory and product development pathways simultaneously

Establishing clinical and regulatory paths is important, as it enables healthcare teams to feed a twin stream of findings back into development. Gen AI adoption has become a transformative approach, automating the production and refinement of complex documents, mixed data sets, and structured and unstructured data. By integrating regulatory considerations early and adopting technologies like Gen AI as part of agile practices, healthcare product programmes help teams navigate a regulatory landscape that can often shift. Baking a regulatory mindset into a team early helps ensure both compliance and continued innovation.

(Image source: “IBM Achieves New Deep Learning Breakthrough” by IBM Research is licensed under CC BY-ND 2.0.)

Want to learn more about AI and big data from industry leaders? Check out AI & Big Data Expo, taking place in Amsterdam, California, and London. The comprehensive event is co-located with other leading events including Intelligent Automation Conference, BlockX, Digital Transformation Week, and Cyber Security & Cloud Expo. Explore other upcoming enterprise technology events and webinars powered by TechForge here.
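The four-way KPI categorisation described above (financial, operational, clinical quality, patient experience) can be pictured as a simple grouping of metrics by category. The individual metric names below are illustrative assumptions, not a published standard:

```python
# Illustrative sketch: grouping healthcare KPIs into the four categories
# named in the article. The metric names themselves are assumptions.
from collections import defaultdict

KPI_CATEGORIES = {
    "time_to_market": "operational",
    "resource_utilisation_rate": "operational",
    "readmission_rate": "clinical quality",
    "net_margin": "financial",
    "patient_satisfaction_score": "patient experience",
}

def by_category(kpis: dict[str, str]) -> dict[str, list[str]]:
    """Invert a KPI -> category mapping into category -> list of KPIs."""
    grouped: defaultdict[str, list[str]] = defaultdict(list)
    for kpi, category in kpis.items():
        grouped[category].append(kpi)
    return dict(grouped)

print(sorted(by_category(KPI_CATEGORIES)["operational"]))
```

A dashboard built this way gives the comprehensive view the article calls for: each category is reviewed together rather than metric by metric.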
  • Christian Marclay explores a universe of thresholds in his latest single-channel montage of film clips

Doors
Christian Marclay
Institute of Contemporary Art Boston
Through September 1, 2025
Brooklyn Museum
Through April 12, 2026

On the screen, a movie clip plays of a character entering through a door to leave out another. It cuts to another clip of someone else doing the same thing over and over, all sourced from a panoply of Western cinema. The audience, sitting for an unknown amount of time, watches this shape-shifting protagonist from different cultural periods come and go, as the film endlessly loops.

So goes Christian Marclay’s latest single-channel film, Doors, currently exhibited for the first time in the United States at the Institute of Contemporary Art Boston. Assembled over ten years, the film is a dizzying feat, a carefully crafted montage of film clips revolving around the simple premise of someone entering through a door and then leaving out another. In the exhibition, Marclay writes, “Doors are fascinating objects, rich with symbolism.” Here, he shows hundreds of them, examining through film how the simple act of moving through a threshold, multiplied endlessly, creates a profoundly new reading of what that threshold signifies.
On paper, this may sound like an extremely jarring experience. But Marclay—a visual artist, composer, and DJ whose previous works such as The Clock involved similar mega-montages of disparate film clips—has a sensitive touch. The sequences feel incredibly smooth, the montage carefully constructed to mimic continuity as closely as possible. This is even more impressive when one imagines the constraints that a door’s movement imposes; it must open and close in a certain direction, with particular types of hinges or means of swinging. It makes the seamlessness of the film all the more fascinating to dissect. When a tiny wooden doorframe cuts to a large double steel door, my brain had no issue at all registering a sense of continued motion through the frame—a form of cinematic magic.
    Christian Marclay, Doors, 2022. Single-channel video projection.
Watching the clips, I could discern no overarching meta-narrative—simply movement through doors. Nevertheless, Marclay is a master of controlling tone. Though the relentlessness of the loops creates an overall feeling of tension that the film is clearly playing on, moments of levity often interrupt, giving visitors a chance to breathe. The pacing, too, swings from a person rushing in and out to a slow stroll between doors in a corridor. It leaves one musing on just how ubiquitous this simple action is, and how mutable these simple acts of pulling a door and stepping inside can be. Sometimes mundane, sometimes thrilling, sometimes in anticipation, sometimes in search—Doors invites us to reflect on our own interaction with these objects, and with the very act of stepping through a doorframe.

    Much of the experience rests on the soundscape and music, which is equally—if not more heavily—important in creating the transition across clips. Marclay’s previous work leaned heavily on his interest in aural media; this added dimension only enriches Doors and elevates it beyond a formal visual study of clips that match each other. The film bleeds music from one scene to another, sometimes prematurely, to make believable the movement of one character across multiple movies. This overlap of sounds is essentially an echo of the space we left behind and are entering into. We as the audience almost believe—even if just for a second—that the transition is real.
The effect is powerful and calls to mind several references. No doubt Doors owes some degree of inspiration to the lineage of surrealist art, perhaps the work of Magritte or Duchamp. For those steeped in architecture, one may think of Bernard Tschumi’s Manhattan Transcripts, where his transcriptions of events, spaces, and movements similarly both shatter and call attention to simple spatial sequences. One may also be reminded of the work of the Situationist International, particularly the psychogeography of Guy Debord. I confess that my first thought was the equally famous door-chase scene in Monsters, Inc. But regardless of what corollaries one may conjure, Doors has a wholly unique feel. It is simple yet singular in constructing its webbed world.
Installation view, Christian Marclay: Doors, the Institute of Contemporary Art/Boston, 2025.

But what exactly are we to take away from this world? In an interview with Artforum, Marclay declares, “I’m building in people’s minds an architecture in which to get lost.” The film evokes a certain act of labyrinthine mapping—or perhaps a mode of perpetual resetting. I began to imagine this almost as a non-Euclidean enfilade of sorts, where each room invites you to quickly grasp a new environment and then just as quickly anticipate what may be in the next. With the understanding that you can’t backtrack, and the unpredictability of the next door taking you anywhere, the film holds you in total suspense. The production of new spaces and new architecture is activated all at once the moment someone steps into a new doorway.

    All of this is without even mentioning the chosen films themselves. There is a degree to which the pop-culture element of Marclay’s work makes certain moments click—I can’t help but laugh as I watch Adam Sandler in Punch Drunk Love exit a door and emerge as Bette Davis in All About Eve. But to a degree, I also see the references being secondary, and certainly unneeded to understand the visceral experience Marclay crafts. It helps that, aside from a couple of jarring character movements or one-off spoken jokes, the movement is repetitive and universal.
    Doors runs on a continuous loop. I sat watching for just under an hour before convincing myself that I would never find any appropriate or correct time to leave. Instead, I could sit endlessly and reflect on each character movement, each new reveal of a room. Is the door the most important architectural element in creating space? Marclay makes a strong case for it with this piece.
    Harish Krishnamoorthy is an architectural and urban designer based in Cambridge, Massachusetts, and Bangalore, India. He is an editor at PAIRS.
  • Rethinking AI: DeepSeek’s playbook shakes up the high-spend, high-compute paradigm

    When DeepSeek released its R1 model this January, it wasn’t just another AI announcement. It was a watershed moment that sent shockwaves through the tech industry, forcing industry leaders to reconsider their fundamental approaches to AI development.
    What makes DeepSeek’s accomplishment remarkable isn’t that the company developed novel capabilities; rather, it was how it achieved comparable results to those delivered by tech heavyweights at a fraction of the cost. In reality, DeepSeek didn’t do anything that hadn’t been done before; its innovation stemmed from pursuing different priorities. As a result, we are now experiencing rapid-fire development along two parallel tracks: efficiency and compute. 
    As DeepSeek prepares to release its R2 model, and as it concurrently faces the potential of even greater chip restrictions from the U.S., it’s important to look at how it captured so much attention.
    Engineering around constraints
    DeepSeek’s arrival, as sudden and dramatic as it was, captivated us all because it showcased the capacity for innovation to thrive even under significant constraints. Faced with U.S. export controls limiting access to cutting-edge AI chips, DeepSeek was forced to find alternative pathways to AI advancement.
    While U.S. companies pursued performance gains through more powerful hardware, bigger models and better data, DeepSeek focused on optimizing what was available. It implemented known ideas with remarkable execution — and there is novelty in executing what’s known and doing it well.
    This efficiency-first mindset yielded incredibly impressive results. DeepSeek’s R1 model reportedly matches OpenAI’s capabilities at just 5 to 10% of the operating cost. According to reports, the final training run for DeepSeek’s V3 predecessor cost a mere $5.6 million — which was described by former Tesla AI scientist Andrej Karpathy as “a joke of a budget” compared to the tens or hundreds of millions spent by U.S. competitors. More strikingly, while OpenAI reportedly spent $500 million training its recent “Orion” model, DeepSeek achieved superior benchmark results for just $5.6 million — less than 1.2% of OpenAI’s investment.
    If you get starry-eyed believing these incredible results were achieved even as DeepSeek was at a severe disadvantage based on its inability to access advanced AI chips, I hate to tell you, but that narrative isn’t entirely accurate. Initial U.S. export controls focused primarily on compute capabilities, not on memory and networking — two crucial components for AI development.
    That means that the chips DeepSeek had access to were not poor quality chips; their networking and memory capabilities allowed DeepSeek to parallelize operations across many units, a key strategy for running their large model efficiently.
    This, combined with China’s national push toward controlling the entire vertical stack of AI infrastructure, resulted in accelerated innovation that many Western observers didn’t anticipate. DeepSeek’s advancements were an inevitable part of AI development, but they brought known advancements forward a few years earlier than would have been possible otherwise, and that’s pretty amazing.
    Pragmatism over process
    Beyond hardware optimization, DeepSeek’s approach to training data represents another departure from conventional Western practices. Rather than relying solely on web-scraped content, DeepSeek reportedly leveraged significant amounts of synthetic data and outputs from other proprietary models. This is a classic example of model distillation, or the ability to learn from really powerful models. Such an approach, however, raises questions about data privacy and governance that might concern Western enterprise customers. Still, it underscores DeepSeek’s overall pragmatic focus on results over process.
    The effective use of synthetic data is a key differentiator. Synthetic data can be very effective when it comes to training large models, but you have to be careful; some model architectures handle synthetic data better than others. For instance, transformer-based models with mixture-of-experts (MoE) architectures like DeepSeek’s tend to be more robust when incorporating synthetic data, while more traditional dense architectures like those used in early Llama models can experience performance degradation or even “model collapse” when trained on too much synthetic content.
    This architectural sensitivity matters because synthetic data introduces different patterns and distributions compared to real-world data. When a model architecture doesn’t handle synthetic data well, it may learn shortcuts or biases present in the synthetic data generation process rather than generalizable knowledge. This can lead to reduced performance on real-world tasks, increased hallucinations or brittleness when facing novel situations. 
    Still, DeepSeek’s engineering teams reportedly designed their model architecture specifically with synthetic data integration in mind from the earliest planning stages. This allowed the company to leverage the cost benefits of synthetic data without sacrificing performance.
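    One common mitigation for the collapse risk described above is to cap the synthetic share of the training mix. The sketch below illustrates that idea only; the 30% cap and the helper name are hypothetical assumptions, not published DeepSeek values.

```python
import random

def build_training_mix(real_samples, synthetic_samples,
                       max_synthetic_frac=0.3, seed=0):
    """Cap the synthetic share of a training mix.

    The 30% cap is an illustrative assumption, not a documented figure.
    Solving n_syn / (n_real + n_syn) <= f gives n_syn <= f * n_real / (1 - f).
    """
    rng = random.Random(seed)
    n_real = len(real_samples)
    n_syn = min(len(synthetic_samples),
                int(max_synthetic_frac * n_real / (1 - max_synthetic_frac)))
    # Keep all real data, subsample the synthetic pool, and shuffle the result.
    mix = list(real_samples) + rng.sample(list(synthetic_samples), n_syn)
    rng.shuffle(mix)
    return mix

# 70 real samples plus a pool of 100 synthetic ones -> at most 30 synthetic.
mix = build_training_mix(["r"] * 70, ["s"] * 100)
print(len(mix), mix.count("s"))  # 100 30
```

    In practice the ratio would be tuned per architecture; the point is only that the synthetic fraction is a deliberate, bounded design parameter rather than an accident of whatever data is on hand.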
    Market reverberations
    Why does all of this matter? Stock market aside, DeepSeek’s emergence has triggered substantive strategic shifts among industry leaders.
    Case in point: OpenAI. Sam Altman recently announced plans to release the company’s first “open-weight” language model since 2019. This is a pretty notable pivot for a company that built its business on proprietary systems. It seems DeepSeek’s rise, on top of Llama’s success, has hit OpenAI’s leader hard. Just a month after DeepSeek arrived on the scene, Altman admitted that OpenAI had been “on the wrong side of history” regarding open-source AI. 
    With OpenAI reportedly spending $7 billion to $8 billion annually on operations, the economic pressure from efficient alternatives like DeepSeek has become impossible to ignore. As AI scholar Kai-Fu Lee bluntly put it: “You’re spending $7 billion or $8 billion a year, making a massive loss, and here you have a competitor coming in with an open-source model that’s for free.” This necessitates change.
    This economic reality prompted OpenAI to pursue a massive $40 billion funding round that valued the company at an unprecedented $300 billion. But even with a war chest of funds at its disposal, the fundamental challenge remains: OpenAI’s approach is dramatically more resource-intensive than DeepSeek’s.
    Beyond model training
    Another significant trend accelerated by DeepSeek is the shift toward “test-time compute”. As major AI labs have now trained their models on much of the available public data on the internet, data scarcity is slowing further improvements in pre-training.
    To get around this, DeepSeek announced a collaboration with Tsinghua University to enable “self-principled critique tuning” (SPCT). This approach trains AI to develop its own rules for judging content and then uses those rules to provide detailed critiques. The system includes a built-in “judge” that evaluates the AI’s answers in real time, comparing responses against core rules and quality standards.
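    The control flow just described has two steps: derive judging principles for the question at hand, then critique each candidate answer against them. Here is a toy sketch of that loop, with stub functions standing in for model calls; the real system is a trained reward model, not a wrapper like this.

```python
from typing import Callable, List

def spct_style_judge(question: str,
                     answers: List[str],
                     propose_principles: Callable[[str], List[str]],
                     critique: Callable[[str, str, str], int]) -> str:
    # Step 1: the "judge" derives its own rules for this question.
    principles = propose_principles(question)
    # Step 2: every answer is critiqued against every rule; scores are summed.
    scores = [sum(critique(p, question, a) for p in principles)
              for a in answers]
    # Step 3: the highest-scoring answer is selected at inference time.
    return answers[scores.index(max(scores))]

# Toy stand-ins for what would really be model calls.
def toy_principles(question: str) -> List[str]:
    return ["mentions the question's key term", "is under ten words"]

def toy_critique(principle: str, question: str, answer: str) -> int:
    key_term = question.split()[0].strip(":?").lower()
    if "key term" in principle:
        return int(key_term in answer.lower())
    return int(len(answer.split()) < 10)

best = spct_style_judge(
    "efficiency: how did DeepSeek cut costs?",
    ["It optimized for efficiency under hardware constraints.",
     "It simply bought many more of the most expensive accelerators available."],
    toy_principles,
    toy_critique,
)
print(best)  # It optimized for efficiency under hardware constraints.
```

    The key property, and the source of both the promise and the risk discussed below, is that the scoring rules are generated by the model itself per query rather than fixed by a human in advance.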
    The development is part of a movement towards autonomous self-evaluation and improvement in AI systems in which models use inference time to improve results, rather than simply making models larger during training. DeepSeek calls its system “DeepSeek-GRM”. But, as with its model distillation approach, this could be considered a mix of promise and risk.
    For example, if the AI develops its own judging criteria, there’s a risk those principles diverge from human values, ethics or context. The rules could end up being overly rigid or biased, optimizing for style over substance, and/or reinforce incorrect assumptions or hallucinations. Additionally, without a human in the loop, issues could arise if the “judge” is flawed or misaligned. It’s a kind of AI talking to itself, without robust external grounding. On top of this, users and developers may not understand why the AI reached a certain conclusion — which feeds into a bigger concern: Should an AI be allowed to decide what is “good” or “correct” based solely on its own logic? These risks shouldn’t be discounted.
    At the same time, this approach is gaining traction as, once again, DeepSeek builds on the body of work of others to create what is likely the first full-stack application of SPCT in a commercial effort.
    This could mark a powerful shift in AI autonomy, but there still is a need for rigorous auditing, transparency and safeguards. It’s not just about models getting smarter, but that they remain aligned, interpretable, and trustworthy as they begin critiquing themselves without human guardrails.
    Moving into the future
    So, taking all of this into account, the rise of DeepSeek signals a broader shift in the AI industry toward parallel innovation tracks. While companies continue building more powerful compute clusters for next-generation capabilities, there will also be intense focus on finding efficiency gains through software engineering and model architecture improvements to offset the challenges of AI energy consumption, which far outpaces power generation capacity. 
    Companies are taking note. Microsoft, for example, has halted data center development in multiple regions globally, recalibrating toward a more distributed, efficient infrastructure approach. While still planning to invest approximately $80 billion in AI infrastructure this fiscal year, the company is reallocating resources in response to the efficiency gains DeepSeek introduced to the market.
    Meta has also responded.
    With so much movement in such a short time, it becomes somewhat ironic that the U.S. sanctions designed to maintain American AI dominance may have instead accelerated the very innovation they sought to contain. By constraining access to materials, DeepSeek was forced to blaze a new trail.
    Moving forward, as the industry continues to evolve globally, adaptability for all players will be key. Policies, people and market reactions will continue to shift the ground rules — whether it’s eliminating the AI diffusion rule, a new ban on technology purchases or something else entirely. It’s what we learn from one another and how we respond that will be worth watching.
    Jae Lee is CEO and co-founder of TwelveLabs.

    VENTUREBEAT.COM
    Rethinking AI: DeepSeek’s playbook shakes up the high-spend, high-compute paradigm
    When DeepSeek released its R1 model this January, it wasn’t just another AI announcement. It was a watershed moment that sent shockwaves through the tech industry, forcing industry leaders to reconsider their fundamental approaches to AI development. What makes DeepSeek’s accomplishment remarkable isn’t that the company developed novel capabilities; rather, it was how it achieved results comparable to those delivered by tech heavyweights at a fraction of the cost. In reality, DeepSeek didn’t do anything that hadn’t been done before; its innovation stemmed from pursuing different priorities. As a result, we are now experiencing rapid-fire development along two parallel tracks: efficiency and compute.
    As DeepSeek prepares to release its R2 model, and as it concurrently faces the potential of even greater chip restrictions from the U.S., it’s important to look at how it captured so much attention.
    Engineering around constraints
    DeepSeek’s arrival, as sudden and dramatic as it was, captivated us all because it showcased the capacity for innovation to thrive even under significant constraints. Faced with U.S. export controls limiting access to cutting-edge AI chips, DeepSeek was forced to find alternative pathways to AI advancement. While U.S. companies pursued performance gains through more powerful hardware, bigger models and better data, DeepSeek focused on optimizing what was available. It implemented known ideas with remarkable execution — and there is novelty in executing what’s known and doing it well.
    This efficiency-first mindset yielded incredibly impressive results. DeepSeek’s R1 model reportedly matches OpenAI’s capabilities at just 5 to 10% of the operating cost.
According to reports, the final training run for DeepSeek’s V3 predecessor cost a mere $6 million — which was described by former Tesla AI scientist Andrej Karpathy as “a joke of a budget” compared to the tens or hundreds of millions spent by U.S. competitors. More strikingly, while OpenAI reportedly spent $500 million training its recent “Orion” model, DeepSeek achieved superior benchmark results for just $5.6 million — less than 1.2% of OpenAI’s investment.
If you get starry-eyed believing these incredible results were achieved even as DeepSeek was at a severe disadvantage based on its inability to access advanced AI chips, I hate to tell you, but that narrative isn’t entirely accurate (even though it makes a good story). Initial U.S. export controls focused primarily on compute capabilities, not on memory and networking — two crucial components for AI development. That means the chips DeepSeek had access to were not poor-quality chips; their networking and memory capabilities allowed DeepSeek to parallelize operations across many units, a key strategy for running its large model efficiently. This, combined with China’s national push toward controlling the entire vertical stack of AI infrastructure, resulted in accelerated innovation that many Western observers didn’t anticipate. DeepSeek’s advancements were an inevitable part of AI development, but they brought known advancements forward a few years earlier than would have been possible otherwise, and that’s pretty amazing.
Pragmatism over process
Beyond hardware optimization, DeepSeek’s approach to training data represents another departure from conventional Western practices. Rather than relying solely on web-scraped content, DeepSeek reportedly leveraged significant amounts of synthetic data and outputs from other proprietary models. This is a classic example of model distillation, or the ability to learn from really powerful models.
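In its textbook form, model distillation trains a smaller “student” model to match the softened output distribution of a stronger “teacher.” The sketch below illustrates that general recipe in plain Python; it is an illustration of the technique, not DeepSeek’s actual training code, and the logits are made-up numbers.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    z = [l / temperature for l in logits]
    m = max(z)  # subtract the max for numerical stability
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Cross-entropy between the teacher's softened distribution and the
    student's, scaled by T^2 as in the classic distillation recipe."""
    p_teacher = softmax(teacher_logits, temperature)
    log_p_student = [math.log(p) for p in softmax(student_logits, temperature)]
    return -temperature ** 2 * sum(p * lp for p, lp in zip(p_teacher, log_p_student))

teacher = [4.0, 1.0, -2.0]
# A student whose logits track the teacher's is penalized far less
# than one whose logits contradict them.
close = distillation_loss([3.8, 1.1, -1.9], teacher)
far = distillation_loss([-2.0, 1.0, 4.0], teacher)
assert close < far
```

Raising the temperature softens the teacher’s distribution, exposing the relative probabilities it assigns to wrong answers — the “dark knowledge” that makes distillation more informative than training on hard labels alone.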
Such an approach, however, raises questions about data privacy and governance that might concern Western enterprise customers. Still, it underscores DeepSeek’s overall pragmatic focus on results over process.
The effective use of synthetic data is a key differentiator. Synthetic data can be very effective when it comes to training large models, but you have to be careful; some model architectures handle synthetic data better than others. For instance, transformer-based models with mixture-of-experts (MoE) architectures like DeepSeek’s tend to be more robust when incorporating synthetic data, while more traditional dense architectures like those used in early Llama models can experience performance degradation or even “model collapse” when trained on too much synthetic content.
This architectural sensitivity matters because synthetic data introduces different patterns and distributions compared to real-world data. When a model architecture doesn’t handle synthetic data well, it may learn shortcuts or biases present in the synthetic data generation process rather than generalizable knowledge. This can lead to reduced performance on real-world tasks, increased hallucinations or brittleness when facing novel situations. Still, DeepSeek’s engineering teams reportedly designed their model architecture specifically with synthetic data integration in mind from the earliest planning stages. This allowed the company to leverage the cost benefits of synthetic data without sacrificing performance.
Market reverberations
Why does all of this matter? Stock market aside, DeepSeek’s emergence has triggered substantive strategic shifts among industry leaders. Case in point: OpenAI. Sam Altman recently announced plans to release the company’s first “open-weight” language model since 2019. This is a notable pivot for a company that built its business on proprietary systems. It seems DeepSeek’s rise, on top of Llama’s success, has hit OpenAI’s leader hard.
Just a month after DeepSeek arrived on the scene, Altman admitted that OpenAI had been “on the wrong side of history” regarding open-source AI. With OpenAI reportedly spending $7 billion to $8 billion annually on operations, the economic pressure from efficient alternatives like DeepSeek has become impossible to ignore. As AI scholar Kai-Fu Lee bluntly put it: “You’re spending $7 billion or $8 billion a year, making a massive loss, and here you have a competitor coming in with an open-source model that’s for free.” This necessitates change.
This economic reality prompted OpenAI to pursue a massive $40 billion funding round that valued the company at an unprecedented $300 billion. But even with a war chest of funds at its disposal, the fundamental challenge remains: OpenAI’s approach is dramatically more resource-intensive than DeepSeek’s.
Beyond model training
Another significant trend accelerated by DeepSeek is the shift toward “test-time compute” (TTC). As major AI labs have now trained their models on much of the available public data on the internet, data scarcity is slowing further improvements in pre-training. To get around this, DeepSeek announced a collaboration with Tsinghua University to enable “self-principled critique tuning” (SPCT). This approach trains AI to develop its own rules for judging content and then uses those rules to provide detailed critiques. The system includes a built-in “judge” that evaluates the AI’s answers in real time, comparing responses against core rules and quality standards.
The development is part of a movement toward autonomous self-evaluation and improvement in AI systems, in which models use inference time to improve results rather than simply making models larger during training. DeepSeek calls its system “DeepSeek-GRM” (generalist reward modeling). But, as with its model distillation approach, this could be considered a mix of promise and risk.
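Conceptually, this kind of self-evaluation is a generate → judge → revise loop run at inference time. The sketch below is a hypothetical illustration of that control flow only; in a real system the `generate`, `judge` and `revise` roles would each be model calls, and nothing here reflects DeepSeek-GRM’s actual implementation.

```python
def self_critique_loop(prompt, generate, judge, revise, max_rounds=3, threshold=0.9):
    """Draft an answer, score it with a built-in judge against the model's
    own rules, and revise until the judge is satisfied or the budget runs out."""
    answer = generate(prompt)
    for _ in range(max_rounds):
        score, critique = judge(answer)
        if score >= threshold:
            break  # the judge accepts the answer
        answer = revise(answer, critique)
    return answer

# Toy stand-ins so the loop runs end to end; real roles would be LLM calls.
generate = lambda prompt: "test-time compute trades inference cycles for quality"
judge = lambda a: (1.0, "") if a.endswith(".") else (0.5, "end with a period")
revise = lambda a, critique: a + "."

result = self_critique_loop("Explain test-time compute.", generate, judge, revise)
assert result.endswith(".")
```

The loop spends extra compute at inference rather than at training time; the risks discussed next all live inside the `judge` role, since a flawed or misaligned judge steers every revision.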
For example, if the AI develops its own judging criteria, there’s a risk those principles diverge from human values, ethics or context. The rules could end up being overly rigid or biased, optimizing for style over substance, or reinforcing incorrect assumptions or hallucinations. Additionally, without a human in the loop, issues could arise if the “judge” is flawed or misaligned. It’s a kind of AI talking to itself, without robust external grounding. On top of this, users and developers may not understand why the AI reached a certain conclusion — which feeds into a bigger concern: Should an AI be allowed to decide what is “good” or “correct” based solely on its own logic? These risks shouldn’t be discounted.
At the same time, this approach is gaining traction, as DeepSeek again builds on the body of work of others (think OpenAI’s “critique and revise” methods, Anthropic’s constitutional AI or research on self-rewarding agents) to create what is likely the first full-stack application of SPCT in a commercial effort. This could mark a powerful shift in AI autonomy, but there is still a need for rigorous auditing, transparency and safeguards. It’s not just about models getting smarter, but that they remain aligned, interpretable and trustworthy as they begin critiquing themselves without human guardrails.
Moving into the future
Taking all of this into account, the rise of DeepSeek signals a broader shift in the AI industry toward parallel innovation tracks. While companies continue building more powerful compute clusters for next-generation capabilities, there will also be intense focus on finding efficiency gains through software engineering and model architecture improvements to offset the challenges of AI energy consumption, which far outpaces power-generation capacity. Companies are taking note.
Microsoft, for example, has halted data center development in multiple regions globally, recalibrating toward a more distributed, efficient infrastructure approach. While still planning to invest approximately $80 billion in AI infrastructure this fiscal year, the company is reallocating resources in response to the efficiency gains DeepSeek introduced to the market. Meta has also responded.
With so much movement in such a short time, it becomes somewhat ironic that the U.S. sanctions designed to maintain American AI dominance may have instead accelerated the very innovation they sought to contain. By constraining access to materials, DeepSeek was forced to blaze a new trail.
Moving forward, as the industry continues to evolve globally, adaptability for all players will be key. Policies, people and market reactions will continue to shift the ground rules — whether it’s eliminating the AI diffusion rule, a new ban on technology purchases or something else entirely. It’s what we learn from one another and how we respond that will be worth watching.
Jae Lee is CEO and co-founder of TwelveLabs.
  • How to take photos on your phone via remote control


    Our smartphones have transformed the way we take photos and videos and our relationship to these digital memories. Most of us will snap at least some pictures and clips every day with the gadget that’s always close at hand.
    If you want to get more creative with photos on your phone, you can. Sometimes you’re going to want to take a picture remotely, without your phone in your hand and your finger over the shutter button—maybe you’re taking a wide shot of a large group, or you want to capture a lot of your surroundings.
    Not only is this possible, there are multiple ways to go about it, no matter which flavor of phone you own. You can pick the one that you find the easiest to use, or switch between them as you need.
    Use your smartwatch
    If you’ve got an Apple Watch, it comes with a Camera app. Image: Raagesh C/Unsplash
    If you’ve got a smartwatch to match your smartphone, you can use it to take photos remotely, as long as you’re within about 33 feet (10 meters) of the phone. Get your handset in position first, then load up the relevant app on your watch—though you can then go back and readjust the phone if needed.
    With the Apple Watch and an iPhone, the app you want on your wrist is the Camera Remote app, which comes preinstalled. A viewfinder screen from your iPhone will appear: Use the digital crown to zoom, and the shutter button (in the middle) to take a shot. By default, a three-second timer is used, but you can change this by tapping the button with the three dots (lower right).
    For those of you with an Android phone and a Wear OS smartwatch, you can use Google’s default Camera app, which you should find preinstalled on your watch. Launch it from your wrist, and the Camera app should open on your connected phone: You can zoom using the slider on the right, and take a photo (with a three-second delay) by tapping the shutter button with a 3 on it. To change this delay, tap the three lines at the top.
    Use your voice
    Settings for Voice Control on iOS. Screenshot: Apple
    No matter what phone you have, it’ll come with support for voice commands—and one of those commands will let you take photos. This will only work where your phone is close enough to hear you, and where you’re happy to talk to it, but it can be useful in certain situations for remote controlling the camera app.
    On the iPhone, Siri can open the Camera app but won’t actually take a photo. To enable voice controlled capture, open Settings and choose Accessibility > Voice Control, then turn the feature on. The same page has a Commands menu where you can set up your custom voice command for taking photos, which will work from the viewfinder screen.
    On Android, it’s even easier: Just say “hey Google, take a photo”—you can even add a number of seconds for a timer countdown. Gemini is now the default assistant for this task: To make sure it responds to voice commands, open the app, tap your profile picture (top right), then choose Settings > “Hey Google” & Voice Match.
    Use the timer
    Configuring the timer on a Pixel phone. Screenshot: Google
    This is a really straightforward one, and you don’t need any extra apps or devices to get it set up. Your phone’s camera app comes with a timer control, so you can position the shot, set the timer, and then get in the frame. There’s a bit of guesswork involved, especially if you’re using your phone’s rear camera (as you won’t be able to see yourself), but it’s a simple option.
    On the iPhone, you can tap the arrow near the top of the Camera app screen to reveal extra camera options at the bottom. Scroll through the icons until you reach the one that looks like a stopwatch. Tap this, and you can choose between a 3-second, 5-second, and 10-second delay when you press the shutter button.
    On Pixel phones, tap the gear icon (lower left in portrait mode) to find the timer control: As on the iPhone, the delay options are 3 seconds, 5 seconds, or 10 seconds. If you’re using the Camera app on a Galaxy phone, tap the four dots (to the right in portrait mode), then the timer icon (which looks like a stopwatch), and you get the same delay options.
    Use another method
    The latest Pixel phones have a Connected Cameras feature too. Screenshot: Google
    You’ve got yet more options for this if you need them. One is to use a simple Bluetooth clicker as a remote control: There are a whole host to choose from, such as this CamKix model that will cost you a mere $5.49. They work across iOS and Android and are easy to connect to your camera app.
    If you have two Pixel 9 phones, you can also use a special feature called Connected Cameras. You can find it from Settings by tapping Connected devices > Connection preferences > Connected Cameras: You get a brief explanation of what the feature does, and you can turn it on via the Use Connected Cameras toggle switch.
    This is a niche use case, as it only works with two handsets from the Pixel 9 series (at least for now). But if those are the phones you and your family have, you can use one to take photos through the camera of the other; head to the official guide from Google for more details on how it works.
    WWW.POPSCI.COM
    How to take photos on your phone via remote control