• Death Stranding 2 Could Be Causing Some PS5s To Overheat

    There are a lot of things for players to find in Death Stranding 2: On the Beach, but a few fans have already discovered an issue that appears to be overheating some PlayStation 5 consoles. Players on Reddit (via Push Square) have shared multiple accounts of Death Stranding 2 apparently causing PlayStation 5s to overheat when the map menu is called up on-screen. Fans have reported that the map causes the PS5 fan to loudly go into overdrive before they receive an overheating warning from the console itself. The working theory among players is that the map menu runs at an unlocked frame rate, which may be causing the overheating. So far, no one has reported the same issue on the PS5 Pro, but at least one user shared a YouTube video that demonstrates the map menu's effect on their PS5 multiple times. Presumably, Kojima Productions can address the issue in an update once it becomes aware of the problem. Continue Reading at GameSpot.
    WWW.GAMESPOT.COM
  • The 15 Best Games Like Hollow Knight To Get Lost In

The hardest part about finding games like Hollow Knight is knowing where to start. The overwhelming success of Team Cherry's award-winning 2017 game--and anticipation of its long-awaited sequel, Silksong--prompted a flood of similar games all looking to capture the magic of combining soulslike combat with deep exploration. Some are more inventive than others, building on Hollow Knight's foundations to push that style of game forward in new or unexpected ways. Others take a specific aspect, such as grueling boss fights, and run with it. We've combed through the lot and picked out 15 of the best games like Hollow Knight to get you started in this impressively varied sub-genre. If you're not as excited about combat and want puzzles and exploration instead, head over to our list of the best metroidvania games.

Nine Sols
Platforms: PC, Xbox Series X|S, PlayStation 5 | Release Date: May 29, 2024 | Developer: Red Candle Games
If Hollow Knight is the Dark Souls of metroidvanias, then Nine Sols is the genre's Sekiro: Shadows Die Twice. Parrying is at the core of almost everything you do in Nine Sols, from dealing with standard enemies to wearing down some of its relentless bosses. Among games like Hollow Knight, it's also one of the most thematically and visually distinct. Developer Red Candle Games calls Nine Sols a "Taopunk," a blend of sci-fi punk with traditional Taoist architecture and symbolism. Most protagonists in games like these are blank slates, but Nine Sols adds a personal touch by making the personality of its hero, Yi, an important part of the story. Yi starts out seeking revenge and ends up on a journey to save the world and himself, becoming a reluctant hero in the process. See at Humble.

Animal Well
Platforms: PC, Nintendo Switch, Xbox Series X|S, PlayStation 5 | Release Date: May 9, 2024 | Developer: Billy Basso, Shared Memory LLC
Animal Well is a puzzle, or more accurately, a lot of puzzles.
There's a bit of combat and some platforming, but mostly, it's about trying to unravel dozens of mysteries big and small as you delve ever further into a maze that wouldn't be out of place in Lewis Carroll's Wonderland stories. Explaining too much about what's going on would spoil what makes Animal Well special, but the most interesting and even subversive part of it is that you have almost nothing to guide you and can make discoveries in any order. That freedom creates a sense of discovery and wonder that's often absent from the procedural methods inherent in these kinds of games. Read our Animal Well review. See at Steam.

Ultros
Platforms: PC, PlayStation 5 | Release Date: February 13, 2024 | Developer: Hadoque
Of all the games like Hollow Knight, Ultros takes the most organic approach to metroidvanias, and we mean that literally. You, an intergalactic explorer, arrive on a psychedelic space colony called The Sarcophagus and find it teeming with exotic life and mysterious spiritual energies. You use the life force and remains of enemies to nourish your mind and unlock new abilities, and there's a scoring system that ranks how efficiently you defeat your foes. That determines the quality of the loot they drop, so if you want to unlock and improve your skills, you have to plan each encounter carefully. Ultros is also absolutely beautiful, a dream-like blend of esoteric architecture and wild ecosystems with closer ties to the Sarcophagus' secrets than Ultros initially suggests. Read our Ultros review. See at Fanatical.

Blasphemous 2
Platforms: PlayStation 5, PC, Xbox Series X|S, Nintendo Switch | Release Date: August 24, 2023 | Developer: The Game Kitchen
Blasphemous 2's big addition over its predecessor--apart from even more ghoulish and gory moments--is the inclusion of more platforming. The first Blasphemous is a bit one-note, which is great if you're just here for the combat, but not so much if you want, well, anything else.
Blasphemous 2 throws in some challenging and smartly designed platforming as well, bringing it closer to the likes of Hollow Knight. Better still, developer The Game Kitchen was more ambitious with its environment design, with more complex layouts, better backgrounds and lighting, and even colors that aren't brown, grey, and blood. Read our Blasphemous 2 review. See at Fanatical.

Elden Ring
Platforms: PlayStation 5, PlayStation 4, Xbox One, Xbox Series X|S, PC | Release Date: February 25, 2022 | Developer: FromSoftware
Okay, so Elden Ring doesn't have the exploration style of Hollow Knight, but it does have the kind of grueling combat that inspired Team Cherry's spectacular boss fights, and lots of it. It's FromSoftware's first open-world game, one that follows a lone, nameless warrior in their bid to bring salvation to a shattered land--or make its ruin everlasting. Mostly, though, it's a giant playground for dozens of exceptionally well-designed and challenging bosses to stomp around in, with enemies ranging from fire-spewing land dragons to the spirits of an ancient civilization and a gigantic, greatsword-wielding prince on his favorite little horsey. Read our Elden Ring review. See at Fanatical.

Cuphead
Platforms: PlayStation 4, Xbox One, PC | Release Date: September 29, 2017 | Developer: Studio MDHR
If you really like boss fights and are less bothered about exploration and all the other Hollow Knight-y bits, Cuphead is definitely worth checking out. Don't let Studio MDHR's retro cartoon style give you the wrong impression, either. Cuphead's cutesy bosses demand careful planning, precise execution, and a lot of patience. The battles aren't the only thing Cuphead has going for it, though. Studio MDHR's exquisite animation, the soundtrack, even period-specific cartoon-style sound effects--the entire game is a spectacle in the best way. Read our Cuphead review.
See at Fanatical.

Castlevania Advance Collection
Platforms: Xbox Series X|S, Nintendo Switch, PlayStation 5, PC | Release Date: September 23, 2021 | Developer: Konami
Any of the Castlevania bundles are strong picks, but the Castlevania Advance Collection should be your go-to choice for the kind of classic action that Hollow Knight builds on. The collection includes Circle of the Moon--low on our list of the best Castlevania games only on account of it not really doing anything that Symphony of the Night didn't--the excellent Harmony of Dissonance, and the even better Aria of Sorrow. Sorrow is the standout inclusion, one that radically shook up the Castlevania formula by removing the Belmonts from the equation, telling an entirely new story set in the distant future, and giving the protagonist, Soma, an ability that absorbs enemy souls for use in combat. It helps that Sorrow, as well as Harmony, has an excellent selection of bosses and some fantastically moody settings, too. See at Humble.

Ori and the Blind Forest
Platforms: PlayStation 4, Xbox One, Nintendo Switch, PC | Release Date: March 11, 2015 | Developer: Moon Studios
Ori and the Blind Forest starts like the end of a Disney movie. A cute little creature finds a family in the middle of a dream-like forest, and that family gets taken away from them. Your job is to figure out why and find a way to save the woods. Ori takes Hollow Knight's demanding platforming even further with some segments that wouldn't feel out of place in something like Celeste, but the real stand-out feature is the map. In addition to being a well-designed metroidvania world, it's absolutely gorgeous and a delight to explore. Blind Forest is a modern classic, and its sequel, Ori and the Will of the Wisps, manages to improve on it even further. Read our Ori and the Blind Forest review and Ori and the Will of the Wisps review.
See at Fanatical.

Ender Magnolia: Bloom in the Mist
Platforms: PlayStation 4, PlayStation 5, Xbox One, Xbox Series X|S, PC, Nintendo Switch | Release Date: March 25, 2024 | Developer: Adglobe
In the onslaught of games like Hollow Knight that released following Team Cherry's success, developer Adglobe decided to shake up the formula, first with Ender Lilies and then with the more refined Ender Magnolia. You've got your standard elements, such as gigantic bosses that force you to learn their patterns and a puzzle-like map that unfolds as you gain more powers. Those powers, however, are the spirits of fallen friends who also aid you in combat. You find several, but can only recruit a handful at a time, which adds a layer of strategy to exploration and combat. There's also a strong sense of emotional attachment, since you and your ghostly allies have a history and connection of a kind that's often missing in these games when you just play as an outside observer. See at Steam.

Metroid Dread
Platforms: Nintendo Switch | Release Date: October 8, 2021 | Developer: MercurySteam, Nintendo EPD
Any Metroid game is going to have something of that Hollow Knight feel, since the sci-fi series is a big part of where the genre and Hollow Knight in particular came from. However, the easiest to get your hands on without having to pay for a subscription is Metroid Dread. It's the culmination of the 2D Metroid saga that started in 1986, but you can get by just fine if this is your first. Dread follows bounty hunter Samus Aran as she searches for evidence of a deadly parasite, only to find herself hunted by rogue robots--sometimes. The stealth segments that have you hiding from your metal hunters only take place in specific areas. The rest of Metroid Dread takes you across sprawling subterranean research labs, sunken testing stations, and extravagant dwellings, featuring the series' biggest map ever, stuffed with secrets and formidable bosses.
If you really enjoy those bosses, Dread has a boss rush mode you can test yourself against as well. Read our Metroid Dread review.

Lone Fungus
Platforms: PC | Release Date: September 21, 2021 | Developer: Basti Games
These types of games tend to lean more toward the edgy, dark, and broody side of things, which makes Lone Fungus a gem in the genre. You're the last mushroom on Earth, exploring a vast network of tunnels and temples in search of treasure and using magic skills that change form depending on how you swing your sword. A ball of energy is damaging to one enemy, for example, but you can smack it and shatter it into several projectiles to clear out lots of foes at once. Best of all, though, Lone Fungus has a robust Assist Mode that lets people of all skill levels enjoy the game and just makes it more relaxed in general, with features such as extra platforms, no costs for spells, slower platforms, and invincibility so your little fungus won't die. See at Humble.

Dead Cells
Platforms: PlayStation 4, Xbox One, Nintendo Switch, PC | Release Date: May 10, 2017 | Developer: Motion Twin
Dead Cells throws roguelite randomness into the metroidvania mix and ratchets up the challenge as well. You're a spirit determined to figure out why you died, and in the absence of a tangible vessel for your ethereal self, you pilot shambling corpses in a bid to make it through streets, swamps, dungeons, and horrors untold. These corpses aren't the sturdiest, so when you fail, they fall apart and you start again. Eventually, you can unlock permanent upgrades, but with no checkpoints at any stage of the journey, you'll have to rely on your skill with weapons and knowledge of enemy behavior to make it through. If you enjoy Dead Cells, you can pick up DLC packs that add new locations, weapons, and enemies, and there's even a Castlevania-themed expansion as well. Read our Dead Cells review.
See at Fanatical.

Monster Sanctuary
Platforms: Nintendo Switch, Xbox One, PC | Release Date: August 28, 2019 | Developer: Moi Rai Games
If you like a bit of Pokemon with your Hollow Knight, Monster Sanctuary might be for you. You play as a fledgling monster tamer out to explore the vast world with just a single critter by your side. You'll find and tame more, training them into the best versions of themselves and using their abilities not just to deal with threats in the sanctuary, but to explore its secrets and hidden areas as well. Monster Sanctuary is lighter and breezier than some games on this list, but if something more intense is to your liking, there's a robust PvP element where you can challenge other players and their monster teams too. See at Fanatical.

Salt and Sanctuary
Platforms: PC, PlayStation 4, Xbox One, Nintendo Switch | Release Date: March 15, 2016 | Developer: Ska Studios
Upgrades and customization are usually rather limited in games like Hollow Knight, which makes Salt and Sanctuary, with its more in-depth RPG components, something special. You play as a sailor, washed up on some evil-looking, godforsaken island and foolish enough to go exploring the mysterious labyrinth underneath. What you find is a parade of nightmares and some spectacular, bone-crunchingly hard 2D boss fights that are among the best Soulslike challenges out there. Read our Salt and Sanctuary review. See at Humble.

Bō: Path of the Teal Lotus
Platforms: PC, Xbox Series X|S, PlayStation 5, Nintendo Switch | Release Date: July 17, 2024 | Developer: Squid Shock Studios
Movement is often a means to an end in video games--a double jump that propels you higher, for example, or a dash that lets you avoid dangerous terrain. In Bō: Path of the Teal Lotus, it's no exaggeration to say movement is everything. Navigating the hand-drawn world--bouncing, flying, gliding, and moving with magical speed--is just as important as battling the myths and monsters inspired by Japanese folklore.
Path of the Teal Lotus is one of the most elegant platformers around, and it even has a reset system where you can pause or rewind a failed jump to try again, perfect for learning some of the more difficult segments. Read our Path of the Teal Lotus review. See at Fanatical.
    #best #games #like #hollow #knight
    The 15 Best Games Like Hollow Knight To Get Lost In
    The hardest part about finding games like Hollow Knight is knowing where to start. The overwhelming success of Team Cherry's award-winning 2017 game--and anticipation of its long-awaited sequel, Silksong--prompted a flood of similar games all looking to capture the magic of combining soulslike combat with deep exploration. Some are more inventive than others, building on Hollow Knight's foundations to push that style of game forward in new or unexpected ways. Others take a specific aspect, such as grueling boss fights, and run with it. We've combed through the lot and picked out 15 of the best games like Hollow Knight to get you started in this impressively varied sub-genre.If you're not as excited about combat and want puzzles and exploration instead, head over to our list of the best metroidvania games. Nine Sols Platforms: PC, Xbox Series X|S, PlayStation 5Release Date: May 29, 2024Developer: Red Candle GamesIf Hollow Knight is the Dark Souls of metroidvanias, then Nine Sols is the genre's Sekiro: Shadows Die Twice. Parrying is at the core of almost everything you do in Nine Sols, from dealing with standard enemies to wearing down some of its relentless bosses. Among games like Hollow Knight, it's also one of the most thematically and visually distinct. Developer Red Candle Games call Nine Sols a "Taopunk," a blend of sci-fi punk with traditional Taoist architecture and symbolism. Most protagonists in games like these are blank slates, but Nine Sols adds a personal touch by making the personality of its hero, Yi, an important part of the story. Yi starts out seeking revenge, and ends up on a journey to save the world and himself, becoming a reluctant hero in the process. See at Humble Animal Well Platforms: PC, Nintendo Switch, Xbox Series X|S, PlayStation 5Release Date: May 9, 2024Developer: Billy Basso, Shared Memory LLCAnimal Well is a puzzle, or more accurately, a lot of puzzles. 
There's a bit of combat and some platforming, but mostly, it's about trying to unravel dozens of mysteries big and small as you delve ever further into a maze that wouldn't be out of place in Lewis Carrol's Wonderland stories. Explaining too much about what's going on would spoil what makes Animal Well special, but the most interesting and even subversive parts of it is that you have almost nothing to guide you and can make discoveries in any order. That freedom creates a sense of discovery and wonder that's often absent from the procedural methods inherent in these kinds of games.Read our Animal Well review. See at Steam Ultros Platforms: PC, PlayStation 5Release Date: February 13, 2024Developer: HadoqueOf all the games like Hollow Knight, Ultros takes the most organic approach to metroidvanias, and we mean that literally. You, an intergalactic explorer, arrive on a psychedelic space colony called The Sarcophagus and find it teeming with exotic life and mysterious spiritual energies. You use the life force and remains of enemies to nourish your mind and unlock new abilities, and there's a scoring system that ranks how efficiently you defeat your foes. That determines the quality of the loot they drop, so if you want to unlock and improve your skills, you have to plan each encounter carefully. Ultros is also absolutely beautiful, a dream-like blend of esoteric architecture and wild ecosystems with closer ties to the Sarcophagus' secrets than Ultros initially suggests. Read our Ultros review. See at Fanatical Blasphemous 2 Platforms: PlayStation 5, PC, Xbox Series X|S, Nintendo SwitchRelease Date: August 24, 2023Developer: The Game KitchenBlasphemous 2's big addition over its predecessor--apart from even more ghoulish and gory moments--is the inclusion of more platforming. The first Blasphemous is a bit one-note, which is great if you're just here for the combat, but not so much if you want, well, anything else. 
Blasphemous 2 throws in some challenging and smartly-designed platforming as well, bringing it closer to the likes of Hollow Knight. Better still, developer The Game Kitchen was more ambitious with its environment design as well, with more complex layouts, better backgrounds and lighting, and even colors that aren't brown, grey, and blood.Read our Blasphemous 2 review. See at Fanatical Elden Ring Platforms: PlayStation 5, PlayStation 4, Xbox One, Xbox Series X|S, PCRelease Date: February 25, 2022Developer: FromSoftwareOkay, so Elden Ring doesn't have the exploration style of Hollow Knight, but it does have the kind of grueling combat that inspired Team Cherry's spectacular boss fights, and lots of it. It's FromSoftware's first open-world game, one that follows a lone, nameless warrior in their bid to bring salvation to a shattered land--or make its ruin everlasting. Mostly, though, it's a giant playground for dozens of exceptionally well-designed and challenging bosses to stomparound in, with enemies ranging from fire-spewing land dragons to the spirits of an ancient civilization and a gigantic, greatsword-wielding prince on his favorite little horsey. Read our Elden Ring review. See at Fanatical Cuphead Platforms: PlayStation 4, Xbox One, PCRelease Date: September 29, 2017Developer: Studio MDHRIf you really like boss fights and are less bothered about exploration and all the other Hollow Knight-y bits, Cuphead is definitely worth checking out. Don't let Studio MDHR's retro cartoon style give you the wrong impression, either. Cuphead's cutesy bosses demand careful planning, precise execution, and a lot of patience. The battles aren't the only thing Cuphead has going for it, though. Studio MDHR's exquisite animation, the soundtrack, even period-specific cartoon-style sound effects--the entire game is a spectacle in the best way. Read our Cuphead review. 
See at Fanatical Castlevania Advance Collection Platforms: Xbox Series X|S, Nintendo Switch, PlayStation 5, PCRelease Date: September 23, 2021Developer: KonamiAny of the Castlevania bundles are strong picks, but the Castlevania Advance Collection should be your go-to choice for the kind of classic action that Hollow Knight builds on. The collection includes Circle of the Moon--low on our list of the best Castlevania games only on account of it not really doing anything that Symphony of the Night didn't--the excellent Harmony of Dissonance, and the even better Aria of Sorrow. Sorrow is the standout inclusion, one that radically shook up the Castlevania formula by removing the Belmonts from the equation, telling an entirely new story set in the distantfuture, and giving the protagonist, Soma, an ability that absorbed enemy souls for use in combat. It helps that Sorrow, as well as Harmony, have an excellent selection of bosses and some fantastically moody settings, too. See at Humble Ori and the Blind Forest Platforms: PlayStation 4, Xbox One, Nintendo Switch, PCRelease Date: March 11, 2015Developer: Moon StudiosOri and the Blind Forest starts like the end of a Disney movie. A cute little creature finds a family in the middle of a dream-like forest, and that family gets taken away from them. Your job is to figure out why and find a way to save the woods. Ori takes Hollow Knight's demanding platforming even further with some segments that wouldn't feel out of place in something like Celeste, but the real stand-out feature is the map. In addition to being a well-designed metroidvania world, it's absolutely gorgeous and a delight to explore. Blind Forest is a modern classic, and its sequel, Ori and the Will of the Wisps manages to improve it even further.Read our Ori and the Blind Forest review and Ori and the Will of the Wisps review. 
See at Fanatical Ender Magnolia: Bloom in the Mist Platforms: PlayStation 4, PlayStation 5, Xbox One, Xbox Series X|S, PC, Nintendo SwitchRelease Date: March 25, 2024Developer: AdiglobeIn the onslaught of games like Hollow Knight that released following Team Cherry's success, developer Adiglobe decided to shake up the formula first with Ender Lilies and then with the more refined Ender Magnolia. You've got your standard elements, such as gigantic bosses that force you to learn their patterns and a puzzle-like map that unfolds as you gain more powers. Those powers, however, are the spirits of fallen friends who also aid you in combat. You find several, but can only recruit a handful at a time, which adds a layer of strategy to exploration and combat. There's also a strong sense of emotional attachment, since you and your ghostly allies have history and connection of a kind that's often missing in these games when you just play as an outside observer. See at Steam Metroid Dread Platforms: Nintendo SwitchRelease Date: October 8, 2021Developer: Nintendo EADAny Metroid game is going to have something of that Hollow Knight feel, since the sci-fi series is a big part of where the genre and Hollow Knight in particular came from. However, the easiest to get your hands on without having to pay for a subscription is Metroid Dread. It's the culmination of the 2D Metroid saga that started in 1986, but you can get by just fine if this is your first. Dread follows bounty hunter Samus Aran as she searches for evidence of a deadly parasite, only to find herself hunted by rogue robots--sometimes. The stealth segments that have you hiding from your metal hunters only take place in specific areas. The rest of Metroid Dread takes you across sprawling subterranean research labs, sunken testing stations, and extravagant dwellings, featuring the series' biggest map ever, stuffed with secrets and formidable bosses. 
If you really enjoy those bosses, Dread has a boss rush mode you can test yourself against as well.Read our Metroid Dread review. See Lone Fungus Platforms: PCRelease Date: September 21, 2021Developer: Basti GamesThese types of games tend to lean more toward the edgy, dark, and broody side of things, which makes Lone Fungus a gem in the genre. You're the last mushroom on Earth, exploring a vast network of tunnels and temples in search of treasure and using magic skills that change form depending on how you swing your sword. A ball of energy is damaging to one enemy, for example, but you can smack it and shatter it into several projectiles to clear out lots of foes at once. Best of all, though, Lone Fungus has a robust Assist Mode that lets people of all skill levels enjoy the game and just makes it more relaxed in general, with features such as extra platforms, no costs for spells, slower platforms, and invincibility so your little fungus won't die. See at Humble Dead Cells Platforms: PlayStation 4, Xbox One, Nintendo Switch, PCRelease Date: May 10, 2027Developer: Motion TwinDead Cells throws roguelite randomness into the metroidvania mix and ratchets up the challenge as well. You're a spirit determined to figure out why you died, and in the absence of a tangible vessel for your ethereal self, you pilot shambling corpses in a bid to make it through streets, swamps, dungeons, and horrors untold. These corpses aren't the sturdiest, so when you fail, they fall apart and you start again. Eventually, you can unlock permanent upgrades, but with no checkpoints at any stage of the journey, you'll have to rely on your skill with weapons and knowledge of enemy behavior to make it through. If you enjoy Dead Cells, you can pick up DLC packs that add new locations, weapons, and enemies, and there's even a Castlevania-themed expansion as well. Read our Dead Cells review. 
See at Fanatical Monster Sanctuary Platforms: Nintendo Switch, Xbox Series One, PCRelease Date: August 28, 2019Developer: DeveloperIf you like a bit of Pokemon with your Hollow Knight, Monster Sanctuary might be for you. You play as a fledgling monster tamer out to explore the vast world with just a single critter by your side. You'll find and tame more, training them into the best versions of themselves and using their abilities not just to deal with threats in the sanctuary, but to explore its secrets and hidden areas as well. Monster Sanctuary is lighter and breezier than some games on this list, but if something more intense is to your liking, there's a robust PvP element where you can challenge other players and their monster teams too. See at Fanatical Salt and Sanctuary Platforms: PC, PlayStation 4, Xbox One, Nintendo SwitchRelease Date: March 15, 2016Developer: Ska StudiosUpgrades and customization are usually rather limited in games like Hollow Knight, which makes Salt and Sanctuary, with its more in-depth RPG components, something special. You play as a sailor, washed up on some evil-looking, godforsaken island and foolish enough to go exploring the mysterious labyrinth underneath. What you find is a parade of nightmares and some spectacular, bone-crunchingly hard 2D boss fights that are among the best Soulslike challenges out there. Read our Salt and Sanctuary review. See at Humble Bō: Path of the Teal Lotus Platforms: PC, Xbox Series X|S, PlayStation 5, Nintendo SwitchRelease Date: July 17, 2024Developer: Squid Shock StudiosMovement is often a means to an end in video games--a double jump that propels you higher, for example, or a dash that lets you avoid dangerous terrain. In Bō: Path of the Teal Lotus, it's no exaggeration to say movement is everything. Bouncing, flying, gliding, moving with magical speed, and navigating the hand-drawn world is just as important as battling the myths and monsters inspired by Japanese folklore. 
Path of the Teal Lotus is one of the most elegant platformers around, and it even has a reset system where you can pause or rewind a failed jump to try again, perfect for learning some of the more difficult segments. Read our Path of the Teal Lotus review. See at Fanatical #best #games #like #hollow #knight
    WWW.GAMESPOT.COM
    The 15 Best Games Like Hollow Knight To Get Lost In
    The hardest part about finding games like Hollow Knight is knowing where to start. The overwhelming success of Team Cherry's award-winning 2017 game--and anticipation of its long-awaited sequel, Silksong--prompted a flood of similar games all looking to capture the magic of combining soulslike combat with deep exploration. Some are more inventive than others, building on Hollow Knight's foundations to push that style of game forward in new or unexpected ways. Others take a specific aspect, such as grueling boss fights, and run with it. We've combed through the lot and picked out 15 of the best games like Hollow Knight to get you started in this impressively varied sub-genre. If you're not as excited about combat and want puzzles and exploration instead, head over to our list of the best metroidvania games.

    Nine Sols
    Platforms: PC, Xbox Series X|S, PlayStation 5
    Release Date: May 29, 2024
    Developer: Red Candle Games
    If Hollow Knight is the Dark Souls of metroidvanias, then Nine Sols is the genre's Sekiro: Shadows Die Twice. Parrying is at the core of almost everything you do in Nine Sols, from dealing with standard enemies to wearing down some of its relentless bosses. Among games like Hollow Knight, it's also one of the most thematically and visually distinct. Developer Red Candle Games calls Nine Sols "Taopunk," a blend of sci-fi punk with traditional Taoist architecture and symbolism. Most protagonists in games like these are blank slates, but Nine Sols adds a personal touch by making the personality of its hero, Yi, an important part of the story. Yi starts out seeking revenge and ends up on a journey to save the world and himself, becoming a reluctant hero in the process.
    See at Humble

    Animal Well
    Platforms: PC, Nintendo Switch, Xbox Series X|S, PlayStation 5
    Release Date: May 9, 2024
    Developer: Billy Basso, Shared Memory LLC
    Animal Well is a puzzle, or more accurately, a lot of puzzles. There's a bit of combat and some platforming, but mostly, it's about trying to unravel dozens of mysteries big and small as you delve ever further into a maze that wouldn't be out of place in Lewis Carroll's Wonderland stories. Explaining too much about what's going on would spoil what makes Animal Well special, but the most interesting and even subversive part of it is that you have almost nothing to guide you and can make discoveries in any order. That freedom creates a sense of discovery and wonder that's often absent from the procedural methods inherent in these kinds of games.
    Read our Animal Well review.
    See at Steam

    Ultros
    Platforms: PC, PlayStation 5
    Release Date: February 13, 2024
    Developer: Hadoque
    Of all the games like Hollow Knight, Ultros takes the most organic approach to metroidvanias, and we mean that literally. You, an intergalactic explorer, arrive on a psychedelic space colony called The Sarcophagus and find it teeming with exotic life and mysterious spiritual energies. You use the life force and remains of enemies to nourish your mind and unlock new abilities, and there's a scoring system that ranks how efficiently you defeat your foes. That determines the quality of the loot they drop, so if you want to unlock and improve your skills, you have to plan each encounter carefully. Ultros is also absolutely beautiful, a dream-like blend of esoteric architecture and wild ecosystems with closer ties to the Sarcophagus' secrets than it initially suggests.
    Read our Ultros review.
    See at Fanatical

    Blasphemous 2
    Platforms: PlayStation 5, PC, Xbox Series X|S, Nintendo Switch
    Release Date: August 24, 2023
    Developer: The Game Kitchen
    Blasphemous 2's big addition over its predecessor--apart from even more ghoulish and gory moments--is the inclusion of more platforming. The first Blasphemous is a bit one-note, which is great if you're just here for the combat, but not so much if you want, well, anything else. Blasphemous 2 throws in some challenging and smartly designed platforming as well, bringing it closer to the likes of Hollow Knight. Better still, developer The Game Kitchen was more ambitious with its environment design, with more complex layouts, better backgrounds and lighting, and even colors that aren't brown, grey, and blood.
    Read our Blasphemous 2 review.
    See at Fanatical

    Elden Ring
    Platforms: PlayStation 5, PlayStation 4, Xbox One, Xbox Series X|S, PC
    Release Date: February 25, 2022
    Developer: FromSoftware
    Okay, so Elden Ring doesn't have the exploration style of Hollow Knight, but it does have the kind of grueling combat that inspired Team Cherry's spectacular boss fights, and lots of it. It's FromSoftware's first open-world game, one that follows a lone, nameless warrior in their bid to bring salvation to a shattered land--or make its ruin everlasting. Mostly, though, it's a giant playground for dozens of exceptionally well-designed and challenging bosses to stomp (you) around in, with enemies ranging from fire-spewing land dragons to the spirits of an ancient civilization and a gigantic, greatsword-wielding prince on his favorite little horsey.
    Read our Elden Ring review.
    See at Fanatical

    Cuphead
    Platforms: PlayStation 4, Xbox One, PC
    Release Date: September 29, 2017
    Developer: Studio MDHR
    If you really like boss fights and are less bothered about exploration and all the other Hollow Knight-y bits, Cuphead is definitely worth checking out. Don't let Studio MDHR's retro cartoon style give you the wrong impression, either. Cuphead's cutesy bosses demand careful planning, precise execution, and a lot of patience. The battles aren't the only thing Cuphead has going for it, though. Studio MDHR's exquisite animation, the soundtrack, even the period-specific cartoon-style sound effects--the entire game is a spectacle in the best way.
    Read our Cuphead review.
    See at Fanatical

    Castlevania Advance Collection
    Platforms: Xbox Series X|S, Nintendo Switch, PlayStation 5, PC
    Release Date: September 23, 2021
    Developer: Konami
    Any of the Castlevania bundles are strong picks, but the Castlevania Advance Collection should be your go-to choice for the kind of classic action that Hollow Knight builds on. The collection includes Circle of the Moon--low on our list of the best Castlevania games only on account of it not really doing anything that Symphony of the Night didn't--the excellent Harmony of Dissonance, and the even better Aria of Sorrow. Sorrow is the standout inclusion, one that radically shook up the Castlevania formula by removing the Belmonts from the equation, telling an entirely new story set in the distant (at the time) future, and giving the protagonist, Soma, an ability that absorbs enemy souls for use in combat. It helps that Sorrow, as well as Harmony, has an excellent selection of bosses and some fantastically moody settings, too.
    See at Humble

    Ori and the Blind Forest
    Platforms: PlayStation 4, Xbox One, Nintendo Switch, PC
    Release Date: March 11, 2015
    Developer: Moon Studios
    Ori and the Blind Forest starts like the end of a Disney movie. A cute little creature finds a family in the middle of a dream-like forest, and that family gets taken away from them. Your job is to figure out why and find a way to save the woods. Ori takes Hollow Knight's demanding platforming even further with some segments that wouldn't feel out of place in something like Celeste, but the real stand-out feature is the map. In addition to being a well-designed metroidvania world, it's absolutely gorgeous and a delight to explore. Blind Forest is a modern classic, and its sequel, Ori and the Will of the Wisps, manages to improve on it even further.
    Read our Ori and the Blind Forest review and Ori and the Will of the Wisps review.
    See at Fanatical

    Ender Magnolia: Bloom in the Mist
    Platforms: PlayStation 4, PlayStation 5, Xbox One, Xbox Series X|S, PC, Nintendo Switch
    Release Date: March 25, 2024
    Developer: Adglobe
    In the onslaught of games like Hollow Knight that released following Team Cherry's success, developer Adglobe decided to shake up the formula, first with Ender Lilies and then with the more refined Ender Magnolia. You've got your standard elements, such as gigantic bosses that force you to learn their patterns and a puzzle-like map that unfolds as you gain more powers. Those powers, however, are the spirits of fallen friends who also aid you in combat. You find several, but can only recruit a handful at a time, which adds a layer of strategy to exploration and combat. There's also a strong sense of emotional attachment, since you and your ghostly allies have a history and connection of a kind that's often missing in these games when you just play as an outside observer.
    See at Steam

    Metroid Dread
    Platforms: Nintendo Switch
    Release Date: October 8, 2021
    Developer: MercurySteam
    Any Metroid game is going to have something of that Hollow Knight feel, since the sci-fi series is a big part of where the genre and Hollow Knight in particular came from. However, the easiest to get your hands on without having to pay for a subscription is Metroid Dread. It's the culmination of the 2D Metroid saga that started in 1986, but you can get by just fine if this is your first. Dread follows bounty hunter Samus Aran as she searches for evidence of a deadly parasite, only to find herself hunted by rogue robots--sometimes. The stealth segments that have you hiding from your metal hunters only take place in specific areas. The rest of Metroid Dread takes you across sprawling subterranean research labs, sunken testing stations, and extravagant dwellings, featuring the series' biggest map ever, stuffed with secrets and formidable bosses. If you really enjoy those bosses, Dread has a boss rush mode you can test yourself against as well.
    Read our Metroid Dread review.
    See at Amazon

    Lone Fungus
    Platforms: PC
    Release Date: September 21, 2021
    Developer: Basti Games
    These types of games tend to lean more toward the edgy, dark, and broody side of things, which makes Lone Fungus a gem in the genre. You're the last mushroom on Earth, exploring a vast network of tunnels and temples in search of treasure and using magic skills that change form depending on how you swing your sword. A ball of energy is damaging to one enemy, for example, but you can smack it and shatter it into several projectiles to clear out lots of foes at once. Best of all, though, Lone Fungus has a robust Assist Mode that lets people of all skill levels enjoy the game and just makes it more relaxed in general, with features such as extra platforms, no costs for spells, slower platforming, and invincibility so your little fungus won't die.
    See at Humble

    Dead Cells
    Platforms: PlayStation 4, Xbox One, Nintendo Switch, PC
    Release Date: May 10, 2017
    Developer: Motion Twin
    Dead Cells throws roguelite randomness into the metroidvania mix and ratchets up the challenge as well. You're a spirit determined to figure out why you died, and in the absence of a tangible vessel for your ethereal self, you pilot shambling corpses in a bid to make it through streets, swamps, dungeons, and horrors untold. These corpses aren't the sturdiest, so when you fail, they fall apart and you start again. Eventually, you can unlock permanent upgrades, but with no checkpoints at any stage of the journey, you'll have to rely on your skill with weapons and knowledge of enemy behavior to make it through. If you enjoy Dead Cells, you can pick up DLC packs that add new locations, weapons, and enemies, and there's even a Castlevania-themed expansion as well.
    Read our Dead Cells review.
    See at Fanatical

    Monster Sanctuary
    Platforms: Nintendo Switch, Xbox One, PC
    Release Date: August 28, 2019
    Developer: Moi Rai Games
    If you like a bit of Pokemon with your Hollow Knight, Monster Sanctuary might be for you. You play as a fledgling monster tamer out to explore the vast world with just a single critter by your side. You'll find and tame more, training them into the best versions of themselves and using their abilities not just to deal with threats in the sanctuary, but to explore its secrets and hidden areas as well. Monster Sanctuary is lighter and breezier than some games on this list, but if something more intense is to your liking, there's a robust PvP element where you can challenge other players and their monster teams too.
    See at Fanatical

    Salt and Sanctuary
    Platforms: PC, PlayStation 4, Xbox One, Nintendo Switch
    Release Date: March 15, 2016
    Developer: Ska Studios
    Upgrades and customization are usually rather limited in games like Hollow Knight, which makes Salt and Sanctuary, with its more in-depth RPG components, something special. You play as a sailor, washed up on some evil-looking, godforsaken island and foolish enough to go exploring the mysterious labyrinth underneath. What you find is a parade of nightmares and some spectacular, bone-crunchingly hard 2D boss fights that are among the best Soulslike challenges out there.
    Read our Salt and Sanctuary review.
    See at Humble

    Bō: Path of the Teal Lotus
    Platforms: PC, Xbox Series X|S, PlayStation 5, Nintendo Switch
    Release Date: July 17, 2024
    Developer: Squid Shock Studios
    Movement is often a means to an end in video games--a double jump that propels you higher, for example, or a dash that lets you avoid dangerous terrain. In Bō: Path of the Teal Lotus, it's no exaggeration to say movement is everything. Bouncing, flying, gliding, and moving with magical speed through the hand-drawn world is just as important as battling the myths and monsters inspired by Japanese folklore. Path of the Teal Lotus is one of the most elegant platformers around, and it even has a reset system where you can pause or rewind a failed jump to try again, perfect for learning some of the more difficult segments.
    Read our Path of the Teal Lotus review.
    See at Fanatical
  • Fancy airplane seats have officially reached their peak! I mean, what’s next? A personal butler serving caviar at 30,000 feet? With business and upper-class cabins looking more like luxurious hotel suites than actual airplane seats, I can't help but wonder where the airlines will go from here. Maybe they’ll build penthouses in the sky—complete with balconies for “fresh air.” Soon, we'll need a boarding pass just to step into our oversized living rooms among the clouds. Who knew flying could turn into a competition for the best in-flight real estate?

    #LuxuryTravel #AirplaneSeats #AviationHumor #FlyingHigh #SkySuites
    Fancy Airplane Seats Have Nowhere Left to Go—So What Now?
    Upper- and business-class cabins have expanded to the point where the top tiers resemble hotel suites more than passenger seats. But what happens now that airlines have no more room to offer?
  • 10 Forgotten Disney Movies That Deserve More Love

    Disney has been one of the biggest names in animation since its earliest films, producing beloved classics like The Lion King, Pinocchio, and Snow White, among many others. However, with so many movies released over the studio's long history, it was inevitable that some titles would never reach classic status.
    #forgotten #disney #movies #that #deserve
    GAMERANT.COM
    10 Forgotten Disney Movies That Deserve More Love
  • European Robot Makers Adopt NVIDIA Isaac, Omniverse and Halos to Develop Safe, Physical AI-Driven Robot Fleets

    In the face of growing labor shortages and the need for greater sustainability, European manufacturers are racing to reinvent their processes to become software-defined and AI-driven.
    To achieve this, robot developers and industrial digitalization solution providers are working with NVIDIA to build safe, AI-driven robots and industrial technologies to drive modern, sustainable manufacturing.
    At NVIDIA GTC Paris at VivaTech, Europe’s leading robotics companies including Agile Robots, Extend Robotics, Humanoid, idealworks, Neura Robotics, SICK, Universal Robots, Vorwerk and Wandelbots are showcasing their latest AI-driven robots and automation breakthroughs, all accelerated by NVIDIA technologies. In addition, NVIDIA is releasing new models and tools to support the entire robotics ecosystem.
    NVIDIA Releases Tools for Accelerating Robot Development and Safety
    NVIDIA Isaac GR00T N1.5, an open foundation model for humanoid robot reasoning and skills, is now available for download on Hugging Face. This update enhances the model’s adaptability and ability to follow instructions, significantly improving its performance in material handling and manufacturing tasks. The NVIDIA Isaac Sim 5.0 and Isaac Lab 2.2 open-source robotics simulation and learning frameworks, optimized for NVIDIA RTX PRO 6000 workstations, are available on GitHub for developer preview.
    In addition, NVIDIA announced that NVIDIA Halos — a full-stack, comprehensive safety system that unifies hardware architecture, AI models, software, tools and services — now expands to robotics, promoting safety across the entire development lifecycle of AI-driven robots.
    The NVIDIA Halos AI Systems Inspection Lab has earned accreditation from the ANSI National Accreditation Board (ANAB) to perform inspections across functional safety for robotics, in addition to automotive vehicles.
    “NVIDIA’s latest evaluation with ANAB verifies the demonstration of competence and compliance with internationally recognized standards, helping ensure that developers of autonomous machines — from automotive to robotics — can meet the highest benchmarks for functional safety,” said R. Douglas Leonard Jr., executive director of ANAB.
    Arcbest, Advantech, Bluewhite, Boston Dynamics, FORT, Inxpect, KION, NexCobot — a NEXCOM company, and Synapticon are among the first robotics companies to join the Halos Inspection Lab, ensuring their products meet NVIDIA safety and cybersecurity requirements.
    To support robotics leaders in strengthening safety across the entire development lifecycle of AI-driven robots, Halos will now provide:

    Safety extension packages for the NVIDIA IGX platform, enabling manufacturers to easily program safety functions into their robots, supported by TÜV Rheinland’s inspection of NVIDIA IGX.
    A robotic safety platform, which includes IGX and NVIDIA Holoscan Sensor Bridge for a unified approach to designing sensor-to-compute architecture with built-in AI safety.
    An outside-in safety AI inspector — an AI-powered agent for monitoring robot operations, helping improve worker safety.

    Europe’s Robotics Ecosystem Builds on NVIDIA’s Three Computers
    Europe’s leading robotics developers and solution providers are integrating the NVIDIA Isaac robotics platform to train, simulate and deploy robots across different embodiments.
    Agile Robots is post-training the GR00T N1 model in Isaac Lab to train its dual-arm manipulator robots, which run on NVIDIA Jetson hardware, to execute a variety of tasks in industrial environments.
    Meanwhile, idealworks has adopted the Mega NVIDIA Omniverse Blueprint for robotic fleet simulation to extend the blueprint’s capabilities to humanoids. Building on the VDA 5050 framework, idealworks contributes to the development of guidance that supports tasks uniquely enabled by humanoid robots, such as picking, moving and placing objects.
    Neura Robotics is integrating NVIDIA Isaac to further enhance its robot development workflows. The company is using GR00T-Mimic to post-train the Isaac GR00T N1 robot foundation model for its service robot MiPA. Neura is also collaborating with SAP and NVIDIA to integrate SAP’s Joule agents with its robots, using the Mega NVIDIA Omniverse Blueprint to simulate and refine robot behavior in complex, realistic operational scenarios before deployment.
    Vorwerk is using NVIDIA technologies to power its AI-driven collaborative robots. The company is post-training GR00T N1 models in Isaac Lab with its custom synthetic data pipeline, which is built on Isaac GR00T-Mimic and powered by the NVIDIA Omniverse platform. The enhanced models are then deployed on NVIDIA Jetson AGX, Jetson Orin or Jetson Thor modules for advanced, real-time home robotics.
    Humanoid is using NVIDIA's full robotics stack, including Isaac Sim and Isaac Lab, to cut its prototyping time down by six weeks. The company is training its vision-language-action models on NVIDIA DGX B200 systems to boost the cognitive abilities of its robots, allowing them to operate autonomously in complex environments using Jetson Thor onboard computing.
    Universal Robots is introducing UR15, its fastest collaborative robot yet, to the European market. Using UR’s AI Accelerator — developed on NVIDIA Isaac’s CUDA-accelerated libraries and AI models, as well as NVIDIA Jetson AGX Orin — manufacturers can build AI applications to embed intelligence into the company’s new cobots.
    Wandelbots is showcasing its NOVA Operating System, now integrated with Omniverse, to simulate, validate and optimize robotic behaviors virtually before deploying them to physical robots. Wandelbots also announced a collaboration with EY and EDAG to offer manufacturers a scalable automation platform on Omniverse that speeds up the transition from proof of concept to full-scale deployment.
    Extend Robotics is using the Isaac GR00T platform to enable customers to control and train robots for industrial tasks like visual inspection and handling radioactive materials. The company’s Advanced Mechanics Assistance System lets users collect demonstration data and generate diverse synthetic datasets with NVIDIA GR00T-Mimic and GR00T-Gen to train the GR00T N1 foundation model.
    SICK is enhancing its autonomous perception solutions by integrating new certified sensor models — as well as 2D and 3D lidars, safety scanners and cameras — into NVIDIA Isaac Sim. This enables engineers to virtually design, test and validate machines using SICK’s sensing models within Omniverse, supporting processes spanning product development to large-scale robotic fleet management.
    Toyota Material Handling Europe is working with SoftServe to simulate its autonomous mobile robots working alongside human workers, using the Mega NVIDIA Omniverse Blueprint. Toyota Material Handling Europe is testing and simulating a multitude of traffic scenarios — allowing the company to refine its AI algorithms before real-world deployment.
    NVIDIA’s partner ecosystem is enabling European industries to tap into intelligent, AI-powered robotics. By harnessing advanced simulation, digital twins and generative AI, manufacturers are rapidly developing and deploying safe, adaptable robot fleets that address labor shortages, boost sustainability and drive operational efficiency.
    Watch the NVIDIA GTC Paris keynote from NVIDIA founder and CEO Jensen Huang at VivaTech, and explore GTC Paris sessions.
    See notice regarding software product information.
    #european #robot #makers #adopt #nvidia
    BLOGS.NVIDIA.COM
    European Robot Makers Adopt NVIDIA Isaac, Omniverse and Halos to Develop Safe, Physical AI-Driven Robot Fleets
    In the face of growing labor shortages and need for sustainability, European manufacturers are racing to reinvent their processes to become software-defined and AI-driven. To achieve this, robot developers and industrial digitalization solution providers are working with NVIDIA to build safe, AI-driven robots and industrial technologies to drive modern, sustainable manufacturing. At NVIDIA GTC Paris at VivaTech, Europe’s leading robotics companies including Agile Robots, Extend Robotics, Humanoid, idealworks, Neura Robotics, SICK, Universal Robots, Vorwerk and Wandelbots are showcasing their latest AI-driven robots and automation breakthroughs, all accelerated by NVIDIA technologies. In addition, NVIDIA is releasing new models and tools to support the entire robotics ecosystem. NVIDIA Releases Tools for Accelerating Robot Development and Safety NVIDIA Isaac GR00T N1.5, an open foundation model for humanoid robot reasoning and skills, is now available for download on Hugging Face. This update enhances the model’s adaptability and ability to follow instructions, significantly improving its performance in material handling and manufacturing tasks. The NVIDIA Isaac Sim 5.0 and Isaac Lab 2.2 open-source robotics simulation and learning frameworks, optimized for NVIDIA RTX PRO 6000 workstations, are available on GitHub for developer preview. In addition, NVIDIA announced that NVIDIA Halos — a full-stack, comprehensive safety system that unifies hardware architecture, AI models, software, tools and services — now expands to robotics, promoting safety across the entire development lifecycle of AI-driven robots. The NVIDIA Halos AI Systems Inspection Lab has earned accreditation from the ANSI National Accreditation Board (ANAB) to perform inspections across functional safety for robotics, in addition to automotive vehicles. 
    “NVIDIA’s latest evaluation with ANAB verifies the demonstration of competence and compliance with internationally recognized standards, helping ensure that developers of autonomous machines — from automotive to robotics — can meet the highest benchmarks for functional safety,” said R. Douglas Leonard Jr., executive director of ANAB.
    Arcbest, Advantech, Bluewhite, Boston Dynamics, FORT, Inxpect, KION, NexCobot — a NEXCOM company — and Synapticon are among the first robotics companies to join the Halos Inspection Lab, ensuring their products meet NVIDIA safety and cybersecurity requirements.
    To support robotics leaders in strengthening safety across the entire development lifecycle of AI-driven robots, Halos will now provide:

    Safety extension packages for the NVIDIA IGX platform, enabling manufacturers to easily program safety functions into their robots, supported by TÜV Rheinland’s inspection of NVIDIA IGX.
    A robotic safety platform, which includes IGX and NVIDIA Holoscan Sensor Bridge for a unified approach to designing sensor-to-compute architecture with built-in AI safety.
    An outside-in safety AI inspector — an AI-powered agent for monitoring robot operations, helping improve worker safety.

    Europe’s Robotics Ecosystem Builds on NVIDIA’s Three Computers
    Europe’s leading robotics developers and solution providers are integrating the NVIDIA Isaac robotics platform to train, simulate and deploy robots across different embodiments.
    Agile Robots is post-training the GR00T N1 model in Isaac Lab to train its dual-arm manipulator robots, which run on NVIDIA Jetson hardware, to execute a variety of tasks in industrial environments.
    Meanwhile, idealworks has adopted the Mega NVIDIA Omniverse Blueprint for robotic fleet simulation to extend the blueprint’s capabilities to humanoids. Building on the VDA 5050 framework, idealworks contributes to the development of guidance that supports tasks uniquely enabled by humanoid robots, such as picking, moving and placing objects.
    Neura Robotics is integrating NVIDIA Isaac to further enhance its robot development workflows. The company is using GR00T-Mimic to post-train the Isaac GR00T N1 robot foundation model for its service robot MiPA. Neura is also collaborating with SAP and NVIDIA to integrate SAP’s Joule agents with its robots, using the Mega NVIDIA Omniverse Blueprint to simulate and refine robot behavior in complex, realistic operational scenarios before deployment.
    Vorwerk is using NVIDIA technologies to power its AI-driven collaborative robots. The company is post-training GR00T N1 models in Isaac Lab with its custom synthetic data pipeline, which is built on Isaac GR00T-Mimic and powered by the NVIDIA Omniverse platform. The enhanced models are then deployed on NVIDIA Jetson AGX, Jetson Orin or Jetson Thor modules for advanced, real-time home robotics.
    Humanoid is using NVIDIA’s full robotics stack, including Isaac Sim and Isaac Lab, to cut its prototyping time by six weeks. The company is training its vision language action models on NVIDIA DGX B200 systems to boost the cognitive abilities of its robots, allowing them to operate autonomously in complex environments using Jetson Thor onboard computing.
    Universal Robots is introducing UR15, its fastest collaborative robot yet, to the European market. Using UR’s AI Accelerator — developed on NVIDIA Isaac’s CUDA-accelerated libraries and AI models, as well as NVIDIA Jetson AGX Orin — manufacturers can build AI applications to embed intelligence into the company’s new cobots.
    Wandelbots is showcasing its NOVA Operating System, now integrated with Omniverse, to simulate, validate and optimize robotic behaviors virtually before deploying them to physical robots. Wandelbots also announced a collaboration with EY and EDAG to offer manufacturers a scalable automation platform on Omniverse that speeds up the transition from proof of concept to full-scale deployment.
    Extend Robotics is using the Isaac GR00T platform to enable customers to control and train robots for industrial tasks like visual inspection and handling radioactive materials. The company’s Advanced Mechanics Assistance System lets users collect demonstration data and generate diverse synthetic datasets with NVIDIA GR00T-Mimic and GR00T-Gen to train the GR00T N1 foundation model.
    SICK is enhancing its autonomous perception solutions by integrating new certified sensor models — as well as 2D and 3D lidars, safety scanners and cameras — into NVIDIA Isaac Sim. This enables engineers to virtually design, test and validate machines using SICK’s sensing models within Omniverse, supporting processes spanning product development to large-scale robotic fleet management.
    Toyota Material Handling Europe is working with SoftServe to simulate its autonomous mobile robots working alongside human workers, using the Mega NVIDIA Omniverse Blueprint. The company is testing and simulating a multitude of traffic scenarios — allowing it to refine its AI algorithms before real-world deployment.
    NVIDIA’s partner ecosystem is enabling European industries to tap into intelligent, AI-powered robotics. By harnessing advanced simulation, digital twins and generative AI, manufacturers are rapidly developing and deploying safe, adaptable robot fleets that address labor shortages, boost sustainability and drive operational efficiency.
    Watch the NVIDIA GTC Paris keynote from NVIDIA founder and CEO Jensen Huang at VivaTech, and explore GTC Paris sessions. See notice regarding software product information.
  • NVIDIA Brings Physical AI to European Cities With New Blueprint for Smart City AI

    Urban populations are expected to double by 2050, which means around 2.5 billion people could be added to urban areas by the middle of the century, driving the need for more sustainable urban planning and public services. Cities across the globe are turning to digital twins and AI agents for urban planning scenario analysis and data-driven operational decisions.
    Building a digital twin of a city and testing smart city AI agents within it, however, is a complex and resource-intensive endeavor, fraught with technical and operational challenges.
    To address those challenges, NVIDIA today announced the NVIDIA Omniverse Blueprint for smart city AI, a reference framework that combines the NVIDIA Omniverse, Cosmos, NeMo and Metropolis platforms to bring the benefits of physical AI to entire cities and their critical infrastructure.
    Using the blueprint, developers can build simulation-ready, or SimReady, photorealistic digital twins of cities to build and test AI agents that can help monitor and optimize city operations.
    Leading companies including XXII, AVES Reality, Akila, Blyncsy, Bentley, Cesium, K2K, Linker Vision, Milestone Systems, Nebius, SNCF Gares&Connexions, Trimble and Younite AI are among the first to use the new blueprint.

    NVIDIA Omniverse Blueprint for Smart City AI 
    The NVIDIA Omniverse Blueprint for smart city AI provides the complete software stack needed to accelerate the development and testing of AI agents in physically accurate digital twins of cities. It includes:

    NVIDIA Omniverse to build physically accurate digital twins and run simulations at city scale.
    NVIDIA Cosmos to generate synthetic data at scale for post-training AI models.
    NVIDIA NeMo to curate high-quality data and use that data to train and fine-tune vision language models (VLMs) and large language models.
    NVIDIA Metropolis to build and deploy video analytics AI agents based on the NVIDIA AI Blueprint for video search and summarization (VSS), helping process vast amounts of video data and provide critical insights to optimize business processes.

    The blueprint workflow comprises three key steps. First, developers create a SimReady digital twin of locations and facilities using aerial, satellite or map data with Omniverse and Cosmos. Second, they can train and fine-tune AI models, like computer vision models and VLMs, using NVIDIA TAO and NeMo Curator to improve accuracy for vision AI use cases. Finally, real-time AI agents powered by these customized models are deployed to alert, summarize and query camera and sensor data using the Metropolis VSS blueprint.
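    The three-step workflow above can be sketched as a plain pipeline. A minimal illustration follows; every class and function name in it is a hypothetical placeholder, not a real NVIDIA API — the sketch only captures the order of operations: build a SimReady twin, fine-tune a model on twin-generated data, then put the tuned model behind a real-time agent.

```python
from dataclasses import dataclass

# Hypothetical stand-ins for the three blueprint stages; these names are
# illustrative only and do not correspond to actual NVIDIA SDK calls.

@dataclass
class DigitalTwin:
    source: str          # capture data used: "aerial", "satellite" or "map"
    sim_ready: bool      # whether the twin is simulation-ready (SimReady)

def build_sim_ready_twin(source: str) -> DigitalTwin:
    """Step 1: assemble a SimReady city twin (Omniverse + Cosmos stage)."""
    return DigitalTwin(source=source, sim_ready=True)

def fine_tune_vlm(twin: DigitalTwin, base_model: str) -> str:
    """Step 2: fine-tune a vision model on twin-generated data (TAO + NeMo Curator stage)."""
    assert twin.sim_ready, "training data must come from a SimReady twin"
    return f"{base_model}-finetuned-on-{twin.source}"

def deploy_agent(model: str) -> dict:
    """Step 3: deploy a real-time video analytics agent (Metropolis VSS stage)."""
    return {"model": model, "capabilities": ["alert", "summarize", "query"]}

twin = build_sim_ready_twin("aerial")
model = fine_tune_vlm(twin, "vlm-base")
agent = deploy_agent(model)
print(agent["capabilities"])  # -> ['alert', 'summarize', 'query']
```

    The point of the ordering is that each stage feeds the next: agents are only as accurate as the models behind them, and the models are only as accurate as the simulation data the twin produces.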
    NVIDIA Partner Ecosystem Powers Smart Cities Worldwide
    The blueprint for smart city AI enables a large ecosystem of partners to use a single workflow to build and activate digital twins for smart city use cases, tapping into a combination of NVIDIA’s technologies and their own.
    SNCF Gares&Connexions, which operates a network of 3,000 train stations across France and Monaco, has deployed a digital twin and AI agents to enable real-time operational monitoring, emergency response simulations and infrastructure upgrade planning.
    This helps each station analyze operational data such as energy and water use, and enables predictive maintenance capabilities, automated reporting and GDPR-compliant video analytics for incident detection and crowd management.
    Powered by Omniverse, Metropolis and solutions from ecosystem partners Akila and XXII, SNCF Gares&Connexions’ physical AI deployment at the Monaco-Monte-Carlo and Marseille stations has helped SNCF Gares&Connexions achieve a 100% on-time preventive maintenance completion rate, a 50% reduction in downtime and issue response time, and a 20% reduction in energy consumption.
    https://blogs.nvidia.com/wp-content/uploads/2025/06/01-Monaco-Akila.mp4

    The city of Palermo in Sicily is using AI agents and digital twins from its partner K2K to improve public health and safety by helping city operators process and analyze footage from over 1,000 public video streams at a rate of nearly 50 billion pixels per second.
    Tapped by Sicily, K2K’s AI agents — built with the NVIDIA AI Blueprint for VSS and cloud solutions from Nebius — can interpret and act on video data to provide real-time alerts on public events.
    To accurately predict and resolve traffic incidents, K2K is generating synthetic data with Cosmos world foundation models to simulate different driving conditions. Then, K2K uses the data to fine-tune the VLMs powering the AI agents with NeMo Curator. These simulations enable K2K’s AI agents to create over 100,000 predictions per second.
    https://blogs.nvidia.com/wp-content/uploads/2025/06/02-K2K-Polermo-1600x900-1.mp4

    Milestone Systems — in collaboration with NVIDIA and European cities — has launched Project Hafnia, an initiative to build an anonymized, ethically sourced video data platform for cities to develop and train AI models and applications while maintaining regulatory compliance.
    Using a combination of Cosmos and NeMo Curator on NVIDIA DGX Cloud and Nebius’ sovereign European cloud infrastructure, Project Hafnia scales up and enables European-compliant training and fine-tuning of video-centric AI models, including VLMs, for a variety of smart city use cases.
    The project’s initial rollout, taking place in Genoa, Italy, features one of the world’s first VLM models for intelligent transportation systems.
    https://blogs.nvidia.com/wp-content/uploads/2025/06/03-Milestone.mp4

    Linker Vision was among the first to partner with NVIDIA to deploy smart city digital twins and AI agents for Kaohsiung City, Taiwan — powered by Omniverse, Cosmos and Metropolis. Linker Vision worked with AVES Reality, a digital twin company, to bring aerial imagery of cities and infrastructure into 3D geometry and ultimately into SimReady Omniverse digital twins.
    Linker Vision’s AI-powered application then built, trained and tested visual AI agents in a digital twin before deployment in the physical city. Now, it’s scaling to analyze 50,000 video streams in real time with generative AI to understand and narrate complex urban events like floods and traffic accidents. Linker Vision delivers timely insights to a dozen city departments through a single integrated AI-powered platform, breaking silos and reducing incident response times by up to 80%.
    https://blogs.nvidia.com/wp-content/uploads/2025/06/02-Linker-Vision-1280x680-1.mp4

    Bentley Systems is joining the effort to bring physical AI to cities with the NVIDIA blueprint. Cesium, the open 3D geospatial platform, provides the foundation for visualizing, analyzing and managing infrastructure projects, and ports digital twins to Omniverse. Bentley’s AI platform Blyncsy uses synthetic data generation and Metropolis to analyze road conditions and improve maintenance.
    Trimble, a global technology company that enables essential industries including construction, geospatial and transportation, is exploring ways to integrate components of the Omniverse blueprint into its reality capture workflows and Trimble Connect digital twin platform for surveying and mapping applications for smart cities.
    Younite AI, a developer of AI and 3D digital twin solutions, is adopting the blueprint to accelerate its development pipeline, enabling the company to quickly move from operational digital twins to large-scale urban simulations, improve synthetic data generation, integrate real-time IoT sensor data and deploy AI agents.
    Learn more about the NVIDIA Omniverse Blueprint for smart city AI by attending this GTC Paris session or watching the on-demand video after the event. Sign up to be notified when the blueprint is available.
    Watch the NVIDIA GTC Paris keynote from NVIDIA founder and CEO Jensen Huang at VivaTech, and explore GTC Paris sessions.
    BLOGS.NVIDIA.COM
  • HPE and NVIDIA Debut AI Factory Stack to Power Next Industrial Shift

    To speed up AI adoption across industries, HPE and NVIDIA today launched new AI factory offerings at HPE Discover in Las Vegas.
    The new lineup includes everything from modular AI factory infrastructure and HPE’s AI-ready RTX PRO Servers, to the next generation of HPE’s turnkey AI platform, HPE Private Cloud AI. The goal: give enterprises a framework to build and scale generative, agentic and industrial AI.
    The NVIDIA AI Computing by HPE portfolio is now among the broadest in the market.
    The portfolio combines NVIDIA Blackwell accelerated computing, NVIDIA Spectrum-X Ethernet and NVIDIA BlueField-3 networking technologies, NVIDIA AI Enterprise software and HPE’s full portfolio of servers, storage, services and software. This now includes HPE OpsRamp Software, a validated observability solution for the NVIDIA Enterprise AI Factory, and HPE Morpheus Enterprise Software for orchestration. The result is a pre-integrated, modular infrastructure stack to help teams get AI into production faster.
    This includes the next-generation HPE Private Cloud AI, co-engineered with NVIDIA and validated as part of the NVIDIA Enterprise AI Factory framework. This full-stack, turnkey AI factory solution will offer HPE ProLiant Compute DL380a Gen12 servers with the new NVIDIA RTX PRO 6000 Blackwell Server Edition GPUs.
    These new NVIDIA RTX PRO Servers from HPE provide a universal data center platform for a wide range of enterprise AI and industrial AI use cases, and are now available to order from HPE. HPE Private Cloud AI includes the latest NVIDIA AI Blueprints, including the NVIDIA AI-Q Blueprint for AI agent creation and workflows.
    HPE also announced a new NVIDIA HGX B300 system, the HPE Compute XD690, built with NVIDIA Blackwell Ultra GPUs. It’s the latest entry in the NVIDIA AI Computing by HPE lineup and is expected to ship in October.
    In Japan, KDDI is working with HPE to build NVIDIA AI infrastructure to accelerate global adoption.
    The HPE-built KDDI system will be based on the NVIDIA GB200 NVL72 platform, built on the NVIDIA Grace Blackwell architecture, at the KDDI Osaka Sakai Data Center.
    To accelerate AI for financial services, HPE will co-test agentic AI workflows built on Accenture’s AI Refinery with NVIDIA, running on HPE Private Cloud AI. Initial use cases include sourcing, procurement and risk analysis.
    HPE said it’s adding 26 new partners to its “Unleash AI” ecosystem to support more NVIDIA AI use cases. The company now offers more than 70 packaged AI workloads, from fraud detection and video analytics to sovereign AI and cybersecurity.
    Security and governance were a focus, too. HPE Private Cloud AI supports air-gapped management, multi-tenancy and post-quantum cryptography. HPE’s try-before-you-buy program lets customers test the system in Equinix data centers before purchase. HPE also introduced new programs, including AI Acceleration Workshops with NVIDIA, to help scale AI deployments.

    Watch the keynote: HPE CEO Antonio Neri announced the news from the Las Vegas Sphere on Tuesday at 9 a.m. PT. Register for the livestream and watch the replay.
    Explore more: Learn how NVIDIA and HPE build AI factories for every industry. Visit the partner page.
    BLOGS.NVIDIA.COM
  • Into the Omniverse: World Foundation Models Advance Autonomous Vehicle Simulation and Safety

    Editor’s note: This blog is a part of Into the Omniverse, a series focused on how developers, 3D practitioners and enterprises can transform their workflows using the latest advances in OpenUSD and NVIDIA Omniverse.
    Simulated driving environments enable engineers to safely and efficiently train, test and validate autonomous vehicles (AVs) across countless real-world and edge-case scenarios without the risks and costs of physical testing.
    These simulated environments can be created through neural reconstruction of real-world data from AV fleets or generated with world foundation models (WFMs) — neural networks that understand physics and real-world properties. WFMs can be used to generate synthetic datasets for enhanced AV simulation.
    To help physical AI developers build such simulated environments, NVIDIA unveiled major advances in WFMs at the GTC Paris and CVPR conferences earlier this month. These new capabilities enhance NVIDIA Cosmos — a platform of generative WFMs, advanced tokenizers, guardrails and accelerated data processing tools.
    Key innovations like Cosmos Predict-2, the Cosmos Transfer-1 NVIDIA preview NIM microservice and Cosmos Reason are improving how AV developers generate synthetic data, build realistic simulated environments and validate safety systems at unprecedented scale.
    Universal Scene Description (OpenUSD), a unified data framework and standard for physical AI applications, enables seamless integration and interoperability of simulation assets across the development pipeline. OpenUSD standardization plays a critical role in ensuring 3D pipelines are built to scale.
    NVIDIA Omniverse, a platform of application programming interfaces, software development kits and services for building OpenUSD-based physical AI applications, enables simulations from WFMs and neural reconstruction at world scale.
    Leading AV organizations — including Foretellix, Mcity, Oxa, Parallel Domain, Plus AI and Uber — are among the first to adopt Cosmos models.

    Foundations for Scalable, Realistic Simulation
    Cosmos Predict-2, NVIDIA’s latest WFM, generates high-quality synthetic data by predicting future world states from multimodal inputs like text, images and video. This capability is critical for creating temporally consistent, realistic scenarios that accelerate training and validation of AVs and robots.

    In addition, Cosmos Transfer, a control model that adds variations in weather, lighting and terrain to existing scenarios, will soon be available to 150,000 developers on CARLA, a leading open-source AV simulator. This greatly expands the broad AV developer community’s access to advanced AI-powered simulation tools.
    Developers can start integrating synthetic data into their own pipelines using the NVIDIA Physical AI Dataset. The latest release includes 40,000 clips generated using Cosmos.
    Building on these foundations, the Omniverse Blueprint for AV simulation provides a standardized, API-driven workflow for constructing rich digital twins, replaying real-world sensor data and generating new ground-truth data for closed-loop testing.
    The blueprint taps into OpenUSD’s layer-stacking and composition arcs, which enable developers to collaborate asynchronously and modify scenes nondestructively. This helps create modular, reusable scenario variants to efficiently generate different weather conditions, traffic patterns and edge cases.
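    The layer-stacking behavior described above can be sketched in a few lines of USDA. This is a minimal, hypothetical example (the file and prim names are illustrative, not taken from the Omniverse Blueprint itself): a weather layer is stacked over a base scenario, and its opinions win without modifying the base file.

    ```usda
    #usda 1.0
    (
        doc = "Scenario variant: rain. Sublayers are ordered strongest first."
        subLayers = [
            @./weather_rain.usda@,
            @./base_scenario.usda@
        ]
    )

    over "World"
    {
        over "Sky"
        {
            # Opinions authored here override base_scenario.usda nondestructively;
            # removing this layer restores the original scene untouched.
            float visibility_km = 2.0
        }
    }
    ```

    Swapping `weather_rain.usda` for a different sublayer yields a new scenario variant from the same base, which is what makes the variants modular and reusable.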
    Driving the Future of AV Safety
    To bolster the operational safety of AV systems, NVIDIA earlier this year introduced NVIDIA Halos — a comprehensive safety platform that integrates the company’s full automotive hardware and software stack with AI research focused on AV safety.
    The new Cosmos models — Cosmos Predict-2, Cosmos Transfer-1 NIM and Cosmos Reason — deliver further safety enhancements to the Halos platform, enabling developers to create diverse, controllable and realistic scenarios for training and validating AV systems.
    These models, trained on massive multimodal datasets including driving data, amplify the breadth and depth of simulation, allowing for robust scenario coverage — including rare and safety-critical events — while supporting post-training customization for specialized AV tasks.

    At CVPR, NVIDIA was recognized as an Autonomous Grand Challenge winner, highlighting its leadership in advancing end-to-end AV workflows. The challenge used OpenUSD’s robust metadata and interoperability to simulate sensor inputs and vehicle trajectories in semi-reactive environments, achieving state-of-the-art results in safety and compliance.
    Learn more about how developers are leveraging tools like CARLA, Cosmos and Omniverse to advance AV simulation in this livestream replay:

    Hear NVIDIA Director of Autonomous Vehicle Research Marco Pavone on the NVIDIA AI Podcast share how digital twins and high-fidelity simulation are improving vehicle testing, accelerating development and reducing real-world risks.
    Get Plugged Into the World of OpenUSD
    Learn more about what’s next for AV simulation with OpenUSD by watching the replay of NVIDIA founder and CEO Jensen Huang’s GTC Paris keynote.
    Looking for live opportunities to learn more about OpenUSD? Don’t miss sessions and labs happening at SIGGRAPH 2025, August 10–14.
    Discover why developers and 3D practitioners are using OpenUSD and learn how to optimize 3D workflows with the self-paced “Learn OpenUSD” curriculum for 3D developers and practitioners, available for free through the NVIDIA Deep Learning Institute.
    Explore the Alliance for OpenUSD forum and the AOUSD website.
    Stay up to date by subscribing to NVIDIA Omniverse news, joining the community and following NVIDIA Omniverse on Instagram, LinkedIn, Medium and X.
  • HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE

    By TREVOR HOGG

    Images courtesy of Warner Bros. Pictures.

    Rather than building a world around photorealistic pixels, the video game created by Markus Persson took the boxier 3D voxel route, which became its signature aesthetic and sparked an international phenomenon that is finally adapted into a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess create the environments inhabited by the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks was Disguise, under the direction of Production VFX Supervisor Dan Lemmon.

    “As the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.”
    —Talia Finlayson, Creative Technologist, Disguise

    Interior and exterior environments had to be created, such as the shop owned by Steve.

    “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”

    Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.”

    A virtual exploration of Steve’s shop in Midport Village.

    Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. “Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.”

    “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”
    —Laura Bell, Creative Technologist, Disguise

    Among the buildings that had to be created for Midport Village was Steve’s Lava Chicken Shack.

    Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.” At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.”

    Flexibility was critical. “A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younis adapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!”
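    Bell’s point about cube-and-voxel geometry comes down to quantization: every position snaps to a regular grid. A minimal, hypothetical Python sketch of that idea follows (function and variable names are illustrative; a real remesh modifier like Blender’s also rebuilds faces and normals, which this does not attempt):

    ```python
    def snap_to_voxel_grid(vertices, voxel_size=1.0):
        """Quantize vertex positions to a voxel grid, producing the
        blocky look described above. Topology is left untouched."""
        snapped = []
        for x, y, z in vertices:
            snapped.append(tuple(round(c / voxel_size) * voxel_size
                                 for c in (x, y, z)))
        return snapped

    # Two free-floating vertices collapse onto whole-unit grid positions.
    verts = [(0.2, 1.7, 3.4), (0.9, 1.1, 3.6)]
    print(snap_to_voxel_grid(verts))
    ```

    Shrinking `voxel_size` trades blockiness for fidelity, which is the same dial a voxel remesh exposes.
    
    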

    A virtual study and final still of the cast members standing outside of the Lava Chicken Shack.

    “We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.”
    —Talia Finlayson, Creative Technologist, Disguise

    The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks. “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.”

    Virtually conceptualizing the layout of Midport Village.

    Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.” Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay George [VP Tech] and I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.”

    An example of the virtual and final version of the Woodland Mansion.

    “Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.”
    —Laura Bell, Creative Technologist, Disguise

    Extensive detail was given to the center of the sets where the main action unfolds. “For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.”

    Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment.

    Doing a virtual scale study of the Mountainside.

    Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.”
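The camera-path recording Bell mentions is the part of Simulcam that pays off in post. As a hypothetical sketch only: the article does not specify the file format or data schema Disguise used for the handoff, so the per-frame fields and JSON layout below are invented for illustration.

```python
import json

# Hypothetical sketch of the kind of per-frame camera data a
# Simulcam-style system might record for the VFX handoff; the
# actual format used on the production is not documented here.
def export_camera_track(samples, path):
    """Write frame-stamped camera transforms (position, rotation,
    focal length) so a post-production vendor can rebuild the move."""
    track = [
        {"frame": f, "position": pos, "rotation_euler": rot, "focal_mm": focal}
        for f, pos, rot, focal in samples
    ]
    with open(path, "w") as fh:
        json.dump({"fps": 24, "samples": track}, fh, indent=2)
    return len(track)

n = export_camera_track(
    [(1001, [0.0, 1.6, -3.0], [0.0, 12.5, 0.0], 35.0),
     (1002, [0.0, 1.6, -2.9], [0.0, 12.4, 0.0], 35.0)],
    "camera_track.json",
)
print(n)  # 2
```

However it is serialized in practice, the point is the same: handing the vendors the exact track the live composite used removes a whole matchmove step from the pipeline.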

    Piglots cause mayhem during the Wingsuit Chase.

    Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods.

    “One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols [VAD Supervisor], Pat Younis, Jake Tuck [Unreal Artist] and Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.” There was another challenge that had more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.”
    HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE
    By TREVOR HOGG. Images courtesy of Warner Bros. Pictures.
    WWW.VFXVOICE.COM
    HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE
    By TREVOR HOGG Images courtesy of Warner Bros. Pictures. Rather than a world constructed around photorealistic pixels, a video game created by Markus Persson has taken the boxier 3D voxel route, which has become its signature aesthetic, and sparked an international phenomenon that finally gets adapted into a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess in creating the environments that the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks find themselves inhabiting was Disguise under the direction of Production VFX Supervisor Dan Lemmon. “[A]s the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” —Talia Finlayson, Creative Technologist, Disguise Interior and exterior environments had to be created, such as the shop owned by Steve (Jack Black). “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. 
“I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.” Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.” A virtual exploration of Steve’s shop in Midport Village. Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. “Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. 
Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.” “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.” —Laura Bell, Creative Technologist, Disguise Among the buildings that had to be created for Midport Village was Steve’s (Jack Black) Lava Chicken Shack. Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.” At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.” Flexibility was critical. 
“A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younis [VAD Art Director] adapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!” A virtual study and final still of the cast members standing outside of the Lava Chicken Shack. “We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. 
Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.” —Talia Finlayson, Creative Technologist, Disguise The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks. “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.” Virtually conceptualizing the layout of Midport Village. Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.” Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. 
There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay George [VP Tech] and I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.” An example of the virtual and final version of the Woodland Mansion. “Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.” —Laura Bell, Creative Technologist, Disguise Extensive detail was given to the center of the sets where the main action unfolds. “For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.” Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment. Doing a virtual scale study of the Mountainside. 
Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.” Piglots cause mayhem during the Wingsuit Chase. Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods. “One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols [VAD Supervisor], Pat Younis, Jake Tuck [Unreal Artist] and Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.” There was another challenge that is more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. 
I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting."
  • Ah, the great Pixar movies ranking saga! Who knew that a bunch of animated characters could cause such fierce debates among adults? It’s almost as if we’re all art critics now, judging the crème de la crème of animated filmmaking from the ‘90s to the mid-2010s. Remember when Pixar was the beacon of creativity? Well, it seems like the magic dust has settled a bit in the last decade—nothing like a sequel to remind us that sometimes, it’s okay to just let the original be great. So, here’s to ranking Pixar’s masterpieces from “Oh, that was cute” to “Did they really think we’d buy that?” Let the debates begin, folks.

    KOTAKU.COM
    The Pixar Movies, Ranked From Worst To Best
    For several decades, Pixar was the king of animated filmmaking. Its run from the ‘90s to the mid-2010s was marked by the kind of dazzling creative output that makes you believe art isn’t dead and was so inspiring, it no doubt made many consider pivot