• Just when you thought your game assets couldn’t get any more stylized, SideFX drops Project Skylark like a magician pulling a rabbit from a hat. Now you can download free Houdini tools that promise to turn your 3D buildings into architectural masterpieces and your clouds into fluffy, Instagrammable puffs. Who knew procedural generators could make you feel like a real artist without the need for actual talent?

    So, grab your free tools and let the world believe your game is a work of art, while you sit back and enjoy the virtual applause. Remember, it’s not about the destination; it’s about pretending you know what you’re doing along the way!

    #HoudiniTools #GameAssets #ProjectSkylark
    Download free Houdini tools from SideFX’s Project Skylark
    Get custom tools for creating stylized game assets, including procedural generators for 3D buildings, bridges and clouds.
  • How a planetarium show discovered a spiral at the edge of our solar system

    If you’ve ever flown through outer space, at least while watching a documentary or a science fiction film, you’ve seen how artists turn astronomical findings into stunning visuals. But in the process of visualizing data for their latest planetarium show, a production team at New York’s American Museum of Natural History made a surprising discovery of their own: a trillion-and-a-half-mile-long spiral of material drifting along the edge of our solar system.

    “So this is a really fun thing that happened,” says Jackie Faherty, the museum’s senior scientist.

    Last winter, Faherty and her colleagues were beneath the dome of the museum’s Hayden Planetarium, fine-tuning a scene that featured the Oort cloud, the big, thick bubble surrounding our Sun and planets that’s filled with ice and rock and other remnants from the solar system’s infancy. The Oort cloud begins far beyond Neptune and extends around one and a half light years from the Sun. It has never been directly observed; its existence is inferred from the behavior of long-period comets entering the inner solar system. The cloud is so expansive that the Voyager spacecraft, our most distant probes, would need another 250 years just to reach its inner boundary; to reach the other side, they would need about 30,000 years.
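
    As a rough sanity check (my arithmetic, not the article’s), those travel times are consistent with Voyager 1’s recession speed of about 3.6 AU per year from its current distance of roughly 170 AU, assuming an inner edge near 1,000 AU and an outer edge of 1.5 light years (about 95,000 AU):

    $$\frac{1{,}000\ \mathrm{AU} - 170\ \mathrm{AU}}{3.6\ \mathrm{AU/yr}} \approx 230\ \mathrm{yr}, \qquad \frac{95{,}000\ \mathrm{AU}}{3.6\ \mathrm{AU/yr}} \approx 26{,}000\ \mathrm{yr}.$$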

    The 30-minute show, Encounters in the Milky Way, narrated by Pedro Pascal, guides audiences on a trip through the galaxy across billions of years. For a section about our nascent solar system, the writing team decided “there’s going to be a fly-by” of the Oort cloud, Faherty says. “But what does our Oort cloud look like?” 

    To find out, the museum consulted astronomers and turned to David Nesvorný, a scientist at the Southwest Research Institute in San Antonio. He provided his model of the millions of particles believed to make up the Oort cloud, based on extensive observational data.

    “Everybody said, go talk to Nesvorný. He’s got the best model,” says Faherty. And “everybody told us, ‘There’s structure in the model,’ so we were kind of set up to look for stuff,” she says. 

    The museum’s technical team began using Nesvorný’s model to simulate how the cloud evolved over time. Later, as the team projected versions of the fly-by scene into the dome, with the camera looking back at the Oort cloud, they saw a familiar shape, one that appears in galaxies, Saturn’s rings, and disks around young stars.

    “We’re flying away from the Oort cloud and out pops this spiral, a spiral shape to the outside of our solar system,” Faherty marveled. “A huge structure, millions and millions of particles.”

    She emailed Nesvorný to ask for “more particles,” with a render of the scene attached. “We noticed the spiral of course,” she wrote. “And then he writes me back: ‘what are you talking about, a spiral?’” 

    While fine-tuning a simulation of the Oort cloud, a vast expanse of icy material left over from the birth of our Sun, the ‘Encounters in the Milky Way’ production team noticed a very clear shape: a structure made of billions of comets and shaped like a spiral-armed galaxy, seen here in a scene from the final Space Show (the curving, dusty S-shape behind the Sun). [Image: © AMNH]

    More simulations ensued, this time on Pleiades, a powerful NASA supercomputer. In high-performance computer simulations spanning 4.6 billion years, starting from the solar system’s earliest days, the researchers visualized how the initial icy and rocky ingredients of the Oort cloud began circling the Sun, in the elliptical orbits that are thought to give the cloud its rough disc shape. The simulations also incorporated the physics of the Sun’s gravitational pull, the influences from our Milky Way galaxy, and the movements of the comets themselves.
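
    To give a flavor of what such simulations involve, here is a heavily simplified, illustrative sketch (my own, not the researchers’ code, and with assumed parameter values): a single comet integrated under the Sun’s point-mass gravity plus the dominant vertical component of the Milky Way’s tidal field, the term thought to lift Oort cloud orbits out of the planetary plane.

    ```python
    import numpy as np

    # Units: AU, years, solar masses. With these units G = 4*pi^2, so a
    # 1 AU circular orbit has a period of exactly 1 year.
    G = 4.0 * np.pi**2
    PC_IN_AU = 206_265.0
    RHO_LOCAL = 0.1 / PC_IN_AU**3        # assumed ~0.1 M_sun/pc^3 of local galactic matter
    TIDE = 4.0 * np.pi * G * RHO_LOCAL   # vertical galactic tide: a_z = -TIDE * z (~100 Myr period)

    def acceleration(pos):
        """Point-mass solar gravity plus a simplified vertical galactic tide."""
        r = np.linalg.norm(pos)
        acc = -G * pos / r**3
        acc[2] -= TIDE * pos[2]          # the tide slowly tilts and twists wide orbits
        return acc

    def integrate(pos, vel, dt, steps):
        """Leapfrog (kick-drift-kick) integrator; returns the sampled trajectory."""
        traj = np.empty((steps, 3))
        for i in range(steps):
            vel += 0.5 * dt * acceleration(pos)
            pos += dt * vel
            vel += 0.5 * dt * acceleration(pos)
            traj[i] = pos
        return traj

    # One comet starting near aphelion of a wide, eccentric orbit (illustrative values).
    pos = np.array([10_000.0, 0.0, 0.0])                  # AU
    vel = np.array([0.0, 0.04, 0.02])                     # AU/yr: bound, inclined, e ~ 0.5
    path = integrate(pos, vel, dt=500.0, steps=200_000)   # ~100 Myr of evolution
    print(path[-1])                                       # position after the tide has had time to act
    ```

    In the real study this kind of integration ran for millions of particles on NASA’s Pleiades supercomputer; the spiral only becomes apparent when the whole swarm is viewed at once, which is exactly what the planetarium renders made visible.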

    In each simulation, the spiral persisted.

    “No one has ever seen the Oort structure like that before,” says Faherty. Nesvorný “has a great quote about this: ‘The math was all there. We just needed the visuals.’” 

    An illustration of the Kuiper Belt and Oort Cloud in relation to our solar system. [Image: NASA]

    As the Oort cloud grew with the early solar system, Nesvorný and his colleagues hypothesize that the galactic tide, or the gravitational force from the Milky Way, disrupted the orbits of some comets. Although the Sun pulls these objects inward, the galaxy’s gravity appears to have twisted part of the Oort cloud outward, forming a spiral tilted roughly 30 degrees from the plane of the solar system.

    “As the galactic tide acts to decouple bodies from the scattered disk it creates a spiral structure in physical space that is roughly 15,000 astronomical units in length,” or around 1.4 trillion miles from one end to the other, the researchers write in a paper that was published in March in the Astrophysical Journal. “The spiral is long-lived and persists in the inner Oort Cloud to the present time.”
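
    For scale, the quoted figures agree (a back-of-envelope conversion, using 1 AU ≈ 9.3 × 10⁷ miles and 1 light year ≈ 63,241 AU):

    $$15{,}000\ \mathrm{AU} \times 9.3\times10^{7}\ \mathrm{mi/AU} \approx 1.4\times10^{12}\ \mathrm{mi} \approx 0.24\ \mathrm{ly}.$$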

    “The physics makes sense,” says Faherty. “Scientists, we’re amazing at what we do, but it doesn’t mean we can see everything right away.”

    It helped that the team behind the space show was primed to look for something, says Carter Emmart, the museum’s director of astrovisualization and director of Encounters. Astronomers had described Nesvorný’s model as having “a structure,” which intrigued the team’s artists. “We were also looking for structure so that it wouldn’t just be sort of like a big blob,” he says. “Other models were also revealing this—but they just hadn’t been visualized.”

    The museum’s attempts to simulate nature date back to its first habitat dioramas in the early 1900s, which brought visitors to places that hadn’t yet been captured by color photos, TV, or the web. The planetarium, a night sky simulator for generations of would-be scientists and astronauts, got its start after financier Charles Hayden bought the museum its first Zeiss projector. The planetarium now boasts one of the world’s few Zeiss Mark IX systems.

    Still, these days the star projector is rarely used, Emmart says, now that fulldome laser projectors can turn the old static starfield into 3D video running at 60 frames per second. The Hayden boasts six custom-built Christie projectors, part of what the museum’s former president called “the most advanced planetarium ever attempted.”

    In about 1.3 million years, the star system Gliese 710 is set to pass directly through our Oort Cloud, an event visualized in a dramatic scene in ‘Encounters in the Milky Way.’ During its flyby, our systems will swap icy comets, flinging some out on new paths. [Image: © AMNH]

    Emmart recalls how in 1998, when he and other museum leaders were imagining the future of space shows at the Hayden—now with the help of digital projectors and computer graphics—there were questions over how much space they could try to show.

    “We’re talking about these astronomical data sets we could plot to make the galaxy and the stars,” he says. “Of course, we knew that we would have this star projector, but we really wanted to emphasize astrophysics with this dome video system. I was drawing pictures of this just to get our heads around it and noting the tilt of the solar system to the Milky Way is about 60 degrees. And I said, ‘what are we gonna do when we get outside the Milky Way?’”

    “Then [the planetarium’s director] Neil deGrasse Tyson goes, ‘whoa, whoa, whoa, Carter, we have enough to do. And just plotting the Milky Way, that’s hard enough.’ And I said, ‘well, when we exit the Milky Way and we don’t see any other galaxies, that’s sort of like astronomy in 1920—we thought maybe the entire universe is just a Milky Way.’”

    “And that kind of led to a chaotic discussion about, well, what other data sets are there for this?” Emmart adds.

    The museum worked with astronomer Brent Tully, who had mapped 3,500 galaxies beyond the Milky Way in collaboration with the National Center for Supercomputing Applications. “That was it,” Emmart says, “and that seemed fantastical.”

    By the time the first planetarium show opened at the museum’s new Rose Center for Earth and Space in 2000, Tully had broadened his survey to “an amazing” 30,000 galaxies. The Sloan Digital Sky Survey followed—it’s now at data release 18—with six million galaxies.

    To build the map of the universe that underlies Encounters, the team also relied on data from Gaia, the European Space Agency’s space observatory. Launched in 2013 and powered down in March of this year, Gaia brought unprecedented precision to our astronomical map, plotting the distances to 1.7 billion stars. To visualize and render the simulated data, Jon Parker, the museum’s lead technical director, relied on Houdini, a 3D animation tool by Toronto-based SideFX.
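
    For readers curious about the mechanics: one common way to bring point data like a star catalog into Houdini is a short Python SOP that creates one point per star. The sketch below is a minimal illustration of that pattern, not the museum’s actual pipeline; the file path and column layout are hypothetical.

    ```python
    # Paste into a Python SOP in Houdini (the hou module is available there).
    # Assumes a CSV of precomputed Cartesian star positions with columns
    # x, y, z, brightness -- a hypothetical layout.
    import csv

    node = hou.pwd()                 # the Python SOP itself
    geo = node.geometry()            # its writable geometry

    # Per-point scale attribute, so brighter stars can be rendered larger.
    geo.addAttrib(hou.attribType.Point, "pscale", 0.0)

    path = hou.expandString("$HIP/data/stars.csv")   # hypothetical location
    with open(path) as f:
        reader = csv.reader(f)
        next(reader)                 # skip the header row
        for x, y, z, brightness in reader:
            pt = geo.createPoint()
            pt.setPosition(hou.Vector3(float(x), float(y), float(z)))
            pt.setAttribValue("pscale", 0.001 * float(brightness))
    ```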

    The goal is immersion, “whether it’s in front of the buffalo downstairs, and seeing what those herds were like before we decimated them, to coming in this room and being teleported to space, with an accurate foundation in the science,” Emmart says. “But the art is important, because the art is the way to the soul.” 

    The museum, he adds, is “a testament to wonder. And I think wonder is a gateway to inspiration, and inspiration is a gateway to motivation.”

    3D visuals aren’t just powerful tools for communicating science; they’re increasingly crucial for science itself. Software like OpenSpace, an open-source simulation tool developed by the museum, along with the growing availability of high-performance computing, is making it easier to build highly detailed visuals of ever larger and more complex collections of data.

    “Anytime we look, literally, from a different angle at catalogs of astronomical positions, simulations, or exploring the phase space of a complex data set, there is great potential to discover something new,” says Brian R. Kent, an astronomer and director of science communications at the National Radio Astronomy Observatory. “There is also a wealth of astronomical statistical data in archives that can be reanalyzed in new ways, leading to new discoveries.”

    As the instruments grow in size and sophistication, so does the data, and the challenge of understanding it. Like all scientists, astronomers are facing a deluge of data, ranging from gamma rays and X-rays to ultraviolet, optical, infrared, and radio bands.

    Our Oort cloud, a shell of icy bodies that surrounds the solar system and extends one-and-a-half light years in every direction, is shown in this scene from ‘Encounters in the Milky Way’ along with the Oort clouds of neighboring stars. The more massive the star, the larger its Oort cloud. [Image: © AMNH]

    “New facilities like the Next Generation Very Large Array here at NRAO or the Vera Rubin Observatory and LSST survey project will generate large volumes of data, so astronomers have to get creative with how to analyze it,” says Kent.

    More data—and new instruments—will also be needed to prove the spiral itself is actually there: there’s still no known way to even observe the Oort cloud. 

    Instead, the paper notes, the structure will have to be measured from “detection of a large number of objects” in the radius of the inner Oort cloud or from “thermal emission from small particles in the Oort spiral.” 

    The Vera C. Rubin Observatory, a powerful, U.S.-funded telescope that recently began operation in Chile, could possibly observe individual icy bodies within the cloud. But researchers expect the telescope will likely discover only dozens of these objects, maybe hundreds, not enough to meaningfully visualize any shapes in the Oort cloud. 

    For us, here and now, the 1.4-trillion-mile-long spiral will remain confined to the inside of a dark dome across the street from Central Park.
  • You can now sell MetaHumans, or use them in Unity or Godot


    The MetaHuman client reel. Epic Games’ framework for generating realistic 3D characters for games is out of early access, and can now be used with any DCC app or game engine.

    Epic Games has officially launched MetaHuman, its framework for generating realistic 3D characters for games, animation and VFX work, after four years in early access. The core applications, MetaHuman Creator, Mesh to MetaHuman and MetaHuman Animator, are now integrated into Unreal Engine 5.6, the latest version of the game engine.
    In addition, Epic has updated the licensing for MetaHuman characters, making it possible to use them in any game engine or DCC application, including in commercial projects.
    There are also two new free plugins, MetaHuman for Maya and MetaHuman for Houdini, intended to streamline the process of editing MetaHumans in Maya and Houdini.
    A suite of tools for generating and animating realistic real-time 3D characters

    First launched in early access in 2021, MetaHuman is a framework of tools for generating realistic 3D characters for next-gen games, animation, virtual production and VFX. The first component, MetaHuman Creator, enables users to design realistic digital humans.
    Users can generate new characters by blending between presets, then adjusting the proportions of the face by hand, and customising readymade hairstyles and clothing.
    The second component, Mesh to MetaHuman, makes it possible to create MetaHumans matching 3D scans or facial models created in other DCC apps.
    The final component, MetaHuman Animator, streamlines the process of transferring the facial performance of an actor from video footage to a MetaHuman character.
    MetaHuman Creator was originally a cloud-based tool, while Mesh to MetaHuman and MetaHuman Animator were available via the old MetaHuman plugin for Unreal Engine.
    Now integrated directly into Unreal Engine 5.6

    That changes with the end of early access, with MetaHuman Creator, Mesh to MetaHuman and MetaHuman Animator all now integrated directly into Unreal Engine itself. Integration – available in Unreal Engine 5.6, the latest version of the engine – is intended to simplify character creation and asset management workflows.
    Studios also get access to the MetaHuman source code, since Unreal Engine itself comes with full C++ source code access.
    However, the tools still cannot be run entirely locally: according to Epic, in-editor workflow is “enhanced by cloud services that deliver autorigging and texture synthesis”.


    Users can now adjust MetaHumans’ bodies, with a new unified Outfit Asset making it possible to create 3D clothing that adjusts automatically to bodily proportions.

    Updates to both MetaHuman Creator and MetaHuman Animator

    In addition, the official release introduces new features, with MetaHuman Creator’s parametric system for creating faces now extended to body shapes. Users can now adjust proportions like height, chest and waist measurements, and leg length, rather than simply selecting preset body types.
    Similarly, a new unified Outfit Asset makes it possible to author custom 3D clothing, rather than selecting readymade presets, with garments resizing to characters’ body shapes.
    MetaHuman Animator – which previously required footage from stereo head-mounted cameras or iPhones – now supports footage from mono cameras like webcams.
    The toolset can also now generate facial animation – both lip sync and head movement – solely from audio recordings, as well as from video footage.
    You can find fuller descriptions of the new features in Epic Games’ blog post.
    Use MetaHumans in Unity or Godot games, or sell them on online marketplaces

    Equally significantly, Epic has changed the licensing for MetaHumans. The MetaHuman toolset is now covered by the standard Unreal Engine EULA, meaning that it can be used for free by any artist or studio with under $1 million/year in revenue.
    MetaHuman characters and clothing can also now be sold on online marketplaces, or used in commercial projects created with other DCC apps or game engines.
    The only exception is for AI: you can use MetaHumans in “workflows that incorporate artificial intelligence technology”, but not to train or enhance the AI models themselves.
    Studios earning more than $1 million/year from projects that use MetaHuman characters need Unreal Engine seat licenses, which currently cost $1,850/year.
    However, since MetaHuman characters and animations are classed as ‘non-engine products’, they can be used in games created in other engines, like Unity or Godot, without incurring the 5% cut of the revenue that Epic takes from Unreal Engine games.
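
    To make the licensing arithmetic concrete (an illustrative example, not from Epic’s FAQ): a studio earning $3 million/year from a Unity game that uses MetaHumans would owe no engine royalty, only seat licenses for the artists running the MetaHuman tools:

    $$\underbrace{5\% \times \$3{,}000{,}000}_{\text{royalty, not owed outside Unreal}} = \$150{,}000 \qquad \text{vs.} \qquad \$1{,}850 \times N_{\mathrm{seats}}\ \text{per year}.$$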

    The free MetaHuman for Maya plugin lets you edit MetaHumans with Maya’s native tools.

    New plugins streamline editing MetaHumans in Maya and Houdini

    Last but not least, Epic Games has released new free add-ons intended to streamline the process of editing MetaHumans in other DCC software. The MetaHuman for Maya plugin makes it possible to manipulate the MetaHuman mesh directly with Maya’s standard mesh-editing and sculpting tools.
    Users can also create MetaHuman-compatible hair grooms using Maya’s XGen toolset, and export them in Alembic format.
    The MetaHuman for Houdini plugin seems to be confined to grooming, with users able to create hairstyles using Houdini’s native tools, and export them in Alembic format.
    The plugins themselves are supplemented by MetaHuman Groom Starter Kits for Maya and Houdini, which provide readymade sample files for generating grooms.
    Price, licensing and system requirements

    MetaHuman Creator and MetaHuman Animator are integrated into Unreal Engine 5.6. The Unreal Editor is compatible with Windows 10+, macOS 14.0+ and RHEL/Rocky Linux 8+. The MetaHuman plugin for Maya is compatible with Maya 2022-2025. The MetaHuman for Houdini plugin is compatible with Houdini 20.5 with SideFX Labs installed.
    All of the software is free to use, including for commercial projects, if you earn under $1 million/year. You can find more information on licensing in the story above.
    Read an overview of the changes to the MetaHuman software on Epic Games’ blog
    Download the free MetaHuman for Maya and Houdini plugins and starter kits
    Read Epic Games’ FAQs about the changes to licensing for MetaHumans

    Have your say on this story by following CG Channel on Facebook, Instagram and X. As well as being able to comment on stories, followers of our social media accounts can see videos we don’t post on the site itself, including making-ofs for the latest VFX movies, animations, games cinematics and motion graphics projects.
  • Watch As Sparkling Cubes Assemble Into Human Figure In This Simulation

    Inspired by a scene from the Bollywood superhero film Ra.One, this simulation by Aqib Aleef (@aleeffxtd on Instagram) was created using Houdini and rendered in Blender’s EEVEE. You can also catch a behind-the-scenes look at its early stages.

    If you use Houdini, you’ll want to check out Project Grot, an in-house tech demo released last spring to highlight the power of procedural generation in game development. It recently got a fresh batch of free tutorials and tools focusing on procedural bridge rock structures, smoke simulations, lava terrain, and more. We also suggest keeping an eye on Julian Bragagna’s upcoming tool, which will be featured in SideFX’s next free learning project and tech demo, Project Skylark.

    Also, join our 80 Level Talent platform and our new Discord server, follow us on Instagram, Twitter, LinkedIn, Telegram, TikTok, and Threads, where we share breakdowns, the latest news, awesome artworks, and more.
  • Stunning wave simulation in Houdini

    Charlie Chapman shared a look at an awesome project made with the help of Houdini. The simulation took approximately 4.5 minutes per frame at 2K (700 samples, on an RTX 4090). The work was crafted entirely in SideFX Houdini, rendered in Karma XPU, and composited in Nuke.
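
    At that rate (simple arithmetic on the figure above), even a short shot is a serious render: a 10-second sequence at 24 fps is 240 frames, so

    $$240 \times 4.5\ \mathrm{min} = 1{,}080\ \mathrm{min} \approx 18\ \mathrm{hours}$$

    on a single GPU.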

    #houdini #nuke #karmaxpu #karma #3dart #rendering #render #simulation #3dmodeling
  • Matthieu Pujol presented a realistic icebreaker simulation set up using Vellum grains in Houdini 19.5 and rendered with Arnold.
    #houdini #sidefx #sidefxhoudini #3dsimulation #3d #3dart #arnoldrender #3dsim
  • VFX Artist José A. Martínez shared a great setup for creating procedural and customizable rope bridges in Houdini.
    Download the setup for free:
    https://80.lv/articles/grab-a-free-setup-for-creating-procedural-rope-bridges-in-houdini/

    #houdini #sidefx #sidefxhoudini #procedural #proceduralart #techart #technicalart #3d #gamedev #indiedev
  • Setting up simulations with Houdini's Vellum

    3D and VFX Artist Emīls Geršinskis-Ješinskis shared an impressive new simulation powered by SideFX's Houdini. The artist utilized Vellum, a simulation framework that uses an extended Position Based Dynamics approach to enable the simulation of cloth, hair, soft bodies, balloons, and grains.
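
    For readers unfamiliar with the technique, the sketch below shows the core XPBD loop on the simplest possible case: a pinned chain of distance constraints settling under gravity. It is a minimal illustration with assumed parameter values, not Houdini’s implementation.

    ```python
    import numpy as np

    # Minimal XPBD (extended Position Based Dynamics) demo: a chain of distance
    # constraints pinned at one end, settling under gravity. Illustrative values.
    N, REST, DT, ITERS = 20, 0.1, 1.0 / 60.0, 20
    GRAVITY = np.array([0.0, -9.81, 0.0])
    COMPLIANCE = 1e-6                       # XPBD stiffness control: 0 = rigid constraint

    pos = np.array([[i * REST, 0.0, 0.0] for i in range(N)])
    prev = pos.copy()
    inv_mass = np.ones(N)
    inv_mass[0] = 0.0                       # infinite mass pins the first point

    def step():
        global pos, prev
        vel = (pos - prev) / DT             # implicit velocity from the last step
        prev = pos.copy()
        # Predict new positions; the inv_mass factor keeps pinned points fixed.
        pos = pos + DT * vel + DT * DT * GRAVITY * inv_mass[:, None]
        lam = np.zeros(N - 1)               # one Lagrange multiplier per constraint
        alpha = COMPLIANCE / (DT * DT)      # XPBD's time-step-scaled compliance
        for _ in range(ITERS):
            for i in range(N - 1):
                d = pos[i + 1] - pos[i]
                dist = np.linalg.norm(d)
                w = inv_mass[i] + inv_mass[i + 1]
                if dist < 1e-9 or w == 0.0:
                    continue
                c = dist - REST             # constraint violation (stretch)
                dlam = (-c - alpha * lam[i]) / (w + alpha)
                lam[i] += dlam
                n = d / dist
                pos[i] -= inv_mass[i] * dlam * n
                pos[i + 1] += inv_mass[i + 1] * dlam * n

    for _ in range(120):                    # two seconds of simulation
        step()
    print(pos[-1])                          # position of the free end after two seconds
    ```

    Vellum generalizes this same solve to cloth, hair, soft bodies, balloons, and grains by swapping in the appropriate constraint types.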

    #houdini #simulation #3dart #gamedev #indiedev #3dmodeling