• So, there’s this thing called the Franck-Hertz experiment. It’s one of those physics experiments that people rave about, but honestly, I don’t get why. It was done way back in 1914, and it showed that atoms only absorb energy in discrete “packets” called “quanta.” Sounds fancy, but like, does it really change anything?

    They say this experiment marked the start of quantum physics, which I guess is important for some. It’s all about those little particles and how they behave. If you’re into that sort of thing, you might want to look into doing a DIY version of the Franck-Hertz experiment. Apparently, it’s not too hard and you can even do it at home. But let’s be real, who has the energy for that?

    You just set up a tube with some mercury vapor, accelerate electrons through it with a voltage, and measure the collected current as you dial the voltage up. The current dips every time the voltage climbs by another 4.9 volts or so, because that’s how much energy a mercury atom can absorb from a colliding electron. It’s all about those discrete energy levels and how electrons bounce around. But, like, I don’t know how many people are actually excited to do this. Maybe if you’re a physics enthusiast, it’ll be fun for you.
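    If the shape of that current-versus-voltage curve is all you want, a toy model gets you there without a mercury tube. This is a minimal sketch under big assumptions (every electron loses exactly 4.9 eV per inelastic collision, everything else idealized); `collected_current` and its `contrast` parameter are invented for illustration, not taken from any real apparatus.

```python
# Toy Franck-Hertz curve: anode current vs. accelerating voltage.
# Assumption: electrons shed exactly E_EXC = 4.9 eV (mercury's first
# excitation energy) per inelastic collision; no thermal spread, no
# contact potentials.

E_EXC = 4.9  # eV, first excitation energy of mercury

def collected_current(voltage, contrast=0.6):
    """Idealized anode current (arbitrary units): rises with voltage
    but dips each time the voltage crosses a multiple of E_EXC."""
    if voltage <= 0:
        return 0.0
    leftover = voltage % E_EXC  # energy left after inelastic collisions
    return voltage * (1.0 - contrast) + leftover * contrast

# Just below 4.9 V the current is high; just above, electrons have
# handed their energy to mercury atoms and the current drops.
print(collected_current(4.8) > collected_current(5.0))  # True
```

    Sweeping `voltage` from 0 to 30 and plotting the result reproduces the familiar sawtooth with dips roughly every 4.9 V, which is the whole punchline of the experiment.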

    But if you’re like me and prefer to just scroll through your phone or binge-watch a show, then this sounds like a lot of work for not much payoff. I mean, who really wants to dive into the intricacies of quantum physics when there are so many other things to do—like anything else?

    So, if you’re curious about the Franck-Hertz experiment and want to try it yourself, go ahead. Just know that you might end up feeling a bit underwhelmed. Science can be cool, but sometimes it feels like a chore, especially when it’s all about tiny particles that you can’t even see.

    Anyway, that’s my take on it. If you’re still interested in quantum physics after this, good for you. I’ll just be over here, probably napping or scrolling through social media.

  • Scientists Detect Unusual Airborne Toxin in the United States for the First Time

    Researchers unexpectedly discovered toxic airborne pollutants in Oklahoma. The image above depicts a field in Oklahoma. Credit: Shutterstock
    University of Colorado Boulder researchers made the first-ever airborne detection of Medium Chain Chlorinated Paraffins (MCCPs) in the Western Hemisphere.
    Sometimes, scientific research feels a lot like solving a mystery. Scientists head into the field with a clear goal and a solid hypothesis, but then the data reveals something surprising. That’s when the real detective work begins.
    This is exactly what happened to a team from the University of Colorado Boulder during a recent field study in rural Oklahoma. They were using a state-of-the-art instrument to track how tiny particles form and grow in the air. But instead of just collecting expected data, they uncovered something completely new: the first-ever airborne detection of Medium Chain Chlorinated Paraffins (MCCPs), a kind of toxic organic pollutant, in the Western Hemisphere. The team’s findings were published in ACS Environmental Au.
    “It’s very exciting as a scientist to find something unexpected like this that we weren’t looking for,” said Daniel Katz, CU Boulder chemistry PhD student and lead author of the study. “We’re starting to learn more about this toxic, organic pollutant that we know is out there, and which we need to understand better.”
    MCCPs are currently under consideration for regulation by the Stockholm Convention, a global treaty to protect human health from long-standing and widespread chemicals. While the toxic pollutants have been measured in Antarctica and Asia, researchers haven’t been sure how to document them in the Western Hemisphere’s atmosphere until now.
    From Wastewater to Farmlands
    MCCPs are used in fluids for metal working and in the construction of PVC and textiles. They are often found in wastewater and as a result, can end up in biosolid fertilizer, also called sewage sludge, which is created when liquid is removed from wastewater in a treatment plant. In Oklahoma, researchers suspect the MCCPs they identified came from biosolid fertilizer in the fields near where they set up their instrument.
    “When sewage sludges are spread across the fields, those toxic compounds could be released into the air,” Katz said. “We can’t show directly that that’s happening, but we think it’s a reasonable way that they could be winding up in the air. Sewage sludge fertilizers have been shown to release similar compounds.”
    MCCPs’ little cousins, Short Chain Chlorinated Paraffins (SCCPs), are currently regulated by the Stockholm Convention and, since 2009, by the EPA here in the United States. Regulation came after studies found the toxic pollutants, which travel far and last a long time in the atmosphere, were harmful to human health. But researchers hypothesize that the regulation of SCCPs may have increased MCCPs in the environment.
    “We always have these unintended consequences of regulation, where you regulate something, and then there’s still a need for the products that those were in,” said Ellie Browne, CU Boulder chemistry professor, CIRES Fellow, and co-author of the study. “So they get replaced by something.”
    Measurement of aerosols led to a new and surprising discovery
    Using a nitrate chemical ionization mass spectrometer, which allows scientists to identify chemical compounds in the air, the team measured air at the agricultural site 24 hours a day for one month. As Katz cataloged the data, he documented the different isotopic patterns in the compounds. The compounds measured by the team had distinct patterns, and he noticed new patterns that he immediately identified as different from the known chemical compounds. With some additional research, he identified them as chlorinated paraffins found in MCCPs.
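    The “isotopic patterns” that tipped Katz off can be sketched with a few lines of arithmetic. This is a hypothetical illustration of the general principle, not the study’s actual analysis code: chlorine occurs as roughly 75.8% ³⁵Cl and 24.2% ³⁷Cl, so a compound with n chlorine atoms shows M, M+2, …, M+2n peaks whose relative heights follow a binomial distribution, a fingerprint that stands out from ordinary organics.

```python
# Relative isotope-peak intensities for a molecule with n chlorine
# atoms. Chlorine's two stable isotopes make polychlorinated compounds
# easy to spot in mass spectra.
from math import comb

P35, P37 = 0.7577, 0.2423  # natural abundances of 35Cl and 37Cl

def chlorine_pattern(n):
    """Intensities of the M, M+2, ..., M+2n peaks (they sum to 1)."""
    return [comb(n, k) * P35 ** (n - k) * P37 ** k for k in range(n + 1)]

# For a congener with 6 chlorines, the M+2 peak actually exceeds M:
pattern = chlorine_pattern(6)
print(pattern[1] > pattern[0])  # True
```

    A single-chlorine compound shows the familiar roughly 3:1 M to M+2 ratio; as the chlorine count climbs, the comb spreads out, which is what makes these patterns stand apart from unchlorinated compounds at a glance.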
    Katz says the makeup of MCCPs is similar to that of PFAS, long-lasting toxic chemicals that break down slowly over time. Known as “forever chemicals,” their presence in soils recently led the Oklahoma Senate to ban biosolid fertilizer.
    Now that researchers know how to measure MCCPs, the next step might be to measure the pollutants at different times throughout the year to understand how levels change each season. Many unknowns surrounding MCCPs remain, and there’s much more to learn about their environmental impacts.
    “We identified them, but we still don’t know exactly what they do when they are in the atmosphere, and they need to be investigated further,” Katz said. “I think it’s important that we continue to have governmental agencies that are capable of evaluating the science and regulating these chemicals as necessary for public health and safety.”
    Reference: “Real-Time Measurements of Gas-Phase Medium-Chain Chlorinated Paraffins Reveal Daily Changes in Gas-Particle Partitioning Controlled by Ambient Temperature” by Daniel John Katz, Bri Dobson, Mitchell Alton, Harald Stark, Douglas R. Worsnop, Manjula R. Canagaratna and Eleanor C. Browne, 5 June 2025, ACS Environmental Au.
    DOI: 10.1021/acsenvironau.5c00038
  • Blood decals that stay in the world

    Hello, I currently have a Niagara system set up that spawns blood decals in the world when the particles collide with something, but the particles obviously have a lifetime and don’t persist indefinitely. How would I go about making the decals stay? I guessed spawning decals with a Blueprint actor, but I honestly have no idea where to start.
  • UMass and MIT Test Cold Spray 3D Printing to Repair Aging Massachusetts Bridge

    Researchers from the US-based University of Massachusetts Amherst (UMass), in collaboration with the Massachusetts Institute of Technology (MIT) Department of Mechanical Engineering, have applied cold spray to repair the deteriorating “Brown Bridge” in Great Barrington, built in 1949. The project marks the first known use of this method on bridge infrastructure and aims to evaluate its effectiveness as a faster, more cost-effective, and less disruptive alternative to conventional repair techniques.
    “Now that we’ve completed this proof-of-concept repair, we see a clear path to a solution that is much faster, less costly, easier, and less invasive,” said Simos Gerasimidis, associate professor of civil and environmental engineering at the University of Massachusetts Amherst. “To our knowledge, this is a first. Of course, there is some R&D that needs to be developed, but this is a huge milestone to that,” he added.
    The pilot project is also a collaboration with the Massachusetts Department of Transportation (MassDOT), the Massachusetts Technology Collaborative (MassTech), the U.S. Department of Transportation, and the Federal Highway Administration. It was supported by the Massachusetts Manufacturing Innovation Initiative, which provided essential equipment for the demonstration.
    Members of the UMass Amherst and MIT Department of Mechanical Engineering research team, led by Simos Gerasimidis. Photo via UMass Amherst.
    Tackling America’s Bridge Crisis with Cold Spray Technology
    Nearly half of the bridges across the United States are in “fair” condition, while 6.8% are classified as “poor,” according to the 2025 Report Card for America’s Infrastructure. In Massachusetts, about 9% of the state’s 5,295 bridges are considered structurally deficient. The costs of restoring this infrastructure are projected to exceed $190 billion—well beyond current funding levels.
    The cold spray method consists of propelling metal powder particles at high velocity onto the beam’s surface. Successive applications build up additional layers, helping restore its thickness and structural integrity. This method has successfully been used to repair large structures such as submarines, airplanes, and ships, but this marks the first instance of its application to a bridge.
    One of cold spray’s key advantages is its ability to be deployed with minimal traffic disruption. “Every time you do repairs on a bridge you have to block traffic, you have to make traffic controls for substantial amounts of time,” explained Gerasimidis. “This will allow us to [work] on this actual bridge while cars are going.”
    To enhance precision, the research team integrated 3D LiDAR scanning technology into the process. Unlike visual inspections, which can be subjective and time-consuming, LiDAR creates high-resolution digital models that pinpoint areas of corrosion. This allows teams to develop targeted repair plans and deposit materials only where needed—reducing waste and potentially extending a bridge’s lifespan.
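    The repair-planning idea, comparing a scan against the as-built geometry and depositing only where metal is missing, can be sketched in a few lines. This is a hypothetical illustration, not the team’s software; `NOMINAL_MM`, `THRESHOLD_MM`, and the grid values are invented numbers.

```python
# Sketch of LiDAR-guided repair planning: flag scan cells whose
# measured thickness has fallen too far below the as-built value,
# and record how much material cold spray would need to restore.

NOMINAL_MM = 9.5     # hypothetical as-built flange thickness
THRESHOLD_MM = 1.5   # hypothetical acceptable section loss

def repair_map(scanned_mm):
    """scanned_mm: 2D grid of measured thicknesses, one per scan cell.
    Returns (row, col, depth_to_deposit) for every cell past threshold."""
    targets = []
    for i, row in enumerate(scanned_mm):
        for j, thickness in enumerate(row):
            loss = NOMINAL_MM - thickness
            if loss > THRESHOLD_MM:
                targets.append((i, j, round(loss, 2)))
    return targets

grid = [
    [9.4, 9.3, 7.6],   # one corroded cell at the end of this row
    [9.5, 7.9, 9.2],
]
print(repair_map(grid))  # [(0, 2, 1.9), (1, 1, 1.6)]
```

    Depositing only at the flagged cells is the “reducing waste” point in the article: the scan turns a subjective visual inspection into a targeted list of spots and deposit depths.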
    Next steps: Testing Cold-Sprayed Repairs
    The bridge is scheduled for demolition in the coming years. When that happens, researchers will retrieve the repaired sections for further analysis. They plan to assess the durability, corrosion resistance, and mechanical performance of the cold-sprayed steel in real-world conditions, comparing it to results from laboratory tests.
    “This is a tremendous collaboration where cutting-edge technology is brought to address a critical need for infrastructure in the commonwealth and across the United States,” said John Hart, Class of 1922 Professor in the Department of Mechanical Engineering at MIT. “I think we’re just at the beginning of a digital transformation of bridge inspection, repair and maintenance, among many other important use cases.”
    3D Printing for Infrastructure Repairs
    Beyond cold spray techniques, other innovative 3D printing methods are emerging to address construction repair challenges. For example, researchers at University College London (UCL) have developed an asphalt 3D printer specifically designed to repair road cracks and potholes. “The material properties of 3D printed asphalt are tunable, and combined with the flexibility and efficiency of the printing platform, this technique offers a compelling new design approach to the maintenance of infrastructure,” the UCL team explained.
    Similarly, in 2018, Cintec, a Wales-based international structural engineering firm, contributed to restoring the historic Government building known as the Red House in the Republic of Trinidad and Tobago. This project, managed by Cintec’s North American branch, marked the first use of additive manufacturing within sacrificial structures. It also featured the installation of what are claimed to be the longest reinforcement anchors ever inserted into a structure—measuring an impressive 36.52 meters.
    Join our Additive Manufacturing Advantageevent on July 10th, where AM leaders from Aerospace, Space, and Defense come together to share mission-critical insights. Online and free to attend.Secure your spot now.
    Who won the2024 3D Printing Industry Awards?
    Subscribe to the 3D Printing Industry newsletterto keep up with the latest 3D printing news.
    You can also follow us onLinkedIn, and subscribe to the 3D Printing Industry Youtube channel to access more exclusive content.
    Featured image shows members of the UMass Amherst and MIT Department of Mechanical Engineering research team, led by Simos Gerasimidis. Photo via UMass Amherst.
    #umass #mit #test #cold #spray
    UMass and MIT Test Cold Spray 3D Printing to Repair Aging Massachusetts Bridge
    Researchers from the US-based University of Massachusetts Amherst, in collaboration with the Massachusetts Institute of TechnologyDepartment of Mechanical Engineering, have applied cold spray to repair the deteriorating “Brown Bridge” in Great Barrington, built in 1949. The project marks the first known use of this method on bridge infrastructure and aims to evaluate its effectiveness as a faster, more cost-effective, and less disruptive alternative to conventional repair techniques. “Now that we’ve completed this proof-of-concept repair, we see a clear path to a solution that is much faster, less costly, easier, and less invasive,” said Simos Gerasimidis, associate professor of civil and environmental engineering at the University of Massachusetts Amherst. “To our knowledge, this is a first. Of course, there is some R&D that needs to be developed, but this is a huge milestone to that,” he added. The pilot project is also a collaboration with the Massachusetts Department of Transportation, the Massachusetts Technology Collaborative, the U.S. Department of Transportation, and the Federal Highway Administration. It was supported by the Massachusetts Manufacturing Innovation Initiative, which provided essential equipment for the demonstration. Members of the UMass Amherst and MIT Department of Mechanical Engineering research team, led by Simos Gerasimidis. Photo via UMass Amherst. Tackling America’s Bridge Crisis with Cold Spray Technology Nearly half of the bridges across the United States are in “fair” condition, while 6.8% are classified as “poor,” according to the 2025 Report Card for America’s Infrastructure. In Massachusetts, about 9% of the state’s 5,295 bridges are considered structurally deficient. The costs of restoring this infrastructure are projected to exceed billion—well beyond current funding levels.  The cold spray method consists of propelling metal powder particles at high velocity onto the beam’s surface. 
Successive applications build up additional layers, helping restore its thickness and structural integrity. This method has successfully been used to repair large structures such as submarines, airplanes, and ships, but this marks the first instance of its application to a bridge. One of cold spray’s key advantages is its ability to be deployed with minimal traffic disruption.  “Every time you do repairs on a bridge you have to block traffic, you have to make traffic controls for substantial amounts of time,” explained Gerasimidis. “This will allow us toon this actual bridge while cars are going.” To enhance precision, the research team integrated 3D LiDAR scanning technology into the process. Unlike visual inspections, which can be subjective and time-consuming, LiDAR creates high-resolution digital models that pinpoint areas of corrosion. This allows teams to develop targeted repair plans and deposit materials only where needed—reducing waste and potentially extending a bridge’s lifespan. Next steps: Testing Cold-Sprayed Repairs The bridge is scheduled for demolition in the coming years. When that happens, researchers will retrieve the repaired sections for further analysis. They plan to assess the durability, corrosion resistance, and mechanical performance of the cold-sprayed steel in real-world conditions, comparing it to results from laboratory tests. “This is a tremendous collaboration where cutting-edge technology is brought to address a critical need for infrastructure in the commonwealth and across the United States,” said John Hart, Class of 1922 Professor in the Department of Mechanical Engineering at MIT. “I think we’re just at the beginning of a digital transformation of bridge inspection, repair and maintenance, among many other important use cases.” 3D Printing for Infrastructure Repairs Beyond cold spray techniques, other innovative 3D printing methods are emerging to address construction repair challenges. 
For example, researchers at University College Londonhave developed an asphalt 3D printer specifically designed to repair road cracks and potholes. “The material properties of 3D printed asphalt are tunable, and combined with the flexibility and efficiency of the printing platform, this technique offers a compelling new design approach to the maintenance of infrastructure,” the UCL team explained. Similarly, in 2018, Cintec, a Wales-based international structural engineering firm, contributed to restoring the historic Government building known as the Red House in the Republic of Trinidad and Tobago. This project, managed by Cintec’s North American branch, marked the first use of additive manufacturing within sacrificial structures. It also featured the installation of what are claimed to be the longest reinforcement anchors ever inserted into a structure—measuring an impressive 36.52 meters. Join our Additive Manufacturing Advantageevent on July 10th, where AM leaders from Aerospace, Space, and Defense come together to share mission-critical insights. Online and free to attend.Secure your spot now. Who won the2024 3D Printing Industry Awards? Subscribe to the 3D Printing Industry newsletterto keep up with the latest 3D printing news. You can also follow us onLinkedIn, and subscribe to the 3D Printing Industry Youtube channel to access more exclusive content. Featured image shows members of the UMass Amherst and MIT Department of Mechanical Engineering research team, led by Simos Gerasimidis. Photo via UMass Amherst. #umass #mit #test #cold #spray
    3DPRINTINGINDUSTRY.COM
    UMass and MIT Test Cold Spray 3D Printing to Repair Aging Massachusetts Bridge
    Researchers from the US-based University of Massachusetts Amherst (UMass), in collaboration with the Massachusetts Institute of Technology (MIT) Department of Mechanical Engineering, have applied cold spray to repair the deteriorating “Brown Bridge” in Great Barrington, built in 1949. The project marks the first known use of this method on bridge infrastructure and aims to evaluate its effectiveness as a faster, more cost-effective, and less disruptive alternative to conventional repair techniques.
    “Now that we’ve completed this proof-of-concept repair, we see a clear path to a solution that is much faster, less costly, easier, and less invasive,” said Simos Gerasimidis, associate professor of civil and environmental engineering at the University of Massachusetts Amherst. “To our knowledge, this is a first. Of course, there is some R&D that needs to be developed, but this is a huge milestone to that,” he added.
    The pilot project is also a collaboration with the Massachusetts Department of Transportation (MassDOT), the Massachusetts Technology Collaborative (MassTech), the U.S. Department of Transportation, and the Federal Highway Administration. It was supported by the Massachusetts Manufacturing Innovation Initiative, which provided essential equipment for the demonstration.

    Members of the UMass Amherst and MIT Department of Mechanical Engineering research team, led by Simos Gerasimidis (left, standing). Photo via UMass Amherst.

    Tackling America’s Bridge Crisis with Cold Spray Technology
    Nearly half of the bridges across the United States are in “fair” condition, while 6.8% are classified as “poor,” according to the 2025 Report Card for America’s Infrastructure. In Massachusetts, about 9% of the state’s 5,295 bridges are considered structurally deficient. The costs of restoring this infrastructure are projected to exceed $190 billion, well beyond current funding levels.
    The cold spray method propels metal powder particles at high velocity onto the beam’s surface. Successive applications build up additional layers, helping restore its thickness and structural integrity. The method has been used successfully to repair large structures such as submarines, airplanes, and ships, but this marks the first instance of its application to a bridge. One of cold spray’s key advantages is that it can be deployed with minimal traffic disruption.
    “Every time you do repairs on a bridge you have to block traffic, you have to make traffic controls for substantial amounts of time,” explained Gerasimidis. “This will allow us to [apply the technique] on this actual bridge while cars are going [across].”
    To enhance precision, the research team integrated 3D LiDAR scanning into the process. Unlike visual inspections, which can be subjective and time-consuming, LiDAR creates high-resolution digital models that pinpoint areas of corrosion. This allows teams to develop targeted repair plans and deposit material only where needed, reducing waste and potentially extending a bridge’s lifespan.

    Next Steps: Testing Cold-Sprayed Repairs
    The bridge is scheduled for demolition in the coming years. When that happens, researchers will retrieve the repaired sections for further analysis. They plan to assess the durability, corrosion resistance, and mechanical performance of the cold-sprayed steel in real-world conditions, comparing it to results from laboratory tests.
    “This is a tremendous collaboration where cutting-edge technology is brought to address a critical need for infrastructure in the commonwealth and across the United States,” said John Hart, Class of 1922 Professor in the Department of Mechanical Engineering at MIT. “I think we’re just at the beginning of a digital transformation of bridge inspection, repair and maintenance, among many other important use cases.”

    3D Printing for Infrastructure Repairs
    Beyond cold spray, other 3D printing methods are emerging to address construction repair challenges. For example, researchers at University College London (UCL) have developed an asphalt 3D printer specifically designed to repair road cracks and potholes. “The material properties of 3D printed asphalt are tunable, and combined with the flexibility and efficiency of the printing platform, this technique offers a compelling new design approach to the maintenance of infrastructure,” the UCL team explained.
    Similarly, in 2018, Cintec, a Wales-based international structural engineering firm, contributed to restoring the historic government building known as the Red House in the Republic of Trinidad and Tobago. The project, managed by Cintec’s North American branch, marked the first use of additive manufacturing within sacrificial structures. It also featured the installation of what are claimed to be the longest reinforcement anchors ever inserted into a structure, measuring 36.52 meters.

    Featured image shows members of the UMass Amherst and MIT Department of Mechanical Engineering research team, led by Simos Gerasimidis (left, standing). Photo via UMass Amherst.
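    As a quick sanity check on the Massachusetts figures quoted above, a back-of-envelope calculation (not from the article) gives the rough number of structurally deficient bridges in the state:

```python
# About 9% of Massachusetts's 5,295 bridges are considered
# structurally deficient, per the article's cited figures.
total_bridges = 5_295
deficient = round(0.09 * total_bridges)
print(deficient)  # 477
```

    So the cited percentage corresponds to nearly 500 individual structures, which helps explain why a repair method that avoids lane closures is attractive.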
  • Graduate Student Develops an A.I.-Based Approach to Restore Time-Damaged Artwork to Its Former Glory

    The method could help bring countless old paintings, currently stored in the back rooms of galleries with limited conservation budgets, to light

    Scans of the painting retouched with a new technique during various stages in the process. On the right is the restored painting with the applied laminate mask.
    Courtesy of the researchers via MIT

    In a contest for jobs requiring the most patience, art restoration might take first place. Traditionally, conservators restore paintings by recreating the artwork’s exact colors to fill in the damage, one spot at a time. Even with the help of X-ray imaging and pigment analyses, several parts of the expensive process, such as the cleaning and retouching, are done by hand, as noted by Artnet’s Jo Lawson-Tancred.
    Now, a mechanical engineering graduate student at MIT has developed an artificial intelligence-based approach that can achieve a faithful restoration in just hours—instead of months of work.
    In a paper published Wednesday in the journal Nature, Alex Kachkine describes a new method that applies digital restorations to paintings by placing a thin film on top. If the approach becomes widespread, it could make art restoration more accessible and help bring countless damaged paintings, currently stored in the back rooms of galleries with limited conservation budgets, back to light.
    The new technique “is a restoration process that saves a lot of time and money, while also being reversible, which some people feel is really important to preserving the underlying character of a piece,” Kachkine tells Nature’s Amanda Heidt.

    Meet the engineer who invented an AI-powered way to restore art

    While filling in damaged areas of a painting would seem like a logical solution to many people, direct retouching raises ethical concerns for modern conservators. That’s because an artwork’s damage is part of its history, and retouching might detract from the painter’s original vision. “For example, instead of removing flaking paint and retouching the painting, a conservator might try to fix the loose paint particles to their original places,” writes Hartmut Kutzke, a chemist at the University of Oslo’s Museum of Cultural History, for Nature News and Views. If retouching is absolutely necessary, he adds, it should be reversible.
    As such, some institutions have started restoring artwork virtually and presenting the restoration next to the untouched, physical version. Many art lovers might argue, however, that a digital restoration printed out or displayed on a screen doesn’t quite compare to seeing the original painting in its full glory.
    That’s where Kachkine, who is also an art collector and amateur conservator, comes in. The MIT student has developed a way to apply digital restorations onto a damaged painting. In short, the approach involves using pre-existing A.I. tools to create a digital version of what the freshly painted artwork would have looked like. Based on this reconstruction, Kachkine’s new software assembles a map of the retouches, and their exact colors, necessary to fill the gaps present in the painting today.
    The map is then printed onto two layers of thin, transparent polymer film—one with colored retouches and one with the same pattern in white—that attach to the painting with conventional varnish. This “mask” aligns the retouches with the gaps while leaving the rest of the artwork visible.
    “In order to fully reproduce color, you need both white and color ink to get the full spectrum,” Kachkine explains in an MIT statement. “If those two layers are misaligned, that’s very easy to see. So, I also developed a few computational tools, based on what we know of human color perception, to determine how small of a region we can practically align and restore.”
    The method’s magic lies in the fact that the mask is removable, and the digital file provides a record of the modifications for future conservators to study.
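    The workflow described above (reconstruct the losses digitally, map each damaged pixel, then print a color layer and a matching white underlayer) can be sketched in a few lines of Python. This is an illustrative toy, not Kachkine's actual software; the function name `make_mask_layers` and the tuple-based image conventions are assumptions for the sketch:

```python
def make_mask_layers(reconstruction, damage_mask):
    """Given an AI-reconstructed image (rows of (r, g, b) values, 0..1)
    and a same-shaped boolean damage mask, build the two print layers
    described in the article: a colored retouch layer and a matching
    white underlayer, both as rows of (r, g, b, alpha) values. Pixels
    outside the damaged regions stay transparent (alpha 0) so the
    original paint shows through the film."""
    color_layer, white_layer = [], []
    for img_row, mask_row in zip(reconstruction, damage_mask):
        c_row, w_row = [], []
        for (r, g, b), damaged in zip(img_row, mask_row):
            if damaged:
                c_row.append((r, g, b, 1.0))        # print the retouch color
                w_row.append((1.0, 1.0, 1.0, 1.0))  # white ink, same pattern
            else:
                c_row.append((0.0, 0.0, 0.0, 0.0))  # transparent film
                w_row.append((0.0, 0.0, 0.0, 0.0))
        color_layer.append(c_row)
        white_layer.append(w_row)
    return color_layer, white_layer

# Toy example: a 2x2 "painting" with one damaged pixel at row 0, col 1.
recon = [[(0.5, 0.4, 0.3), (0.5, 0.4, 0.3)],
         [(0.5, 0.4, 0.3), (0.5, 0.4, 0.3)]]
mask = [[False, True], [False, False]]
color, white = make_mask_layers(recon, mask)
print(color[0][1])  # (0.5, 0.4, 0.3, 1.0) -- only this pixel gets printed
```

    Because both layers share one damage mask, any misregistration between them is immediately visible, which is why the alignment tolerances Kachkine mentions matter.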
    Kachkine demonstrated the approach on a 15th-century oil painting, by a Dutch artist whose name is now unknown, that was in dire need of restoration. The retouches were generated by matching the surrounding color, replicating similar patterns visible elsewhere in the painting, or copying the artist’s style in other paintings, per Nature News and Views. Overall, the painting’s 5,612 damaged regions were filled with 57,314 different colors in 3.5 hours, roughly 66 hours faster than traditional methods would likely have taken.

    Overview of Physically-Applied Digital Restoration

    “It followed years of effort to try to get the method working,” Kachkine tells the Guardian’s Ian Sample. “There was a fair bit of relief that finally this method was able to reconstruct and stitch together the surviving parts of the painting.”
    The new process still poses ethical considerations, such as whether the applied film disrupts the viewing experience or whether A.I.-generated corrections to the painting are accurate. Additionally, Kutzke writes for Nature News and Views that the effect of the varnish on the painting should be studied more deeply.
    Still, Kachkine says this technique could help address the large number of damaged artworks that live in storage rooms. “This approach grants greatly increased foresight and flexibility to conservators,” per the study, “enabling the restoration of countless damaged paintings deemed unworthy of high conservation budgets.”

    WWW.SMITHSONIANMAG.COM
  • How a planetarium show discovered a spiral at the edge of our solar system

    If you’ve ever flown through outer space, at least while watching a documentary or a science fiction film, you’ve seen how artists turn astronomical findings into stunning visuals. But in the process of visualizing data for their latest planetarium show, a production team at New York’s American Museum of Natural History made a surprising discovery of their own: a trillion-and-a-half-mile-long spiral of material drifting along the edge of our solar system.

    “So this is a really fun thing that happened,” says Jackie Faherty, the museum’s senior scientist.

    Last winter, Faherty and her colleagues were beneath the dome of the museum’s Hayden Planetarium, fine-tuning a scene that featured the Oort cloud, the big, thick bubble surrounding our Sun and planets that’s filled with ice and rock and other remnants from the solar system’s infancy. The Oort cloud begins far beyond Neptune and extends to around one and a half light-years from the Sun. It has never been directly observed; its existence is inferred from the behavior of long-period comets entering the inner solar system. The cloud is so expansive that the Voyager spacecraft, our most distant probes, would need another 250 years just to reach its inner boundary; to reach the other side, they would need about 30,000 years.

    The 30-minute show, Encounters in the Milky Way, narrated by Pedro Pascal, guides audiences on a trip through the galaxy across billions of years. For a section about our nascent solar system, the writing team decided “there’s going to be a fly-by” of the Oort cloud, Faherty says. “But what does our Oort cloud look like?” 

    To find out, the museum consulted astronomers and turned to David Nesvorný, a scientist at the Southwest Research Institute in San Antonio. He provided his model of the millions of particles believed to make up the Oort cloud, based on extensive observational data.

    “Everybody said, go talk to Nesvorný. He’s got the best model,” says Faherty. And “everybody told us, ‘There’s structure in the model,’ so we were kind of set up to look for stuff,” she says. 

    The museum’s technical team began using Nesvorný’s model to simulate how the cloud evolved over time. Later, as the team projected versions of the fly-by scene into the dome, with the camera looking back at the Oort cloud, they saw a familiar shape, one that appears in galaxies, Saturn’s rings, and disks around young stars.

    “We’re flying away from the Oort cloud and out pops this spiral, a spiral shape to the outside of our solar system,” Faherty marveled. “A huge structure, millions and millions of particles.”

    She emailed Nesvorný to ask for “more particles,” with a render of the scene attached. “We noticed the spiral of course,” she wrote. “And then he writes me back: ‘what are you talking about, a spiral?’” 

    While fine-tuning a simulation of the Oort cloud, a vast expanse of icy material left over from the birth of our Sun, the ‘Encounters in the Milky Way’ production team noticed a very clear shape: a structure made of billions of comets and shaped like a spiral-armed galaxy, seen here in a scene from the final Space Show.

    More simulations ensued, this time on Pleiades, a powerful NASA supercomputer. In high-performance computer simulations spanning 4.6 billion years, starting from the solar system’s earliest days, the researchers visualized how the initial icy and rocky ingredients of the Oort cloud began circling the Sun in the elliptical orbits that are thought to give the cloud its rough disc shape. The simulations also incorporated the physics of the Sun’s gravitational pull, the influences of our Milky Way galaxy, and the movements of the comets themselves.

    In each simulation, the spiral persisted.

    “No one has ever seen the Oort structure like that before,” says Faherty. Nesvorný “has a great quote about this: ‘The math was all there. We just needed the visuals.’” 

    An illustration of the Kuiper Belt and Oort cloud in relation to our solar system.

    As the Oort cloud grew with the early solar system, Nesvorný and his colleagues hypothesize that the galactic tide, or the gravitational force from the Milky Way, disrupted the orbits of some comets. Although the Sun pulls these objects inward, the galaxy’s gravity appears to have twisted part of the Oort cloud outward, forming a spiral tilted roughly 30 degrees from the plane of the solar system.

    “As the galactic tide acts to decouple bodies from the scattered disk it creates a spiral structure in physical space that is roughly 15,000 astronomical units in length,” or around 1.4 trillion miles from one end to the other, the researchers write in a paper that was published in March in the Astrophysical Journal. “The spiral is long-lived and persists in the inner Oort Cloud to the present time.”
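    A quick unit check of the figure quoted above (a back-of-envelope conversion, assuming 1 astronomical unit is about 92.96 million miles):

```python
# Convert the paper's 15,000 AU spiral length into miles.
AU_MILES = 92_955_807.0          # one astronomical unit, in miles
length_miles = 15_000 * AU_MILES
print(f"{length_miles:.2e}")     # 1.39e+12 -- about 1.4 trillion miles
```

    That matches the "around 1.4 trillion miles" figure given for the structure's end-to-end length.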

    “The physics makes sense,” says Faherty. “Scientists, we’re amazing at what we do, but it doesn’t mean we can see everything right away.”

    It helped that the team behind the space show was primed to look for something, says Carter Emmart, the museum’s director of astrovisualization and director of Encounters. Astronomers had described Nesvorný’s model as having “a structure,” which intrigued the team’s artists. “We were also looking for structure so that it wouldn’t just be sort of like a big blob,” he says. “Other models were also revealing this—but they just hadn’t been visualized.”

    The museum’s attempts to simulate nature date back to its first habitat dioramas in the early 1900s, which brought visitors to places that hadn’t yet been captured by color photos, TV, or the web. The planetarium, a night sky simulator for generations of would-be scientists and astronauts, got its start after financier Charles Hayden bought the museum its first Zeiss projector. The planetarium now boasts one of the world’s few Zeiss Mark IX systems.

    Still, these days the star projector is rarely used, Emmart says, now that fulldome laser projectors can turn the old static starfield into 3D video running at 60 frames per second. The Hayden boasts six custom-built Christie projectors, part of what the museum’s former president called “the most advanced planetarium ever attempted.”

    In about 1.3 million years, the star system Gliese 710 is set to pass directly through our Oort cloud, an event visualized in a dramatic scene in ‘Encounters in the Milky Way.’ During its flyby, our systems will swap icy comets, flinging some out on new paths.

    Emmart recalls how in 1998, when he and other museum leaders were imagining the future of space shows at the Hayden, now with the help of digital projectors and computer graphics, there were questions over how much space they could try to show.

    “We’re talking about these astronomical data sets we could plot to make the galaxy and the stars,” he says. “Of course, we knew that we would have this star projector, but we really wanted to emphasize astrophysics with this dome video system. I was drawing pictures of this just to get our heads around it and noting that the tilt of the solar system to the Milky Way is about 60 degrees. And I said, ‘What are we gonna do when we get outside the Milky Way?’”

    “Then Neil deGrasse Tyson goes, ‘Whoa, whoa, whoa, Carter, we have enough to do. And just plotting the Milky Way, that’s hard enough.’ And I said, ‘Well, when we exit the Milky Way and we don’t see any other galaxies, that’s sort of like astronomy in 1920: we thought maybe the entire universe was just the Milky Way.’”

    “And that kind of led to a chaotic discussion about, well, what other data sets are there for this?” Emmart adds.

    The museum worked with astronomer Brent Tully, who had mapped 3,500 galaxies beyond the Milky Way in collaboration with the National Center for Supercomputing Applications. “That was it,” he says, “and that seemed fantastical.”

    By the time the first planetarium show opened at the museum’s new Rose Center for Earth and Space in 2000, Tully had broadened his survey to “an amazing” 30,000 galaxies. The Sloan Digital Sky Survey followed, now at data release 18, with six million galaxies.

    To build the map of the universe that underlies Encounters, the team also relied on data from the European Space Agency’s space observatory, Gaia. Launched in 2013 and powered down in March of this year, Gaia brought unprecedented precision to our astronomical map, plotting the distances between 1.7 billion stars. To visualize and render the simulated data, Jon Parker, the museum’s lead technical director, relied on Houdini, a 3D animation tool by Toronto-based SideFX.

    The goal is immersion, “whether it’s in front of the buffalo downstairs, and seeing what those herds were like before we decimated them, to coming in this room and being teleported to space, with an accurate foundation in the science,” Emmart says. “But the art is important, because the art is the way to the soul.” 

    The museum, he adds, is “a testament to wonder. And I think wonder is a gateway to inspiration, and inspiration is a gateway to motivation.”

    3D visuals aren’t just powerful tools for communicating science, but increasingly crucial for science itself. Software like OpenSpace, an open-source simulation tool developed by the museum, along with the growing availability of high-performance computing, is making it easier to build highly detailed visuals of ever larger and more complex collections of data.
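Placing a sky catalog in 3D space boils down to converting each entry’s sky coordinates and distance into Cartesian positions. A minimal sketch of that standard transform (the function name is my own illustration, not OpenSpace’s actual API):

```python
import math

def equatorial_to_cartesian(ra_deg: float, dec_deg: float, dist: float):
    """Convert right ascension/declination (degrees) plus distance
    into x, y, z in the same units as dist. Illustrative helper only."""
    ra, dec = math.radians(ra_deg), math.radians(dec_deg)
    x = dist * math.cos(dec) * math.cos(ra)
    y = dist * math.cos(dec) * math.sin(ra)
    z = dist * math.sin(dec)
    return x, y, z

# A star at RA 0°, Dec 0°, 10 parsecs away sits on the +x axis.
```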

    “Anytime we look, literally, from a different angle at catalogs of astronomical positions, simulations, or exploring the phase space of a complex data set, there is great potential to discover something new,” says Brian R. Kent, an astronomer and director of science communications at the National Radio Astronomy Observatory. “There is also a wealth of astronomical data in archives that can be reanalyzed in new ways, leading to new discoveries.”

    As the instruments grow in size and sophistication, so does the data, and the challenge of understanding it. Like all scientists, astronomers are facing a deluge of data, ranging from gamma rays and X-rays to ultraviolet, optical, infrared, and radio bands.

    Our Oort cloud, a shell of icy bodies that surrounds the solar system and extends one and a half light years in every direction, is shown in a scene from ‘Encounters in the Milky Way’ along with the Oort clouds of neighboring stars; the more massive the star, the larger its Oort cloud. [Image: © AMNH]

    “New facilities like the Next Generation Very Large Array here at NRAO or the Vera Rubin Observatory and LSST survey project will generate large volumes of data, so astronomers have to get creative with how to analyze it,” says Kent.

    More data—and new instruments—will also be needed to prove the spiral itself is actually there: there’s still no known way to even observe the Oort cloud. 

    Instead, the paper notes, the structure will have to be measured from “detection of a large number of objects” in the radius of the inner Oort cloud or from “thermal emission from small particles in the Oort spiral.” 

    The Vera C. Rubin Observatory, a powerful, U.S.-funded telescope that recently began operation in Chile, could possibly observe individual icy bodies within the cloud. But researchers expect the telescope will likely discover only dozens of these objects, maybe hundreds, not enough to meaningfully visualize any shapes in the Oort cloud. 

    For us, here and now, the 1.4 trillion mile-long spiral will remain confined to the inside of a dark dome across the street from Central Park.
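As a sanity check on that scale: the researchers’ paper puts the spiral at roughly 15,000 astronomical units end to end, and multiplying by the length of an AU recovers the 1.4-trillion-mile figure quoted above.

```python
AU_IN_MILES = 92_955_807.3   # one astronomical unit, in miles

spiral_au = 15_000           # length reported in the Astrophysical Journal paper
spiral_miles = spiral_au * AU_IN_MILES
# ~1.39e12 miles, i.e. about 1.4 trillion miles
```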
    How a planetarium show discovered a spiral at the edge of our solar system
    If you’ve ever flown through outer space, at least while watching a documentary or a science fiction film, you’ve seen how artists turn astronomical findings into stunning visuals. But in the process of visualizing data for their latest planetarium show, a production team at New York’s American Museum of Natural History made a surprising discovery of their own: a trillion-and-a-half mile long spiral of material drifting along the edge of our solar system. “So this is a really fun thing that happened,” says Jackie Faherty, the museum’s senior scientist. Last winter, Faherty and her colleagues were beneath the dome of the museum’s Hayden Planetarium, fine-tuning a scene that featured the Oort cloud, the big, thick bubble surrounding our Sun and planets that’s filled with ice and rock and other remnants from the solar system’s infancy. The Oort cloud begins far beyond Neptune, around one and a half light years from the Sun. It has never been directly observed; its existence is inferred from the behavior of long-period comets entering the inner solar system. The cloud is so expansive that the Voyager spacecraft, our most distant probes, would need another 250 years just to reach its inner boundary; to reach the other side, they would need about 30,000 years.  The 30-minute show, Encounters in the Milky Way, narrated by Pedro Pascal, guides audiences on a trip through the galaxy across billions of years. For a section about our nascent solar system, the writing team decided “there’s going to be a fly-by” of the Oort cloud, Faherty says. “But what does our Oort cloud look like?”  To find out, the museum consulted astronomers and turned to David Nesvorný, a scientist at the Southwest Research Institute in San Antonio. He provided his model of the millions of particles believed to make up the Oort cloud, based on extensive observational data. “Everybody said, go talk to Nesvorný. He’s got the best model,” says Faherty. 
And “everybody told us, ‘There’s structure in the model,’ so we were kind of set up to look for stuff,” she says.  The museum’s technical team began using Nesvorný’s model to simulate how the cloud evolved over time. Later, as the team projected versions of the fly-by scene into the dome, with the camera looking back at the Oort cloud, they saw a familiar shape, one that appears in galaxies, Saturn’s rings, and disks around young stars. “We’re flying away from the Oort cloud and out pops this spiral, a spiral shape to the outside of our solar system,” Faherty marveled. “A huge structure, millions and millions of particles.” She emailed Nesvorný to ask for “more particles,” with a render of the scene attached. “We noticed the spiral of course,” she wrote. “And then he writes me back: ‘what are you talking about, a spiral?’”  While fine-tuning a simulation of the Oort cloud, a vast expanse of ice material leftover from the birth of our Sun, the ‘Encounters in the Milky Way’ production team noticed a very clear shape: a structure made of billions of comets and shaped like a spiral-armed galaxy, seen here in a scene from the final Space ShowMore simulations ensued, this time on Pleiades, a powerful NASA supercomputer. In high-performance computer simulations spanning 4.6 billion years, starting from the Solar System’s earliest days, the researchers visualized how the initial icy and rocky ingredients of the Oort cloud began circling the Sun, in the elliptical orbits that are thought to give the cloud its rough disc shape. The simulations also incorporated the physics of the Sun’s gravitational pull, the influences from our Milky Way galaxy, and the movements of the comets themselves.  In each simulation, the spiral persisted. “No one has ever seen the Oort structure like that before,” says Faherty. Nesvorný “has a great quote about this: ‘The math was all there. 
We just needed the visuals.’”  An illustration of the Kuiper Belt and Oort Cloud in relation to our solar system.As the Oort cloud grew with the early solar system, Nesvorný and his colleagues hypothesize that the galactic tide, or the gravitational force from the Milky Way, disrupted the orbits of some comets. Although the Sun pulls these objects inward, the galaxy’s gravity appears to have twisted part of the Oort cloud outward, forming a spiral tilted roughly 30 degrees from the plane of the solar system. “As the galactic tide acts to decouple bodies from the scattered disk it creates a spiral structure in physical space that is roughly 15,000 astronomical units in length,” or around 1.4 trillion miles from one end to the other, the researchers write in a paper that was published in March in the Astrophysical Journal. “The spiral is long-lived and persists in the inner Oort Cloud to the present time.” “The physics makes sense,” says Faherty. “Scientists, we’re amazing at what we do, but it doesn’t mean we can see everything right away.” It helped that the team behind the space show was primed to look for something, says Carter Emmart, the museum’s director of astrovisualization and director of Encounters. Astronomers had described Nesvorný’s model as having “a structure,” which intrigued the team’s artists. “We were also looking for structure so that it wouldn’t just be sort of like a big blob,” he says. “Other models were also revealing this—but they just hadn’t been visualized.” The museum’s attempts to simulate nature date back to its first habitat dioramas in the early 1900s, which brought visitors to places that hadn’t yet been captured by color photos, TV, or the web. The planetarium, a night sky simulator for generations of would-be scientists and astronauts, got its start after financier Charles Hayden bought the museum its first Zeiss projector. The planetarium now boasts one of the world’s few Zeiss Mark IX systems. 
Still, these days the star projector is rarely used, Emmart says, now that fulldome laser projectors can turn the old static starfield into 3D video running at 60 frames per second. The Hayden boasts six custom-built Christie projectors, part of what the museum’s former president called “the most advanced planetarium ever attempted.”  In about 1.3 million years, the star system Gliese 710 is set to pass directly through our Oort Cloud, an event visualized in a dramatic scene in ‘Encounters in the Milky Way.’ During its flyby, our systems will swap icy comets, flinging some out on new paths.Emmart recalls how in 1998, when he and other museum leaders were imagining the future of space shows at the Hayden—now with the help of digital projectors and computer graphics—there were questions over how much space they could try to show. “We’re talking about these astronomical data sets we could plot to make the galaxy and the stars,” he says. “Of course, we knew that we would have this star projector, but we really wanted to emphasize astrophysics with this dome video system. I was drawing pictures of this just to get our heads around it and noting the tip of the solar system to the Milky Way is about 60 degrees. And I said, what are we gonna do when we get outside the Milky Way?’ “ThenNeil Degrasse Tyson “goes, ‘whoa, whoa, whoa, Carter, we have enough to do. And just plotting the Milky Way, that’s hard enough.’ And I said, ‘well, when we exit the Milky Way and we don’t see any other galaxies, that’s sort of like astronomy in 1920—we thought maybe the entire universe is just a Milky Way.'” “And that kind of led to a chaotic discussion about, well, what other data sets are there for this?” Emmart adds. The museum worked with astronomer Brent Tully, who had mapped 3500 galaxies beyond the Milky Way, in collaboration with the National Center for Super Computing Applications. 
“That was it,” he says, “and that seemed fantastical.” By the time the first planetarium show opened at the museum’s new Rose Center for Earth and Space in 2000, Tully had broadened his survey “to an amazing” 30,000 galaxies. The Sloan Digital Sky Survey followed—it’s now at data release 18—with six million galaxies. To build the map of the universe that underlies Encounters, the team also relied on data from the European Space Agency’s space observatory, Gaia. Launched in 2013 and powered down in March of this year, Gaia brought an unprecedented precision to our astronomical map, plotting the distance between 1.7 billion stars. To visualize and render the simulated data, Jon Parker, the museum’s lead technical director, relied on Houdini, a 3D animation tool by Toronto-based SideFX. The goal is immersion, “whether it’s in front of the buffalo downstairs, and seeing what those herds were like before we decimated them, to coming in this room and being teleported to space, with an accurate foundation in the science,” Emmart says. “But the art is important, because the art is the way to the soul.”  The museum, he adds, is “a testament to wonder. And I think wonder is a gateway to inspiration, and inspiration is a gateway to motivation.” Three-D visuals aren’t just powerful tools for communicating science, but increasingly crucial for science itself. Software like OpenSpace, an open source simulation tool developed by the museum, along with the growing availability of high-performance computing, are making it easier to build highly detailed visuals of ever larger and more complex collections of data. “Anytime we look, literally, from a different angle at catalogs of astronomical positions, simulations, or exploring the phase space of a complex data set, there is great potential to discover something new,” says Brian R. Kent, an astronomer and director of science communications at National Radio Astronomy Observatory. 
“There is also a wealth of astronomics tatical data in archives that can be reanalyzed in new ways, leading to new discoveries.” As the instruments grow in size and sophistication, so does the data, and the challenge of understanding it. Like all scientists, astronomers are facing a deluge of data, ranging from gamma rays and X-rays to ultraviolet, optical, infrared, and radio bands. Our Oort cloud, a shell of icy bodies that surrounds the solar system and extends one-and-a-half light years in every direction, is shown in this scene from ‘Encounters in the Milky Way’ along with the Oort clouds of neighboring stars. The more massive the star, the larger its Oort cloud“New facilities like the Next Generation Very Large Array here at NRAO or the Vera Rubin Observatory and LSST survey project will generate large volumes of data, so astronomers have to get creative with how to analyze it,” says Kent.  More data—and new instruments—will also be needed to prove the spiral itself is actually there: there’s still no known way to even observe the Oort cloud.  Instead, the paper notes, the structure will have to be measured from “detection of a large number of objects” in the radius of the inner Oort cloud or from “thermal emission from small particles in the Oort spiral.”  The Vera C. Rubin Observatory, a powerful, U.S.-funded telescope that recently began operation in Chile, could possibly observe individual icy bodies within the cloud. But researchers expect the telescope will likely discover only dozens of these objects, maybe hundreds, not enough to meaningfully visualize any shapes in the Oort cloud.  For us, here and now, the 1.4 trillion mile-long spiral will remain confined to the inside of a dark dome across the street from Central Park. #how #planetarium #show #discovered #spiral
    WWW.FASTCOMPANY.COM
  • Hell is Us terrifies in all the best ways

    Hell is Us has been on my radar since it was first announced in April 2022, and I’ve finally been able to spend some time with it via its demo. The war-torn world of Hell is Us is immediately chilling and the demo’s brief glimpse of the gameplay, despite some minor hang-ups, has me eager for more.

    You play as Remi as he ventures to the fictional country of Hadea. A civil war has broken out, dividing and devastating Hadea’s people. Remi must travel through the war zone in search of his parents, and quickly comes across a farmer who exposition-dumps plenty of information that may or may not stick. Essentially, shit is bad, tragically so, and Remi is about to discover just how bad.

    You wander around a forest while an unsettling Returnal-esque score accompanies you. Eventually you gain access to ruins that turn out to have been some sort of dungeon for prisoners long ago. It’s here that Remi encounters the first of hopefully many “oh, shit!” moments. He comes across a creepy-ass enemy I can best describe as if Spot from Spider-Man: Across the Spider-Verse was designed to horrify — a pale white humanoid with a black circle for a face who contorts around the level like a marionette. A mask-wearing woman shows up out of nowhere to take down the creepy foe, but dies saving Remi. Without explanation, Remi decides to don her poncho, take her drone, and wield her BGS.

    Turns out he’s pretty good with a sword. Remi will encounter a couple dozen enemies throughout the demo; the combat is easy to pick up and is fairly standard third-person melee, though it does rely heavily on stamina management. Your max stamina is also reduced when you take damage, so you really don’t wanna get hit much.

    You can heal using consumable med kits as well as a pulse mechanic. Attacking enemies creates floating particles around Remi and once those particles form into a circle, you can press your controller’s right bumper to activate a healing pulse. It’s an interesting mechanic, and I like how Hell is Us is giving players a way to recoup health in the midst of combat. However, actually doing it is a bit clunky; keeping one eye on an enemy and the other on the particles around Remi is distracting, and timing the pulse is a challenge — you can only activate it during a brief window, and you’ll likely be in the middle of a combo when a pulse opportunity presents itself.

    While Hell is Us’ combat has surface similarities to Soulslikes — like parrying blows from creepy enemies — it felt less punishing and more forgiving than what you’d expect from a FromSoftware title. I only died once in the demo, compared to countless deaths in the opening hours of Soulslikes such as Lies of P or Elden Ring. Notably, enemies don’t respawn when you save your game, so you don’t have to worry about repeatedly striking down the same foes.

    Because dead enemies remain dead, exploration is encouraged in Hell is Us. Developer Rogue Factor boasts that the game has “no map, no compass, no quest markers,” so you’re free to wander around the game’s world without a guiding hand and discover its secrets. For example, that farmer I mentioned earlier told Remi about how three of his sons died in this war. Later on, when exploring the World War I-like trenches outside of the ruins, I found a note from a soldier on the other side of the conflict bragging about killing three brothers “cowering in a farmhouse.”

    The note also mentioned taking a gold watch from one of the boys, which I grabbed and returned to the farmer — without a quest marker to guide me or a journal entry saying “give this item to the farmer.” This completed a “Good Deed” and I was told a reward may come from it later in the game; I’m curious how these types of quests will play out in the full release. The prospect of doing good deeds in this torn-asunder country is especially appealing.

    A Soulslike-adjacent game placing greater emphasis on user-guided exploration than combat sounds enticing, and Hell is Us is delivering on that promise so far. Its demo is available on Steam through June 16 before the full game launches Sept. 4 for PC, PlayStation 5, and Xbox Series X.
    #hell #terrifies #all #best #ways
    Hell is Us terrifies in all the best ways
    Hell is Us has been on my radar since it was first announced in April 2022, and I’ve finally been able to spend some time with it via its demo. The war-torn world of Hell is Us is immediately chilling and the demo’s brief glimpse of the gameplay, despite some minor hang-ups, has me eager for more. You play as Remi as he ventures to the fictional country of Hadea. A civil war has broken out, dividing and devastating Hadea’s people. Remi must travel through the war zone in search of his parents, and quickly comes across a farmer who exposition-dumps plenty of information that may or may not stick. Essentially, shit is bad, tragically so, and Remi is about to discover just how bad. You wander around a forest while an unsettling Returnal-esque score accompanies you. Eventually you gain access to ruins that turn out to have been some sort of dungeon for prisoners long ago. It’s here that Remi encounters the first of hopefully many “oh, shit!” moments. He comes across a creepy-ass enemy I can best describe as if Spot from Spider-Man: Across the Spider-Verse was designed to horrify — a pale white humanoid with a black circle for a face who contorts around the level like a marionette. A mask-wearing woman shows up out of nowhere to take down the creepy foe, but dies saving Remi. Without explanation, Remi decides to don her poncho, take her drone, and wield her BGS. Turns out he’s pretty good with a sword. Remi will encounter a couple dozen enemies throughout the demo; the combat is easy to pick up and is somewhat standard third-person-melee, though it does rely heavily on stamina management. Your max stamina is also reduced when you take damage, so you really don’t wanna get hit much. You can heal using consumable med kits as well as a pulse mechanic. Attacking enemies creates floating particles around Remi and once those particles form into a circle, you can press your controller’s right bumper to activate a healing pulse. 
It’s an interesting mechanic, and I like how Hell is Us is giving players a way to recoup health in the midst of combat. However, actually doing it is a bit clunky; keeping one eye on an enemy and the other on the particles around Remi is distracting, and timing the pulse is a challenge — you can only activate it during a brief window, and you’ll likely be in the middle of a combo when a pulse opportunity presents itself.

While Hell is Us’ combat has surface similarities to Soulslikes — like parrying blows from creepy enemies — it felt less punishing and more forgiving than what you’d expect from a FromSoftware title. I only died once in the demo, compared to countless deaths in the opening hours of Soulslikes such as Lies of P or Elden Ring. Notably, enemies don’t respawn when you save your game, so you don’t have to worry about repeatedly striking down the same foes.

Because dead enemies remain dead, exploration is encouraged in Hell is Us. Developer Rogue Factor boasts that the game has “no map, no compass, no quest markers,” so you’re free to wander around the game’s world without a guiding hand and discover its secrets. For example, that farmer I mentioned earlier told Remi about how three of his sons died in this war. Later on, when exploring the World War I-like trenches outside of the ruins, I found a note from a soldier on the other side of the conflict bragging about killing three brothers “cowering in a farmhouse.” The note also mentioned taking a gold watch from one of the boys, which I grabbed and returned to the farmer — without a quest marker to guide me or a journal entry saying “give this item to the farmer.” This completed a “Good Deed,” and I was told a reward may come from it later in the game; I’m curious how these types of quests will play out in the full release. The prospect of doing good deeds in this torn-asunder country is especially appealing.
A Soulslike-adjacent game placing greater emphasis on user-guided exploration than combat sounds enticing, and Hell is Us is delivering on that promise so far. Its demo is available on Steam through June 16 before the full game launches Sept. 4 for PC, PlayStation 5, and Xbox Series X.
    WWW.POLYGON.COM