• IBM Plans Large-Scale Fault-Tolerant Quantum Computer by 2029

    By John P. Mello Jr.
    June 11, 2025 5:00 AM PT

    IBM unveiled its plan to build IBM Quantum Starling, shown in this rendering. Starling is expected to be the first large-scale, fault-tolerant quantum system. (Image Credit: IBM)

    IBM revealed Tuesday its roadmap for bringing a large-scale, fault-tolerant quantum computer, IBM Quantum Starling, online by 2029, which is significantly earlier than many technologists thought possible.
    The company predicts that when its new Starling computer is up and running, it will be capable of performing 20,000 times more operations than today’s quantum computers — a computational state so vast it would require the memory of more than a quindecillion (10⁴⁸) of the world’s most powerful supercomputers to represent.
    “IBM is charting the next frontier in quantum computing,” Big Blue CEO Arvind Krishna said in a statement. “Our expertise across mathematics, physics, and engineering is paving the way for a large-scale, fault-tolerant quantum computer — one that will solve real-world challenges and unlock immense possibilities for business.”
    IBM’s plan to deliver a fault-tolerant quantum system by 2029 is ambitious but not implausible, especially given the rapid pace of its quantum roadmap and past milestones, observed Ensar Seker, CISO at SOCRadar, a threat intelligence company in Newark, Del.
    “They’ve consistently met or exceeded their qubit scaling goals, and their emphasis on modularity and error correction indicates they’re tackling the right challenges,” he told TechNewsWorld. “However, moving from thousands to millions of physical qubits with sufficient fidelity remains a steep climb.”
    A qubit is the fundamental unit of information in quantum computing, capable of representing a zero, a one, or both simultaneously due to quantum superposition. In practice, fault-tolerant quantum computers use clusters of physical qubits working together to form a logical qubit — a more stable unit designed to store quantum information and correct errors in real time.
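    The superposition described above can be sketched in a few lines of Python. This is a generic illustration of standard quantum mechanics, not IBM code; the function name and the choice of an equal superposition are this sketch’s own.

```python
import math

# A qubit's state is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measuring the qubit yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2 (the Born rule).
def measurement_probabilities(alpha: complex, beta: complex) -> tuple[float, float]:
    norm = abs(alpha) ** 2 + abs(beta) ** 2  # renormalize defensively
    return abs(alpha) ** 2 / norm, abs(beta) ** 2 / norm

# An equal superposition: "a zero, a one, or both simultaneously" --
# until measured, when each outcome appears half the time.
p0, p1 = measurement_probabilities(1 / math.sqrt(2), 1 / math.sqrt(2))
```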
    Realistic Roadmap
    Luke Yang, an equity analyst with Morningstar Research Services in Chicago, believes IBM’s roadmap is realistic. “The exact scale and error correction performance might still change between now and 2029, but overall, the goal is reasonable,” he told TechNewsWorld.
    “Given its reliability and professionalism, IBM’s bold claim should be taken seriously,” said Enrique Solano, co-CEO and co-founder of Kipu Quantum, a quantum algorithm company with offices in Berlin and Karlsruhe, Germany.
    “Of course, it may also fail, especially when considering the unpredictability of hardware complexities involved,” he told TechNewsWorld, “but companies like IBM exist for such challenges, and we should all be positively impressed by its current achievements and promised technological roadmap.”
    Tim Hollebeek, vice president of industry standards at DigiCert, a global digital security company, added: “IBM is a leader in this area, and not normally a company that hypes their news. This is a fast-moving industry, and success is certainly possible.”
    “IBM is attempting to do something that no one has ever done before and will almost certainly run into challenges,” he told TechNewsWorld, “but at this point, it is largely an engineering scaling exercise, not a research project.”
    “IBM has demonstrated consistent progress, has committed $30 billion over five years to quantum computing, and the timeline is within the realm of technical feasibility,” noted John Young, COO of Quantum eMotion, a developer of quantum random number generator technology, in Saint-Laurent, Quebec, Canada.
    “That said,” he told TechNewsWorld, “fault-tolerant in a practical, industrial sense is a very high bar.”
    Solving the Quantum Error Correction Puzzle
    To make a quantum computer fault-tolerant, errors need to be corrected so large workloads can be run without faults. In a quantum computer, errors are reduced by clustering physical qubits to form logical qubits, which have lower error rates than the underlying physical qubits.
    “Error correction is a challenge,” Young said. “Logical qubits require thousands of physical qubits to function reliably. That’s a massive scaling issue.”
    IBM explained in its announcement that creating increasing numbers of logical qubits capable of executing quantum circuits with as few physical qubits as possible is critical to quantum computing at scale. Until today, a clear path to building such a fault-tolerant system without unrealistic engineering overhead has not been published.

    Alternative and previous gold-standard error-correcting codes present fundamental engineering challenges, IBM continued. To scale, they would require an unfeasible number of physical qubits to create enough logical qubits to perform complex operations — necessitating impractical amounts of infrastructure and control electronics. This renders them unlikely to be implemented beyond small-scale experiments and devices.
    In two research papers released with its roadmap, IBM detailed how it will overcome the challenges of building the large-scale, fault-tolerant architecture needed for a quantum computer.
    One paper outlines the use of quantum low-density parity check (qLDPC) codes to reduce physical qubit overhead. The other describes methods for decoding errors in real time using conventional computing.
    According to IBM, a practical fault-tolerant quantum architecture must:

    Suppress enough errors for useful algorithms to succeed
    Prepare and measure logical qubits during computation
    Apply universal instructions to logical qubits
    Decode measurements from logical qubits in real time and guide subsequent operations
    Scale modularly across hundreds or thousands of logical qubits
    Be efficient enough to run meaningful algorithms using realistic energy and infrastructure resources
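    The real-time decoding requirement in that list can be illustrated with the simplest error-correcting code there is. The three-bit repetition code below is a toy stand-in for the far more sophisticated qLDPC codes IBM describes, but it shows the core idea: redundant physical bits plus a classical decoder yield a logical error rate well below the physical one. All names and numbers here are illustrative.

```python
import random
from collections import Counter

def encode(logical_bit: int) -> list[int]:
    """Store one logical bit redundantly in three physical bits."""
    return [logical_bit] * 3

def noisy_channel(bits: list[int], flip_prob: float) -> list[int]:
    """Flip each physical bit independently with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits: list[int]) -> int:
    """Majority vote: corrects any single bit flip."""
    return Counter(bits).most_common(1)[0][0]

random.seed(0)
trials, flip_prob = 10_000, 0.05  # 5% physical error rate (illustrative)
logical_errors = sum(
    decode(noisy_channel(encode(0), flip_prob)) != 0 for _ in range(trials)
)
# The decoder fails only when 2+ of 3 bits flip (~3 * 0.05^2 = 0.75%),
# so the logical error rate lands well under the 5% physical rate.
```

A fault-tolerant machine runs the analogue of this decode step continuously against a live stream of qubit measurements, which is why IBM pairs its codes with real-time decoding on conventional hardware.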

    Aside from the technological challenges that quantum computer makers are facing, there may also be some market challenges. “Locating suitable use cases for quantum computers could be the biggest challenge,” Morningstar’s Yang maintained.
    “Only certain computing workloads, such as random circuit sampling [RCS], can fully unleash the computing power of quantum computers and show their advantage over the traditional supercomputers we have now,” he said. “However, workloads like RCS are not very commercially useful, and we believe commercial relevance is one of the key factors that determine the total market size for quantum computers.”
    Q-Day Approaching Faster Than Expected
    For years now, organizations have been told they need to prepare for “Q-Day” — the day a quantum computer will be able to crack all the encryption they use to keep their data secure. This IBM announcement suggests the window for action to protect data may be closing faster than many anticipated.
    “This absolutely adds urgency and credibility to the security expert guidance on post-quantum encryption being factored into their planning now,” said Dave Krauthamer, field CTO of QuSecure, maker of quantum-safe security solutions, in San Mateo, Calif.
    “IBM’s move to create a large-scale fault-tolerant quantum computer by 2029 is indicative of the timeline collapsing,” he told TechNewsWorld. “A fault-tolerant quantum computer of this magnitude could be well on the path to crack asymmetric ciphers sooner than anyone thinks.”

    “Security leaders need to take everything connected to post-quantum encryption as a serious measure and work it into their security plans now — not later,” he said.
    Roger Grimes, a defense evangelist with KnowBe4, a security awareness training provider in Clearwater, Fla., pointed out that IBM is just the latest in a surge of quantum companies announcing computational breakthroughs expected within the next few years.
    “It leads to the question of whether the U.S. government’s original PQC [post-quantum cryptography] preparation date of 2030 is still a safe date,” he told TechNewsWorld.
    “It’s starting to feel a lot more risky for any company to wait until 2030 to be prepared against quantum attacks. It also flies in the face of the latest cybersecurity EO [executive order] that relaxed PQC preparation rules as compared to Biden’s last EO PQC standard order, which told U.S. agencies to transition to PQC ASAP.”
    “Most US companies are doing zero to prepare for Q-Day attacks,” he declared. “The latest executive order seems to tell U.S. agencies — and indirectly, all U.S. businesses — that they have more time to prepare. It’s going to cause even more agencies and businesses to be less prepared during a time when it seems multiple quantum computing companies are making significant progress.”
    “It definitely feels that something is going to give soon,” he said, “and if I were a betting man, and I am, I would bet that most U.S. companies are going to be unprepared for Q-Day on the day Q-Day becomes a reality.”

    John P. Mello Jr. has been an ECT News Network reporter since 2003. His areas of focus include cybersecurity, IT issues, privacy, e-commerce, social media, artificial intelligence, big data and consumer electronics. He has written and edited for numerous publications, including the Boston Business Journal, the Boston Phoenix, Megapixel.Net and Government Security News. Email John.

  • How a planetarium show discovered a spiral at the edge of our solar system

    If you’ve ever flown through outer space, at least while watching a documentary or a science fiction film, you’ve seen how artists turn astronomical findings into stunning visuals. But in the process of visualizing data for their latest planetarium show, a production team at New York’s American Museum of Natural History made a surprising discovery of their own: a trillion-and-a-half-mile-long spiral of material drifting along the edge of our solar system.

    “So this is a really fun thing that happened,” says Jackie Faherty, the museum’s senior scientist.

    Last winter, Faherty and her colleagues were beneath the dome of the museum’s Hayden Planetarium, fine-tuning a scene that featured the Oort cloud, the big, thick bubble surrounding our Sun and planets that’s filled with ice and rock and other remnants from the solar system’s infancy. The Oort cloud begins far beyond Neptune and is thought to extend about one and a half light-years from the Sun. It has never been directly observed; its existence is inferred from the behavior of long-period comets entering the inner solar system. The cloud is so expansive that the Voyager spacecraft, our most distant probes, would need another 250 years just to reach its inner boundary; to reach the other side, they would need about 30,000 years.

    The 30-minute show, Encounters in the Milky Way, narrated by Pedro Pascal, guides audiences on a trip through the galaxy across billions of years. For a section about our nascent solar system, the writing team decided “there’s going to be a fly-by” of the Oort cloud, Faherty says. “But what does our Oort cloud look like?” 

    To find out, the museum consulted astronomers and turned to David Nesvorný, a scientist at the Southwest Research Institute in San Antonio. He provided his model of the millions of particles believed to make up the Oort cloud, based on extensive observational data.

    “Everybody said, go talk to Nesvorný. He’s got the best model,” says Faherty. And “everybody told us, ‘There’s structure in the model,’ so we were kind of set up to look for stuff,” she says. 

    The museum’s technical team began using Nesvorný’s model to simulate how the cloud evolved over time. Later, as the team projected versions of the fly-by scene into the dome, with the camera looking back at the Oort cloud, they saw a familiar shape, one that appears in galaxies, Saturn’s rings, and disks around young stars.

    “We’re flying away from the Oort cloud and out pops this spiral, a spiral shape to the outside of our solar system,” Faherty marveled. “A huge structure, millions and millions of particles.”

    She emailed Nesvorný to ask for “more particles,” with a render of the scene attached. “We noticed the spiral of course,” she wrote. “And then he writes me back: ‘what are you talking about, a spiral?’” 

    While fine-tuning a simulation of the Oort cloud, a vast expanse of icy material left over from the birth of our Sun, the ‘Encounters in the Milky Way’ production team noticed a very clear shape: a structure made of billions of comets and shaped like a spiral-armed galaxy, seen here in a scene from the final Space Show. [Image: © AMNH]

    More simulations ensued, this time on Pleiades, a powerful NASA supercomputer. In high-performance computer simulations spanning 4.6 billion years, starting from the solar system’s earliest days, the researchers visualized how the initial icy and rocky ingredients of the Oort cloud began circling the Sun, in the elliptical orbits that are thought to give the cloud its rough disc shape. The simulations also incorporated the physics of the Sun’s gravitational pull, the influences from our Milky Way galaxy, and the movements of the comets themselves.

    In each simulation, the spiral persisted.
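The ingredients the article describes (the Sun's point-mass gravity, the galaxy's tidal pull, and the comets' own orbital motion) can be sketched with a toy integrator. This is an illustrative sketch only, with placeholder values for the tide strength, time step, and initial orbit; it is not the researchers' actual Pleiades code:

```python
import math

# Toy model of the forces in the simulations described above: the Sun's
# point-mass gravity plus a simplified, vertical-only galactic tide acting
# on a single test comet. Units: AU and years.
GM_SUN = 4 * math.pi ** 2   # G * M_sun in AU^3 / yr^2
TIDE = 1e-13                # assumed vertical tide strength in 1/yr^2 (placeholder)

def accel(pos):
    x, y, z = pos
    r = math.hypot(x, y, z)
    ax = -GM_SUN * x / r ** 3
    ay = -GM_SUN * y / r ** 3
    az = -GM_SUN * z / r ** 3 - TIDE * z   # tide nudges comets out of the planetary plane
    return ax, ay, az

def leapfrog(pos, vel, dt, steps):
    """Advance one comet with a kick-drift-kick leapfrog integrator."""
    for _ in range(steps):
        a = accel(pos)
        vel = tuple(v + 0.5 * dt * ai for v, ai in zip(vel, a))
        pos = tuple(p + dt * v for p, v in zip(pos, vel))
        a = accel(pos)
        vel = tuple(v + 0.5 * dt * ai for v, ai in zip(vel, a))
    return pos, vel

# One comet on a wide, eccentric orbit, followed for 10,000 years.
pos, vel = leapfrog((5_000.0, 0.0, 1_000.0), (0.0, 0.01, 0.0), dt=1.0, steps=10_000)
```

The published work integrates millions of such particles over 4.6 billion years; the spiral emerges from the collective effect of the tide term on all of them.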

    “No one has ever seen the Oort structure like that before,” says Faherty. Nesvorný “has a great quote about this: ‘The math was all there. We just needed the visuals.’” 

    An illustration of the Kuiper Belt and Oort Cloud in relation to our solar system. [Image: NASA]

    As the Oort cloud grew with the early solar system, Nesvorný and his colleagues hypothesize that the galactic tide, or the gravitational force from the Milky Way, disrupted the orbits of some comets. Although the Sun pulls these objects inward, the galaxy’s gravity appears to have twisted part of the Oort cloud outward, forming a spiral tilted roughly 30 degrees from the plane of the solar system.

    “As the galactic tide acts to decouple bodies from the scattered disk it creates a spiral structure in physical space that is roughly 15,000 astronomical units in length,” or around 1.4 trillion miles from one end to the other, the researchers write in a paper that was published in March in the Astrophysical Journal. “The spiral is long-lived and persists in the inner Oort Cloud to the present time.”
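As a quick sanity check on the conversion quoted from the paper, 15,000 astronomical units does come out to roughly 1.4 trillion miles:

```python
# 1 astronomical unit (AU) is about 92.96 million miles.
AU_MILES = 92.96e6

spiral_length_miles = 15_000 * AU_MILES
print(f"{spiral_length_miles:.2e} miles")  # about 1.4e12, i.e. 1.4 trillion
```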

    “The physics makes sense,” says Faherty. “Scientists, we’re amazing at what we do, but it doesn’t mean we can see everything right away.”

    It helped that the team behind the space show was primed to look for something, says Carter Emmart, the museum’s director of astrovisualization and director of Encounters. Astronomers had described Nesvorný’s model as having “a structure,” which intrigued the team’s artists. “We were also looking for structure so that it wouldn’t just be sort of like a big blob,” he says. “Other models were also revealing this—but they just hadn’t been visualized.”

    The museum’s attempts to simulate nature date back to its first habitat dioramas in the early 1900s, which brought visitors to places that hadn’t yet been captured by color photos, TV, or the web. The planetarium, a night sky simulator for generations of would-be scientists and astronauts, got its start after financier Charles Hayden bought the museum its first Zeiss projector. The planetarium now boasts one of the world’s few Zeiss Mark IX systems.

    Still, these days the star projector is rarely used, Emmart says, now that fulldome laser projectors can turn the old static starfield into 3D video running at 60 frames per second. The Hayden boasts six custom-built Christie projectors, part of what the museum’s former president called “the most advanced planetarium ever attempted.”

    In about 1.3 million years, the star system Gliese 710 is set to pass directly through our Oort Cloud, an event visualized in a dramatic scene in ‘Encounters in the Milky Way.’ During its flyby, our systems will swap icy comets, flinging some out on new paths. [Image: © AMNH]

    Emmart recalls how in 1998, when he and other museum leaders were imagining the future of space shows at the Hayden—now with the help of digital projectors and computer graphics—there were questions over how much space they could try to show.

    “We’re talking about these astronomical data sets we could plot to make the galaxy and the stars,” he says. “Of course, we knew that we would have this star projector, but we really wanted to emphasize astrophysics with this dome video system. I was drawing pictures of this just to get our heads around it and noting the tip of the solar system to the Milky Way is about 60 degrees. And I said, ‘what are we gonna do when we get outside the Milky Way?’

    “Then [planetarium’s director] Neil deGrasse Tyson goes, ‘whoa, whoa, whoa, Carter, we have enough to do. And just plotting the Milky Way, that’s hard enough.’ And I said, ‘well, when we exit the Milky Way and we don’t see any other galaxies, that’s sort of like astronomy in 1920—we thought maybe the entire universe is just a Milky Way.’”

    “And that kind of led to a chaotic discussion about, well, what other data sets are there for this?” Emmart adds.

    The museum worked with astronomer Brent Tully, who had mapped 3,500 galaxies beyond the Milky Way, in collaboration with the National Center for Supercomputing Applications. “That was it,” he says, “and that seemed fantastical.”

    By the time the first planetarium show opened at the museum’s new Rose Center for Earth and Space in 2000, Tully had broadened his survey “to an amazing” 30,000 galaxies. The Sloan Digital Sky Survey followed—it’s now at data release 18—with six million galaxies.

    To build the map of the universe that underlies Encounters, the team also relied on data from the European Space Agency’s space observatory, Gaia. Launched in 2013 and powered down in March of this year, Gaia brought unprecedented precision to our astronomical map, plotting the distances to 1.7 billion stars. To visualize and render the simulated data, Jon Parker, the museum’s lead technical director, relied on Houdini, a 3D animation tool by Toronto-based SideFX.

    The goal is immersion, “whether it’s in front of the buffalo downstairs, and seeing what those herds were like before we decimated them, to coming in this room and being teleported to space, with an accurate foundation in the science,” Emmart says. “But the art is important, because the art is the way to the soul.” 

    The museum, he adds, is “a testament to wonder. And I think wonder is a gateway to inspiration, and inspiration is a gateway to motivation.”

    3D visuals aren’t just powerful tools for communicating science; they are increasingly crucial for science itself. Software like OpenSpace, an open source simulation tool developed by the museum, along with the growing availability of high-performance computing, is making it easier to build highly detailed visuals of ever larger and more complex collections of data.

    “Anytime we look, literally, from a different angle at catalogs of astronomical positions, simulations, or exploring the phase space of a complex data set, there is great potential to discover something new,” says Brian R. Kent, an astronomer and director of science communications at the National Radio Astronomy Observatory. “There is also a wealth of astronomical data in archives that can be reanalyzed in new ways, leading to new discoveries.”

    As the instruments grow in size and sophistication, so does the data, and the challenge of understanding it. Like all scientists, astronomers are facing a deluge of data, ranging from gamma rays and X-rays to ultraviolet, optical, infrared, and radio bands.

    Our Oort cloud (center), a shell of icy bodies that surrounds the solar system and extends one-and-a-half light years in every direction, is shown in this scene from ‘Encounters in the Milky Way’ along with the Oort clouds of neighboring stars. The more massive the star, the larger its Oort cloud. [Image: © AMNH]

    “New facilities like the Next Generation Very Large Array here at NRAO or the Vera Rubin Observatory and LSST survey project will generate large volumes of data, so astronomers have to get creative with how to analyze it,” says Kent.

    More data—and new instruments—will also be needed to prove the spiral itself is actually there: there’s still no known way to even observe the Oort cloud. 

    Instead, the paper notes, the structure will have to be measured from “detection of a large number of objects” in the radius of the inner Oort cloud or from “thermal emission from small particles in the Oort spiral.” 

    The Vera C. Rubin Observatory, a powerful, U.S.-funded telescope that recently began operation in Chile, could possibly observe individual icy bodies within the cloud. But researchers expect the telescope will likely discover only dozens of these objects, maybe hundreds, not enough to meaningfully visualize any shapes in the Oort cloud. 

    For us, here and now, the 1.4 trillion mile-long spiral will remain confined to the inside of a dark dome across the street from Central Park.
    #how #planetarium #show #discovered #spiral
    How a planetarium show discovered a spiral at the edge of our solar system
    If you’ve ever flown through outer space, at least while watching a documentary or a science fiction film, you’ve seen how artists turn astronomical findings into stunning visuals. But in the process of visualizing data for their latest planetarium show, a production team at New York’s American Museum of Natural History made a surprising discovery of their own: a trillion-and-a-half mile long spiral of material drifting along the edge of our solar system. “So this is a really fun thing that happened,” says Jackie Faherty, the museum’s senior scientist. Last winter, Faherty and her colleagues were beneath the dome of the museum’s Hayden Planetarium, fine-tuning a scene that featured the Oort cloud, the big, thick bubble surrounding our Sun and planets that’s filled with ice and rock and other remnants from the solar system’s infancy. The Oort cloud begins far beyond Neptune, around one and a half light years from the Sun. It has never been directly observed; its existence is inferred from the behavior of long-period comets entering the inner solar system. The cloud is so expansive that the Voyager spacecraft, our most distant probes, would need another 250 years just to reach its inner boundary; to reach the other side, they would need about 30,000 years.  The 30-minute show, Encounters in the Milky Way, narrated by Pedro Pascal, guides audiences on a trip through the galaxy across billions of years. For a section about our nascent solar system, the writing team decided “there’s going to be a fly-by” of the Oort cloud, Faherty says. “But what does our Oort cloud look like?”  To find out, the museum consulted astronomers and turned to David Nesvorný, a scientist at the Southwest Research Institute in San Antonio. He provided his model of the millions of particles believed to make up the Oort cloud, based on extensive observational data. “Everybody said, go talk to Nesvorný. He’s got the best model,” says Faherty. 
And “everybody told us, ‘There’s structure in the model,’ so we were kind of set up to look for stuff,” she says.  The museum’s technical team began using Nesvorný’s model to simulate how the cloud evolved over time. Later, as the team projected versions of the fly-by scene into the dome, with the camera looking back at the Oort cloud, they saw a familiar shape, one that appears in galaxies, Saturn’s rings, and disks around young stars. “We’re flying away from the Oort cloud and out pops this spiral, a spiral shape to the outside of our solar system,” Faherty marveled. “A huge structure, millions and millions of particles.” She emailed Nesvorný to ask for “more particles,” with a render of the scene attached. “We noticed the spiral of course,” she wrote. “And then he writes me back: ‘what are you talking about, a spiral?’”  While fine-tuning a simulation of the Oort cloud, a vast expanse of ice material leftover from the birth of our Sun, the ‘Encounters in the Milky Way’ production team noticed a very clear shape: a structure made of billions of comets and shaped like a spiral-armed galaxy, seen here in a scene from the final Space ShowMore simulations ensued, this time on Pleiades, a powerful NASA supercomputer. In high-performance computer simulations spanning 4.6 billion years, starting from the Solar System’s earliest days, the researchers visualized how the initial icy and rocky ingredients of the Oort cloud began circling the Sun, in the elliptical orbits that are thought to give the cloud its rough disc shape. The simulations also incorporated the physics of the Sun’s gravitational pull, the influences from our Milky Way galaxy, and the movements of the comets themselves.  In each simulation, the spiral persisted. “No one has ever seen the Oort structure like that before,” says Faherty. Nesvorný “has a great quote about this: ‘The math was all there. 
We just needed the visuals.’”  An illustration of the Kuiper Belt and Oort Cloud in relation to our solar system.As the Oort cloud grew with the early solar system, Nesvorný and his colleagues hypothesize that the galactic tide, or the gravitational force from the Milky Way, disrupted the orbits of some comets. Although the Sun pulls these objects inward, the galaxy’s gravity appears to have twisted part of the Oort cloud outward, forming a spiral tilted roughly 30 degrees from the plane of the solar system. “As the galactic tide acts to decouple bodies from the scattered disk it creates a spiral structure in physical space that is roughly 15,000 astronomical units in length,” or around 1.4 trillion miles from one end to the other, the researchers write in a paper that was published in March in the Astrophysical Journal. “The spiral is long-lived and persists in the inner Oort Cloud to the present time.” “The physics makes sense,” says Faherty. “Scientists, we’re amazing at what we do, but it doesn’t mean we can see everything right away.” It helped that the team behind the space show was primed to look for something, says Carter Emmart, the museum’s director of astrovisualization and director of Encounters. Astronomers had described Nesvorný’s model as having “a structure,” which intrigued the team’s artists. “We were also looking for structure so that it wouldn’t just be sort of like a big blob,” he says. “Other models were also revealing this—but they just hadn’t been visualized.” The museum’s attempts to simulate nature date back to its first habitat dioramas in the early 1900s, which brought visitors to places that hadn’t yet been captured by color photos, TV, or the web. The planetarium, a night sky simulator for generations of would-be scientists and astronauts, got its start after financier Charles Hayden bought the museum its first Zeiss projector. The planetarium now boasts one of the world’s few Zeiss Mark IX systems. 
Still, these days the star projector is rarely used, Emmart says, now that fulldome laser projectors can turn the old static starfield into 3D video running at 60 frames per second. The Hayden boasts six custom-built Christie projectors, part of what the museum’s former president called “the most advanced planetarium ever attempted.”  In about 1.3 million years, the star system Gliese 710 is set to pass directly through our Oort Cloud, an event visualized in a dramatic scene in ‘Encounters in the Milky Way.’ During its flyby, our systems will swap icy comets, flinging some out on new paths.Emmart recalls how in 1998, when he and other museum leaders were imagining the future of space shows at the Hayden—now with the help of digital projectors and computer graphics—there were questions over how much space they could try to show. “We’re talking about these astronomical data sets we could plot to make the galaxy and the stars,” he says. “Of course, we knew that we would have this star projector, but we really wanted to emphasize astrophysics with this dome video system. I was drawing pictures of this just to get our heads around it and noting the tip of the solar system to the Milky Way is about 60 degrees. And I said, what are we gonna do when we get outside the Milky Way?’ “ThenNeil Degrasse Tyson “goes, ‘whoa, whoa, whoa, Carter, we have enough to do. And just plotting the Milky Way, that’s hard enough.’ And I said, ‘well, when we exit the Milky Way and we don’t see any other galaxies, that’s sort of like astronomy in 1920—we thought maybe the entire universe is just a Milky Way.'” “And that kind of led to a chaotic discussion about, well, what other data sets are there for this?” Emmart adds. The museum worked with astronomer Brent Tully, who had mapped 3500 galaxies beyond the Milky Way, in collaboration with the National Center for Super Computing Applications. 
“That was it,” he says, “and that seemed fantastical.” By the time the first planetarium show opened at the museum’s new Rose Center for Earth and Space in 2000, Tully had broadened his survey “to an amazing” 30,000 galaxies. The Sloan Digital Sky Survey followed—it’s now at data release 18—with six million galaxies. To build the map of the universe that underlies Encounters, the team also relied on data from the European Space Agency’s space observatory, Gaia. Launched in 2013 and powered down in March of this year, Gaia brought an unprecedented precision to our astronomical map, plotting the distance between 1.7 billion stars. To visualize and render the simulated data, Jon Parker, the museum’s lead technical director, relied on Houdini, a 3D animation tool by Toronto-based SideFX. The goal is immersion, “whether it’s in front of the buffalo downstairs, and seeing what those herds were like before we decimated them, to coming in this room and being teleported to space, with an accurate foundation in the science,” Emmart says. “But the art is important, because the art is the way to the soul.”  The museum, he adds, is “a testament to wonder. And I think wonder is a gateway to inspiration, and inspiration is a gateway to motivation.” Three-D visuals aren’t just powerful tools for communicating science, but increasingly crucial for science itself. Software like OpenSpace, an open source simulation tool developed by the museum, along with the growing availability of high-performance computing, are making it easier to build highly detailed visuals of ever larger and more complex collections of data. “Anytime we look, literally, from a different angle at catalogs of astronomical positions, simulations, or exploring the phase space of a complex data set, there is great potential to discover something new,” says Brian R. Kent, an astronomer and director of science communications at National Radio Astronomy Observatory. 
“There is also a wealth of astronomics tatical data in archives that can be reanalyzed in new ways, leading to new discoveries.” As the instruments grow in size and sophistication, so does the data, and the challenge of understanding it. Like all scientists, astronomers are facing a deluge of data, ranging from gamma rays and X-rays to ultraviolet, optical, infrared, and radio bands. Our Oort cloud, a shell of icy bodies that surrounds the solar system and extends one-and-a-half light years in every direction, is shown in this scene from ‘Encounters in the Milky Way’ along with the Oort clouds of neighboring stars. The more massive the star, the larger its Oort cloud“New facilities like the Next Generation Very Large Array here at NRAO or the Vera Rubin Observatory and LSST survey project will generate large volumes of data, so astronomers have to get creative with how to analyze it,” says Kent.  More data—and new instruments—will also be needed to prove the spiral itself is actually there: there’s still no known way to even observe the Oort cloud.  Instead, the paper notes, the structure will have to be measured from “detection of a large number of objects” in the radius of the inner Oort cloud or from “thermal emission from small particles in the Oort spiral.”  The Vera C. Rubin Observatory, a powerful, U.S.-funded telescope that recently began operation in Chile, could possibly observe individual icy bodies within the cloud. But researchers expect the telescope will likely discover only dozens of these objects, maybe hundreds, not enough to meaningfully visualize any shapes in the Oort cloud.  For us, here and now, the 1.4 trillion mile-long spiral will remain confined to the inside of a dark dome across the street from Central Park. #how #planetarium #show #discovered #spiral
    WWW.FASTCOMPANY.COM
    How a planetarium show discovered a spiral at the edge of our solar system
    If you’ve ever flown through outer space, at least while watching a documentary or a science fiction film, you’ve seen how artists turn astronomical findings into stunning visuals. But in the process of visualizing data for their latest planetarium show, a production team at New York’s American Museum of Natural History made a surprising discovery of their own: a trillion-and-a-half mile long spiral of material drifting along the edge of our solar system. “So this is a really fun thing that happened,” says Jackie Faherty, the museum’s senior scientist. Last winter, Faherty and her colleagues were beneath the dome of the museum’s Hayden Planetarium, fine-tuning a scene that featured the Oort cloud, the big, thick bubble surrounding our Sun and planets that’s filled with ice and rock and other remnants from the solar system’s infancy. The Oort cloud begins far beyond Neptune, around one and a half light years from the Sun. It has never been directly observed; its existence is inferred from the behavior of long-period comets entering the inner solar system. The cloud is so expansive that the Voyager spacecraft, our most distant probes, would need another 250 years just to reach its inner boundary; to reach the other side, they would need about 30,000 years.  The 30-minute show, Encounters in the Milky Way, narrated by Pedro Pascal, guides audiences on a trip through the galaxy across billions of years. For a section about our nascent solar system, the writing team decided “there’s going to be a fly-by” of the Oort cloud, Faherty says. “But what does our Oort cloud look like?”  To find out, the museum consulted astronomers and turned to David Nesvorný, a scientist at the Southwest Research Institute in San Antonio. He provided his model of the millions of particles believed to make up the Oort cloud, based on extensive observational data. “Everybody said, go talk to Nesvorný. He’s got the best model,” says Faherty. 
And “everybody told us, ‘There’s structure in the model,’ so we were kind of set up to look for stuff,” she says.  The museum’s technical team began using Nesvorný’s model to simulate how the cloud evolved over time. Later, as the team projected versions of the fly-by scene into the dome, with the camera looking back at the Oort cloud, they saw a familiar shape, one that appears in galaxies, Saturn’s rings, and disks around young stars. “We’re flying away from the Oort cloud and out pops this spiral, a spiral shape to the outside of our solar system,” Faherty marveled. “A huge structure, millions and millions of particles.” She emailed Nesvorný to ask for “more particles,” with a render of the scene attached. “We noticed the spiral of course,” she wrote. “And then he writes me back: ‘what are you talking about, a spiral?’”  While fine-tuning a simulation of the Oort cloud, a vast expanse of ice material leftover from the birth of our Sun, the ‘Encounters in the Milky Way’ production team noticed a very clear shape: a structure made of billions of comets and shaped like a spiral-armed galaxy, seen here in a scene from the final Space Show (curving, dusty S-shape behind the Sun) [Image: © AMNH] More simulations ensued, this time on Pleiades, a powerful NASA supercomputer. In high-performance computer simulations spanning 4.6 billion years, starting from the Solar System’s earliest days, the researchers visualized how the initial icy and rocky ingredients of the Oort cloud began circling the Sun, in the elliptical orbits that are thought to give the cloud its rough disc shape. The simulations also incorporated the physics of the Sun’s gravitational pull, the influences from our Milky Way galaxy, and the movements of the comets themselves.  In each simulation, the spiral persisted. “No one has ever seen the Oort structure like that before,” says Faherty. Nesvorný “has a great quote about this: ‘The math was all there. 
We just needed the visuals.’”  An illustration of the Kuiper Belt and Oort Cloud in relation to our solar system. [Image: NASA] As the Oort cloud grew with the early solar system, Nesvorný and his colleagues hypothesize that the galactic tide, or the gravitational force from the Milky Way, disrupted the orbits of some comets. Although the Sun pulls these objects inward, the galaxy’s gravity appears to have twisted part of the Oort cloud outward, forming a spiral tilted roughly 30 degrees from the plane of the solar system. “As the galactic tide acts to decouple bodies from the scattered disk it creates a spiral structure in physical space that is roughly 15,000 astronomical units in length,” or around 1.4 trillion miles from one end to the other, the researchers write in a paper that was published in March in the Astrophysical Journal. “The spiral is long-lived and persists in the inner Oort Cloud to the present time.” “The physics makes sense,” says Faherty. “Scientists, we’re amazing at what we do, but it doesn’t mean we can see everything right away.” It helped that the team behind the space show was primed to look for something, says Carter Emmart, the museum’s director of astrovisualization and director of Encounters. Astronomers had described Nesvorný’s model as having “a structure,” which intrigued the team’s artists. “We were also looking for structure so that it wouldn’t just be sort of like a big blob,” he says. “Other models were also revealing this—but they just hadn’t been visualized.” The museum’s attempts to simulate nature date back to its first habitat dioramas in the early 1900s, which brought visitors to places that hadn’t yet been captured by color photos, TV, or the web. The planetarium, a night sky simulator for generations of would-be scientists and astronauts, got its start after financier Charles Hayden bought the museum its first Zeiss projector. The planetarium now boasts one of the world’s few Zeiss Mark IX systems. 
Still, these days the star projector is rarely used, Emmart says, now that fulldome laser projectors can turn the old static starfield into 3D video running at 60 frames per second. The Hayden boasts six custom-built Christie projectors, part of what the museum’s former president called “the most advanced planetarium ever attempted.”  In about 1.3 million years, the star system Gliese 710 is set to pass directly through our Oort Cloud, an event visualized in a dramatic scene in ‘Encounters in the Milky Way.’ During its flyby, our systems will swap icy comets, flinging some out on new paths. [Image: © AMNH] Emmart recalls how in 1998, when he and other museum leaders were imagining the future of space shows at the Hayden—now with the help of digital projectors and computer graphics—there were questions over how much space they could try to show. “We’re talking about these astronomical data sets we could plot to make the galaxy and the stars,” he says. “Of course, we knew that we would have this star projector, but we really wanted to emphasize astrophysics with this dome video system. I was drawing pictures of this just to get our heads around it and noting the tip of the solar system to the Milky Way is about 60 degrees. And I said, what are we gonna do when we get outside the Milky Way?’ “Then [planetarium’s director] Neil Degrasse Tyson “goes, ‘whoa, whoa, whoa, Carter, we have enough to do. And just plotting the Milky Way, that’s hard enough.’ And I said, ‘well, when we exit the Milky Way and we don’t see any other galaxies, that’s sort of like astronomy in 1920—we thought maybe the entire universe is just a Milky Way.'” “And that kind of led to a chaotic discussion about, well, what other data sets are there for this?” Emmart adds. The museum worked with astronomer Brent Tully, who had mapped 3500 galaxies beyond the Milky Way, in collaboration with the National Center for Super Computing Applications. 
“That was it,” he says, “and that seemed fantastical.” By the time the first planetarium show opened at the museum’s new Rose Center for Earth and Space in 2000, Tully had broadened his survey “to an amazing” 30,000 galaxies. The Sloan Digital Sky Survey followed—it’s now at data release 18—with six million galaxies. To build the map of the universe that underlies Encounters, the team also relied on data from the European Space Agency’s space observatory, Gaia. Launched in 2013 and powered down in March of this year, Gaia brought an unprecedented precision to our astronomical map, plotting the distance between 1.7 billion stars. To visualize and render the simulated data, Jon Parker, the museum’s lead technical director, relied on Houdini, a 3D animation tool by Toronto-based SideFX. The goal is immersion, “whether it’s in front of the buffalo downstairs, and seeing what those herds were like before we decimated them, to coming in this room and being teleported to space, with an accurate foundation in the science,” Emmart says. “But the art is important, because the art is the way to the soul.”  The museum, he adds, is “a testament to wonder. And I think wonder is a gateway to inspiration, and inspiration is a gateway to motivation.” Three-D visuals aren’t just powerful tools for communicating science, but increasingly crucial for science itself. Software like OpenSpace, an open source simulation tool developed by the museum, along with the growing availability of high-performance computing, are making it easier to build highly detailed visuals of ever larger and more complex collections of data. “Anytime we look, literally, from a different angle at catalogs of astronomical positions, simulations, or exploring the phase space of a complex data set, there is great potential to discover something new,” says Brian R. Kent, an astronomer and director of science communications at National Radio Astronomy Observatory. 
“There is also a wealth of astronomics tatical data in archives that can be reanalyzed in new ways, leading to new discoveries.” As the instruments grow in size and sophistication, so does the data, and the challenge of understanding it. Like all scientists, astronomers are facing a deluge of data, ranging from gamma rays and X-rays to ultraviolet, optical, infrared, and radio bands. Our Oort cloud (center), a shell of icy bodies that surrounds the solar system and extends one-and-a-half light years in every direction, is shown in this scene from ‘Encounters in the Milky Way’ along with the Oort clouds of neighboring stars. The more massive the star, the larger its Oort cloud [Image: © AMNH ] “New facilities like the Next Generation Very Large Array here at NRAO or the Vera Rubin Observatory and LSST survey project will generate large volumes of data, so astronomers have to get creative with how to analyze it,” says Kent.  More data—and new instruments—will also be needed to prove the spiral itself is actually there: there’s still no known way to even observe the Oort cloud.  Instead, the paper notes, the structure will have to be measured from “detection of a large number of objects” in the radius of the inner Oort cloud or from “thermal emission from small particles in the Oort spiral.”  The Vera C. Rubin Observatory, a powerful, U.S.-funded telescope that recently began operation in Chile, could possibly observe individual icy bodies within the cloud. But researchers expect the telescope will likely discover only dozens of these objects, maybe hundreds, not enough to meaningfully visualize any shapes in the Oort cloud.  For us, here and now, the 1.4 trillion mile-long spiral will remain confined to the inside of a dark dome across the street from Central Park.
  • DISCOVERING ELIO

    By TREVOR HOGG

    Images courtesy of Pixar.

    The character design of Glordon is based on a tardigrade, which is a microscopic water bear.

    Rather than look at the unknown as something to be feared, Pixar has decided to do some wish fulfillment with Elio, where a lonely adolescent astrophile gets abducted by aliens and is mistaken for the leader of Earth. Originally conceived and directed by Adrian Molina, the coming-of-age science fiction adventure was shepherded by Domee Shi and Madeline Sharafian, who had previously worked together on Turning Red.
    “Space is often seen as dark, mysterious and scary, but there is also so much hope, wonder and curiosity,” notes Shi, director of Elio. “It’s like anything is ‘out there.’ Elio captures how a lot of us feel at different points of our lives, when we were kids like him, or even now wanting to be off of this current planet because it’s just too much. For Elio, it’s a rescue. I feel that there’s something so universal about that feeling of wanting to be taken away and taken care of. To know that you’re not alone and somebody chose you and picked you up.”

    The character design of Glordon is based on a tardigrade, which is a microscopic water bear.

    There is a stark contrast between how Earth and the alien world, known as the Communiverse, are portrayed. “The more we worked with the animators on Glordon and Helix, they began to realize that Domee and I respond positively when those [alien] characters are exaggerated, made cute, round and chubby,” states Sharafian, director of Elio. “That automatically started to differentiate the way the Earth and space feel.” A certain question had to be answered when designing the United Nations-inspired Communiverse. “It was coming from a place of this lonely kid who feels like no one wants him on Earth,” Shi explains. “What would be heaven and paradise for him? The Communiverse was built around that idea.” A sense of belonging is an important theme. “It’s also inspired by Adrian Molina’s backstory, and our backstories too, of going to animation college,” Sharafian remarks. “For the first time, we said, ‘This is where everybody like me is!’”

    Green is the thematic color for Elio.

    Visual effects are an important storytelling tool. “Especially, for our movie, which is about this boy going to this crazy incredible world of the Communiverse,” Shi observes. “It has to be dazzling and look spectacular on the big screen and feel like paradise. Elio is such a visual feast, and you do feel like, ‘I want to stay here no matter what. I can’t believe that this place even exists.’ Visual effects are a powerful tool to help you feel what the characters are feeling.” A wishlist became a reality for the directors. “Claudia Chung Sanii [Visual Effects Supervisor] gave Domee and me carte blanche for wish fulfillment for ourselves,” Sharafian remarks. “What do you want Elio’s outfit in space to look like? It was a difficult costume, but now when we watch the movie, we’re all so proud of it. Elio looks fabulous, and he’s so happy to be wearing that outfit. Who would want to take that off?”

    The Communiverse was meant to feel like a place that a child would love to visit and explore.

    Methodology rather than technology went through the biggest change for the production. “The Communiverse is super complex and has lots of moving pieces. But there’s not much CG can’t do anymore,” notes Claudia Chung Sanii. “Elemental did effects characters. We did long curly hair, dresses, capes, water and fire. What we hadn’t done before was be a part of that design process. How do we get lighting into layout? How do we see the shaders in animation in layout? The tools department was working on a software called Luna which does that. I went to the tools department and asked, ‘Can I play around with it?’ They were like, ‘Okay. But it’s not ready yet.’ Tools will basically be bringing RenderMan and an interactive lighting workflow to the pipeline across all of these DCCs. Because we light in Katana, you can’t get back upstream. The conceit that we were dipping our toe in on Elio was, ‘Whatever you do in lighting, anyone on the pipeline can see it.’”

    The influence of microscopic forms and macro photography grounded the Communiverse in natural phenomena.

    The variety in the Communiverse is a contrast to the regimented world on the military base.

    There were no departmental borders, in particular with cinematography. “We had our layout and lighting DPs start on the same day. Derek Williams wouldn’t shoot anything without Jordan Rempel, our lighting DP, seeing it,” Sanii states. “Jordan would drop in lighting and start doing key lighting as Derek’s team was laying out. It wasn’t like you had to hit the render button, wait for the render to come up and go, ‘Oh, my god, it’s dark! I didn’t know that it was nighttime.’” A new term was adopted. “Meredith Hom [Production Manager] and I pulled the entire crew and leadership into this mental concept that we called the ‘college project.’ For some of us, college was a time when we didn’t have titles and crafts. You begged, borrowed and stole to hit that deadline. So much of our world has become linear in our process that I wanted to break that down to, ‘No. We’re all working together. The scope of this film is too large for us to wait for each other to finish our piece. If this person is slammed, fine. Figure out a different idea to do it with what tools you have.’”

    Directors Domee Shi and Madeline Sharafian are drawn to chubby, exaggerated and cute characters.

    Forgoing the word ‘no’ led to the technology breaking down. “I remember times when crowds [department] is dressing all of the aliens and because of forgetting to constrain it to the Communiverse, they all show up at the origin, and you’re going, ‘Why is there a whole party going on over there?’” Sanii laughs. “On Elio, it was always forward. There were no rules about locking things down or not installing over the weekend. It was always like, ‘Put it all in, and we’ll deal with it on Monday.’ There would be some funny stuff. We never QC’d something before walking it into the room. Everyone saw how the sausage was made. It was fun and not fun for Harley Jessup [Production Designer] because sometimes there would be a big thing in the middle screen, and he would say, ‘Is that finished?’ There was no way we could get through this film if we kept trying to fix the thing that broke.”
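    The bug Sanii describes, crowd agents piling up at the world origin, is what happens when offsets authored in a set’s local space never get multiplied by the set’s transform. A minimal sketch with a hypothetical helper (not Pixar’s actual crowds tooling):

```python
import numpy as np

def place_crowd(local_offsets, parent_matrix=None):
    """Place crowd agents given offsets in a set's local space.

    If the parent transform is omitted (the constraint that was
    forgotten), every agent lands around the world origin instead
    of on the set. Hypothetical helper for illustration.
    """
    pts = np.asarray(local_offsets, dtype=float)
    pts_h = np.c_[pts, np.ones(len(pts))]  # homogeneous coordinates
    m = np.eye(4) if parent_matrix is None else np.asarray(parent_matrix, dtype=float)
    return (pts_h @ m.T)[:, :3]
```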

    An aerial image of Elio as he attempts to get abducted by aliens.

    Part of the design of the Communiverse was inspired by Chinese puzzle balls.

    A former visual effects art director at ILM, Harley Jessup found his previous experiences on projects like Innerspace to be helpful on Elio. “I liked that the directors wanted to build on the effects films from the 1980s and early 1990s,” reflects Jessup. “I was there and part of that. It was fun to look back. At the time, the techniques were all practical, matte paintings and miniatures, which are fun to work with, but without the safety net of CG. One thing Dennis Muren [VES] was keen on was how people see things like the natural phenomenon you might see in a microscopic or macro photography form. We were using that. I was looking at the mothership of Close Encounters of the Third Kind, which Dennis shot when he was a young artist. It was nice to be able to bring all of that history to this film.”
    Earth was impacted by a comment made by Pete Docter (CCO, Pixar). “He said, ‘The military base should feel like a parking lot,’” Jessup reveals. “You should know why Elio wants to be anywhere else. And the Communiverse needs to be inviting. We built a lot of contrast into those two worlds. The brutalist architecture on the military base, with its hard edges and heavy horizontal forms close to the earth, needed to be harsh but beautiful in its own way, so we tried for that. The Communiverse would be in contrast and be all curves, translucent surfaces and stained-glass backlit effects. Things were wide open about what it could be because each of the aliens are from a different climate and gravity. There are some buildings that are actually upside down on it, and the whole thing is rotating inside like clockwork. It is hopefully an appealing, fun world. It’s not a dystopian outer space.”

    Exploring various facial expressions for Elio.

    A tough character to get right was Aunt Olga, who struggles to be the guardian of her nephew.

    Character designs of Elio and Glordon, showing them interacting with each other.

    Architecture was devised to reflect the desired tone for scenes. “In the Grand Assembly Hall where each alien has a desk and booth, the booth is shaped like an eyelid that can close or open,” Jessup explains. “It increases the feeling that they’re evaluating and observing Elio and each of the candidates that have come to join the Communiverse.” A couple of iconic cinematic franchises were avoided for aesthetic reasons. “As much as I love Star Wars and Star Trek, we wanted to be different from those kinds of aliens that are often more humanoid.” Ooooo was the first alien to be designed. “We did Ooooo in collaboration with the effects team, which was small at that time. She was described as a liquid supercomputer. We actually used the wireframe that was turning up and asked, what if it ended up being this network of little lights that are moving around and can express how much she was thinking? Ooooo is Elio’s guide to the Communiverse; her body would deform, so she could become a big screen or reach out and pluck things. Ooooo has an ability like an amoeba to stretch.”
    Flexibility is important when figuring out shot design. “On Elio, we provided the layout department with a rudimentary version of our environments,” states David Luoh, Sets Supervisor. “It might be simple geometry. We’re not worried necessarily about shading, color and material yet. Things are roughly in place but also built in a way that is flexible. As they’re sorting out the camera and testing out staging, they can move elements of the set around. Maybe this architectural piece needs to be shifted or larger or smaller. There was a variation on what was typically expected of set deliveries of environments to our layout department. That bar was lowered to give the layout department something to work with sooner and also with more flexibility. From their work we get context as to how we partner with our art and design department to build and finalize those environments.”

    Regional biomes known as disks are part of the Communiverse. “There are aquatic, lush forest, snow and ice, and hot lava disks,” Luoh remarks. “The hot disk is grounded in the desert, volcanic rock and lava, while for the lush disk we looked at interesting plant life found in the world around us.” The Communiverse is a complex geometric form. “We wanted these natural arrangements of alien districts, and that was all happening on this twisting and curving terrain in a way that made traditional dressing approaches clunky. Oftentimes, you’re putting something on the ground or mounted, and the ground is always facing upward. But if you have to dress the wall or ceiling, it becomes a lot more difficult to manipulate and place on something with that dynamic and shape. You have stuff that casts light, is see-through and shifting over time. Ooooo is a living character that looks like electronic circuitry that is constantly moving, and we also have that element in the walls, floors and bubble transport that carry the characters around.”
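    Dressing props onto the Communiverse’s twisting terrain means “up” is the local surface normal rather than the world’s up axis, which is why walls and ceilings are hard to dress with floor-oriented tools. A generic sketch of building an orientation frame from a surface normal (an illustration of the general graphics technique, not Pixar’s sets pipeline):

```python
import numpy as np

def frame_from_normal(n):
    """Build an orthonormal basis whose middle column is the surface
    normal, so a prop's 'up' follows the wall or ceiling it sits on.

    Generic graphics utility for illustration only.
    """
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)
    # Pick a helper axis not parallel to n to avoid a degenerate cross product.
    helper = np.array([1.0, 0.0, 0.0]) if abs(n[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    t = np.cross(helper, n)
    t = t / np.linalg.norm(t)
    b = np.cross(n, t)
    return np.column_stack([t, n, b])  # columns: tangent, normal (up), bitangent
```

    Multiplying a prop’s local vertices by this matrix orients it to the surface, whether that surface faces up, sideways, or straight down.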
    Sets were adjusted throughout the production. “We try to anticipate situations that might come up,” Luoh states. “What if we have a series of shots where you’re getting closer and closer to the Communiverse and you have to bridge the distance between your hero and set extension background? There is a partnership with story, but certainly with our layout camera staging department. As we see shots come out of their work, we know where we need to spend the time to figure out, are we going to see the distant hills in this way? We’re not going to build it until we know because it can be labor-intensive. There is a responsiveness to what we are starting to see as shots get made.” Combining the familiar into something unfamiliar was a process. “There was this curation of being inspired by existing alien sci-fi depictions, but also reaching back into biological phenomena or interesting material because we wanted to ground a lot of those visual elements and ideas in something that people could intuitively grasp on to, even if they were combined or arranged in a way that is surprising, strange and delightful.”
    WWW.VFXVOICE.COM
    DISCOVERING ELIO
    By TREVOR HOGG Images courtesy of Pixar. The character design of Glordon is based on a tardigrade, which is a microscopic water bear. Rather than look at the unknown as something to be feared, Pixar has decided to do some wish fulfillment with Elio, where a lonely adolescent astrophile gets abducted by aliens and is mistaken as the leader of Earth. Originally conceived and directed by Adrian Molina, the coming-of-age science fiction adventure was shepherded by Domee Shi and Madeline Sharafian, who had previously worked together on Turning Red. “Space is often seen as dark, mysterious and scary, but there is also so much hope, wonder and curiosity,” notes Shi, director of Elio. “It’s like anything is ‘out there.’ Elio captures how a lot of us feel at different points of our lives, when we were kids like him, or even now wanting to be off of this current planet because it’s just too much. For Elio, it’s a rescue. I feel that there’s something so universal about that feeling of wanting to be taken away and taken care of. To know that you’re not alone and somebody chose you and picked you up.” The character design of Glordon is based on a tardigrade, which is a microscopic water bear. There is a stark contrast between how Earth and the alien world, known as the Communiverse, are portrayed. “The more we worked with the animators on Glordon and Helix, they began to realize that Domee and I respond positively when those [alien] characters are exaggerated, made cute, round and chubby,” states Sharafian, director of Elio. “That automatically started to differentiate the way the Earth and space feel.” A certain question had to be answered when designing the United Nations-inspired Communiverse. “It was coming from a place of this lonely kid who feels like no one wants him on Earth,” Shi explains. “What would be heaven and paradise for him? The Communiverse was built around that idea.” A sense of belonging is an important theme. 
“It’s also inspired by Adrian Molina’s backstory, and our backstories too, of going to animation college,” Sharafian remarks. “For the first time, we said, ‘This is where everybody like me is!’” Green is the thematic color for Elio. Visual effects are an important storytelling tool. “Especially, for our movie, which is about this boy going to this crazy incredible world of the Communiverse,” Shi observes. “It has to be dazzling and look spectacular on the big screen and feel like paradise. Elio is such a visual feast, and you do feel like, ‘I want to stay here no matter what. I can’t believe that this place even exists.’ Visual effects are a powerful tool to help you feel what the characters are feeling.” A wishlist became a reality for the directors. “Claudia Chung Sanii [Visual Effects Supervisor] gave Domee and me carte blanche for wish fulfillment for ourselves,” Sharafian remarks. “What do you want Elio’s outfit in space to look like? It was a difficult costume, but now when we watch the movie, we’re all so proud of it. Elio looks fabulous, and he’s so happy to be wearing that outfit. Who would want to take that off?” The Communiverse was meant to feel like a place that a child would love to visit and explore. Methodology rather than technology went through the biggest change for the production. “The Communiverse is super complex and has lots of moving pieces. But there’s not much CG can’t do anymore,” notes Claudia Chung Sanii. “Elemental did effects characters. We did long curly hair, dresses, capes, water and fire. What we hadn’t done before was be a part of that design process. How do we get lighting into layout? How do we see the shaders in animation in layout? The tools department was working on a software called Luna which does that. I went to the tools department and asked, ‘Can I play around with it?’ They were like, ‘Okay. 
But it’s not ready yet.’ Tools will basically be bringing RenderMan and an interactive lighting workflow to the pipeline across all of these DCCs. Because we light in Katana, you can’t get back upstream. The conceit that we were dipping our toe in on Elio was, ‘Whatever you do in lighting, anyone on the pipeline can see it.’” The influence of microscopic forms and macro photography grounded the Communiverse in natural phenomena. The variety in the Communiverse is a contrast to the regimented world on the military base. There were no departmental borders, in particular with cinematography. “We had our layout and lighting DPs start on the same day. Derek Williams wouldn’t shoot anything without Jordan Rempel, our lighting DP, seeing it,” Sanii states. “Jordan would drop in lighting and start doing key lighting as Derek’s team was laying out. It wasn’t like you had to hit the render button, wait for the render to come up and go, ‘Oh, my god, it’s dark! I didn’t know that it was nighttime.’” A new term was adopted. “Meredith Hom [Production Manager] and I pulled the entire crew and leadership into this mental concept that we called the ‘college project.’ For some of us, college was a time when we didn’t have titles and crafts. You begged, borrowed and stole to hit that deadline. So much of our world has become linear in our process that I wanted to break that down to, ‘No. We’re all working together. The scope of this film is too large for us to wait for each other to finish our piece. If this person is slammed, fine. Figure out a different idea to do it with what tools you have.’” Directors Domee Shi and Madeline Sharafian are drawn to chubby, exaggerated and cute characters. Forgoing the word ‘no’ led to the technology breaking down. 
“I remember times when crowds [department] is dressing all of the aliens and because of forgetting to constrain it to the Communiverse, they all show up at the origin, and you’re going, ‘Why is there a whole party going on over there?’” Sanii laughs. “On Elio, it was always forward. There were no rules about locking things down or not installing over the weekend. It was always like, ‘Put it all in, and we’ll deal with it on Monday.’ There would be some funny stuff. We never QC’d something before walking it into the room. Everyone saw how the sausage was made. It was fun and not fun for Harley Jessup [Production Designer] because sometimes there would be a big thing in the middle screen, and he would say, ‘Is that finished?’ There was no way we could get through this film if we kept trying to fix the thing that broke.” An aerial image of Elio as he attempts to get abducted by aliens. Part of the design of the Coummuniverse was inspired by Chinese puzzle balls. A former visual effects art director at ILM, Harley Jessup found his previous experiences on projects like Innerspace to be helpful on Elio. “I liked that the directors wanted to build on the effects films from the 1980s and early 1990s,” reflects Jessup. “I was there and part of that. It was fun to look back. At the time, the techniques were all practical, matte paintings and miniatures, which are fun to work with, but without the safety net of CG. One thing Dennis Muren [VES] was keen on, was how people see things like the natural phenomenon you might see in a microscopic or macro photography form. We were using that. I was looking at the mothership of Close Encounters of the Third Kind, which Dennis shot when he was a young artist. It was nice to be able to bring all of that history to this film.” Earth was impacted by a comment made by Pete Docter (CCO, Pixar). “He said, ‘The military base should feel like a parking lot,” Jessup reveals. “You should know why Elio wants to be anywhere else. 
And the Communiverse needs to be inviting. We built a lot of contrast into those two worlds. The brutalist architecture on the military base, with its hard edges and heavy horizontal forms close to the earth, needed to be harsh but beautiful in its own way, so we tried for that. The Communiverse would be in contrast and be all curves, translucent surfaces and stained-glass backlit effects. Things were wide open about what it could be because each of the aliens are from a different climate and gravity. There are some buildings that are actually upside down on it, and the whole thing is rotating inside like clockwork. It is hopefully an appealing, fun world. It’s not a dystopian outer space.” Exploring various facial expressions for Elio. A tough character to get right was Aunt Olga, who struggles to be the guardian of her nephew. Character designs of Elio and Glordon, which show them interacting with each other. Architecture was devised to reflect the desired tone for scenes. “In the Grand Assembly Hall where each alien has a desk and booth, the booth is shaped like an eyelid that can close or open,” Jessup explains. “It increases the feeling that they’re evaluating and observing Elio and each of the candidates that have come to join the Communiverse.” A couple of iconic cinematic franchises were avoided for aesthetic reasons. “As much as I love Star Wars and Star Trek, we wanted to be different from those kinds of aliens that are often more humanoid.” Ooooo was the first alien to be designed. “We did Ooooo in collaboration with the effects team, which was small at that time. She was described as a liquid supercomputer. We actually used the wireframe that was turning up and asked, what if it ended up being this network of little lights that are moving around and can express how much she was thinking? Ooooo is Elio’s guide to the Communiverse; her body would deform, so she could become a big screen or reach out and pluck things. 
Ooooo has an ability like an amoeba to stretch.” Flexibility is important when figuring out shot design. “On Elio, we provided the layout department with a rudimentary version of our environments,” states David Luoh, Sets Supervisor. “It might be simple geometry. We’re not worried necessarily about shading, color and material yet. Things are roughly in place but also built in a way that is flexible. As they’re sorting out the camera and testing out staging, they can move elements of the set around. Maybe this architectural piece needs to be shifted or larger or smaller. There was a variation on what was typically expected of set deliveries of environments to our layout department. That bar was lowered to give the layout department something to work with sooner and also with more flexibility. From their work we get context as to how we partner with our art and design department to build and finalize those environments.” Regional biomes known as disks are part of the Communiverse. “There are aquatic, lush forest, snow and ice, and hot lava disks,” Luoh remarks. “The hot disk is grounded in the desert, volcanic rock and lava, while for the lush disk we looked at interesting plant life found in the world around us.” The Communiverse is a complex geometric form. “We wanted these natural arrangements of alien districts, and that was all happening on this twisting and curving terrain in a way that made traditional dressing approaches clunky. Oftentimes, you’re putting something on the ground or mounted, and the ground is always facing upward. But if you have to dress the wall or ceiling, it becomes a lot more difficult to manipulate and place on something with that dynamic and shape. You have stuff that casts light, is see-through and shifting over time. 
Ooooo is a living character that looks like electronic circuitry that is constantly moving, and we also have that element in the walls, floors and bubble transport that carry the characters around.” Sets were adjusted throughout the production. “We try to anticipate situations that might come up,” Luoh states. “What if we have a series of shots where you’re getting closer and closer to the Communiverse and you have to bridge the distance between your hero and set extension background? There is a partnership with story, but certainly with our layout camera staging department. As we see shots come out of their work, we know where we need to spend the time to figure out, are we going to see the distant hills in this way? We’re not going to build it until we know because it can be labor-intensive. There is a responsiveness to what we are starting to see as shots get made.” Combining the familiar into something unfamiliar was a process. “There was this curation of being inspired by existing alien sci-fi depictions, but also reaching back into biological phenomena or interesting material because we wanted to ground a lot of those visual elements and ideas in something that people could intuitively grasp on to, even if they were combined or arranged in a way that is surprising, strange and delightful.”
  • AI robots help nurses beat burnout and transform hospital care

    Tech: Hospitals using AI-powered robots to support nurses, redefine patient care
    Published June 4, 2025 6:00am EDT
    Artificial intelligence and robotics may help with nursing shortage.
    The global healthcare system is expected to face a shortage of 4.5 million nurses by 2030, with burnout identified as a leading cause of this deficit. In response, Taiwan's hospitals are taking decisive action by integrating artificial intelligence and robotics to support their staff and maintain high standards of patient care.

    Nurabot: The AI nursing robot changing patient care
    Nurabot, a collaborative nursing robot developed by Foxconn and Kawasaki Heavy Industries with Nvidia's AI technology, is designed to take on some of the most physically demanding and repetitive tasks in clinical care. These include delivering medications, transporting samples, patrolling wards and guiding visitors through hospital corridors. By handling these responsibilities, Nurabot allows nurses to focus on more meaningful aspects of patient care and helps reduce the physical fatigue that often leads to burnout.

    Using AI to build the hospitals of the future
    Foxconn's approach to smart hospitals goes beyond deploying robots. The company has developed a suite of digital tools using Nvidia platforms, including AI models that monitor patient vitals and digital twins that simulate hospital environments for planning and training purposes. The process starts in the data center, where large AI models are trained on Nvidia supercomputers. Hospitals then use digital twins to test and train robots in virtual settings before deploying them in real-world scenarios, ensuring that these systems are both safe and effective.

    AI robots in real hospitals: Results from Taiwan's healthcare system
    Taichung Veterans General Hospital (TCVGH), along with other top hospitals in Taiwan, is at the forefront of this digital transformation. TCVGH has built digital twins of its wards and nursing stations, providing a virtual training ground for Nurabot before it is introduced to real hospital floors. According to Shu-Fang Liu, deputy director of the nursing department at TCVGH, robots like Nurabot are augmenting the capabilities of healthcare staff, enabling them to deliver more focused and meaningful care to patients.

    Ways Nurabot reduces nurse burnout and boosts efficiency
    Nurabot is already making a difference in daily hospital operations. The robot handles medicine deliveries, ward patrols and visitor guidance, which Foxconn estimates can reduce nurse workloads by up to 30%. In one ward, Nurabot delivers wound care kits and educational materials directly to patient bedsides, saving nurses multiple trips to supply rooms and allowing them to dedicate more time to their patients. The robot is also especially helpful during visiting hours and night shifts, when staffing levels are typically lower. Nurses hope future versions of Nurabot will be able to converse with patients in multiple languages, recognize faces for personalized interactions and even assist with lifting patients when needed. For example, a lung patient who needs two nurses to sit up for breathing exercises might only require one nurse with Nurabot's help, freeing the other to care for other patients.

    Kurt's key takeaways
    When it comes to addressing the nursing shortage, Taiwan is demonstrating that AI and robotics can make a significant difference in hospitals. Instead of spending their shifts running errands or handling repetitive tasks, nurses now have robots like Nurabot to lend a hand. This means nurses can focus their energy on what matters most – caring for patients – while robots handle tasks such as delivering medication or guiding visitors around the hospital. It's a team effort between people and technology, and it's already helping healthcare staff provide better care for everyone.

    How would you feel if a robot, not a human, delivered your medication during a hospital stay? Let us know by writing us at Cyberguy.com/Contact. For more of my tech tips and security alerts, subscribe to my free CyberGuy Report Newsletter by heading to Cyberguy.com/Newsletter.

    Copyright 2025 CyberGuy.com. All rights reserved. Kurt "CyberGuy" Knutsson is an award-winning tech journalist who has a deep love of technology, gear and gadgets that make life better with his contributions for Fox News & FOX Business beginning mornings on "FOX & Friends."
  • The Download: AI’s role in math, and calculating its energy footprint

    This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

    What’s next for AI and math

    The modern world is built on mathematics. Math lets us model complex systems such as the way air flows around an aircraft, the way financial markets fluctuate, and the way blood flows through the heart. Mathematicians have used computers for decades, but the new vision is that AI might help them crack problems that were previously uncrackable.  

    However, there’s a huge difference between AI that can solve the kinds of problems set in high school—math that the latest generation of models has already mastered—and AI that could (in theory) solve the kinds of problems that professional mathematicians spend careers chipping away at. Here are three ways to understand that gulf. 

    —Will Douglas Heaven

    This story is from our What’s Next series, which looks across industries, trends, and technologies to give you a first look at the future. You can read the rest of them here.

    Inside the effort to tally AI’s energy appetite

    —James O’Donnell

    After working on it for months, my colleague Casey Crownhart and I finally saw our story on AI’s energy and emissions burden go live last week. 

    The initial goal sounded simple: Calculate how much energy is used when we interact with a chatbot, then tally that up to understand why leaders in tech and politics are so keen to harness unprecedented levels of electricity to power AI and reshape our energy grids in the process. It was, of course, not so simple. After speaking with dozens of researchers, we realized that the common understanding of AI’s energy appetite is full of holes. I encourage you to read the full story, which has some incredible graphics to help you understand this topic. But here are three takeaways I have after the project.

    This story originally appeared in The Algorithm, our weekly newsletter on AI. To get it in your inbox first, sign up here, and check out the rest of our Power Hungry package about AI here.

    The must-reads

    I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

    1 Elon Musk has turned on Trump
    He called Trump’s domestic policy agenda a “disgusting abomination.” (NYT $)
    + House Speaker Mike Johnson has, naturally, hit back. (Insider $)

    2 NASA is in crisis
    Its budget has been cut by a quarter, and now its new leader has had his nomination revoked. (New Scientist $)
    + What’s next for NASA’s giant moon rocket? (MIT Technology Review)

    3 Here’s how Big Tech plans to wield AI
    To build ‘everything apps’ that keep you inside their ecosystem, forever. (The Atlantic $)
    + The trouble is, the experience isn’t always slick enough, as Google has discovered with its ‘Ask Photos’ feature. (The Verge $)
    + How to fight your instinct to blindly trust AI. (WP $)

    4 Meta has signed a 20-year deal to buy nuclear power
    It’s the latest in a race to try to keep up with AI’s surging energy demands. (ABC)
    + Can nuclear power really fuel the rise of AI? (MIT Technology Review)

    5 Extreme heat takes a huge toll on people’s mental health
    It’s yet another issue we’re failing to prepare for, as summers get hotter and hotter. (Scientific American $)
    + The quest to protect farmworkers from extreme heat. (MIT Technology Review)

    6 China’s robotaxi companies are planning to expand in the Middle East
    And they’re getting a warmer welcome than in the US or Europe. (WSJ $)
    + China’s EV giants are also betting big on humanoid robots. (MIT Technology Review)

    7 AI will supercharge hackers
    The full impact of new AI techniques is yet to be felt, but experts say it’s only a matter of time. (Wired $)
    + Five ways criminals are using AI. (MIT Technology Review)

    8 It’s an exciting time to be working on Alzheimer’s treatments
    12 of them are moving to the final phase of clinical trials this year. (The Economist $)
    + The innovation that gets an Alzheimer’s drug through the blood-brain barrier. (MIT Technology Review)

    9 Workers are being subjected to more and more surveillance
    Not just in the gig economy either—’bossware’ is increasingly appearing in offices too. (Rest of World)

    10 Noughties nostalgia is rife on TikTok
    It was a pretty fun decade, to be fair. (The Guardian)

    Quote of the day

     “This is scientific heaven. Or it used to be.”

    —Tom Rapoport, a 77-year-old Harvard Medical School professor from Germany, expresses his sadness about Trump’s cuts to US science funding to the New York Times. 

    One more thing


    What’s next for the world’s fastest supercomputers

    When the Frontier supercomputer came online in 2022, it marked the dawn of so-called exascale computing, with machines that can execute an exaflop—or a quintillion (10¹⁸) floating-point operations a second. Since then, scientists have geared up to make more of these blazingly fast computers: several exascale machines are due to come online in the US and Europe. But speed itself isn’t the endgame. Researchers hope to pursue previously unanswerable questions about nature—and to design new technologies in areas from transportation to medicine. Read the full story.

    —Sophia Chen

    We can still have nice things

    A place for comfort, fun and distraction to brighten up your day.
    + If tracking tube trains in London is your thing, you’ll love this live map.
    + Take a truly bonkers trip down memory lane, courtesy of these FBI artifacts.
    + Netflix’s Frankenstein looks pretty intense.
    + Why landlines are so darn spooky
    #download #ais #role #math #calculating
    The Download: AI’s role in math, and calculating its energy footprint
    This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology. What’s next for AI and math The modern world is built on mathematics. Math lets us model complex systems such as the way air flows around an aircraft, the way financial markets fluctuate, and the way blood flows through the heart. Mathematicians have used computers for decades, but the new vision is that AI might help them crack problems that were previously uncrackable.   However, there’s a huge difference between AI that can solve the kinds of problems set in high school—math that the latest generation of models has already mastered—and AI that couldsolve the kinds of problems that professional mathematicians spend careers chipping away at. Here are three ways to understand that gulf.  —Will Douglas HeavenThis story is from our What’s Next series, which looks across industries, trends, and technologies to give you a first look at the future. You can read the rest of them here. Inside the effort to tally AI’s energy appetite —James O’Donnell After working on it for months, my colleague Casey Crownhart and I finally saw our story on AI’s energy and emissions burden go live last week.  The initial goal sounded simple: Calculate how much energy is used when we interact with a chatbot, then tally that up to understand why leaders in tech and politics are so keen to harness unprecedented levels of electricity to power AI and reshape our energy grids in the process.It was, of course, not so simple. After speaking with dozens of researchers, we realized that the common understanding of AI’s energy appetite is full of holes. I encourage you to read the full story, which has some incredible graphics to help you understand this topic. But here are three takeaways I have after the project. This story originally appeared in The Algorithm, our weekly newsletter on AI. 
To get it in your inbox first, sign up here, and check out the rest of our Power Hungry package about AI here. The must-reads I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology. 1 Elon Musk has turned on Trump He called Trump’s domestic policy agenda a “disgusting abomination.”+ House Speaker Mike Johnson has, naturally, hit back. 2 NASA is in crisisIts budget has been cut by a quarter, and now its new leader has had his nomination revoked.+ What’s next for NASA’s giant moon rocket? 3 Here’s how Big Tech plans to wield AITo build ‘everything apps’ that keep you inside their ecosystem, forever.+ The trouble is, the experience isn’t always slick enough, as Google has discovered with its ‘Ask Photos’ feature.+ How to fight your instinct to blindly trust AI. 4 Meta has signed a 20-year deal to buy nuclear power It’s the latest in a race to try to keep up with AI’s surging energy demands.+ Can nuclear power really fuel the rise of AI?  5 Extreme heat takes a huge toll on people’s mental healthIt’s yet another issue we’re failing to prepare for, as summers get hotter and hotter.+ The quest to protect farmworkers from extreme heat. 6 China’s robotaxi companies are planning to expand in the Middle East And they’re getting a warmer welcome than in the US or Europe.+ China’s EV giants are also betting big on humanoid robots. 7 AI will supercharge hackersThe full impact of new AI techniques is yet to be felt, but experts say it’s only a matter of time.+ Five ways criminals are using AI. 8 It’s an exciting time to be working on Alzheimer’s treatments 12 of them are moving to the final phase of clinical trials this year.+ The innovation that gets an Alzheimer’s drug through the blood-brain barrier. 
9 Workers are being subjected to more and more surveillanceNot just in the gig economy either—’bossware’ is increasingly appearing in offices too.10 Noughties nostalgia is rife on TikTokIt was a pretty fun decade, to be fair.Quote of the day  “This is scientific heaven. Or it used to be.” —Tom Rapoport, a 77-year-old Harvard Medical School professor from Germany, expresses his sadness about Trump’s cuts to US science funding to the New York Times.  One more thing OLCF What’s next for the world’s fastest supercomputers When the Frontier supercomputer came online in 2022, it marked the dawn of so-called exascale computing, with machines that can execute an exaflop—or a quintillionfloating point operations a second.Since then, scientists have geared up to make more of these blazingly fast computers: several exascale machines are due to come online in the US and Europe.But speed itself isn’t the endgame. Researchers hope to pursue previously unanswerable questions about nature—and to design new technologies in areas from transportation to medicine. Read the full story. —Sophia Chen We can still have nice things A place for comfort, fun and distraction to brighten up your day.+ If tracking tube trains in London is your thing, you’ll love this live map.+ Take a truly bonkers trip down memory lane, courtesy of these FBI artifacts.+ Netflix’s Frankenstein looks pretty intense.+ Why landlines are so darn spooky #download #ais #role #math #calculating
    WWW.TECHNOLOGYREVIEW.COM
    The Download: AI’s role in math, and calculating its energy footprint
    This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology. What’s next for AI and math The modern world is built on mathematics. Math lets us model complex systems such as the way air flows around an aircraft, the way financial markets fluctuate, and the way blood flows through the heart. Mathematicians have used computers for decades, but the new vision is that AI might help them crack problems that were previously uncrackable.   However, there’s a huge difference between AI that can solve the kinds of problems set in high school—math that the latest generation of models has already mastered—and AI that could (in theory) solve the kinds of problems that professional mathematicians spend careers chipping away at. Here are three ways to understand that gulf.  —Will Douglas HeavenThis story is from our What’s Next series, which looks across industries, trends, and technologies to give you a first look at the future. You can read the rest of them here. Inside the effort to tally AI’s energy appetite —James O’Donnell After working on it for months, my colleague Casey Crownhart and I finally saw our story on AI’s energy and emissions burden go live last week.  The initial goal sounded simple: Calculate how much energy is used when we interact with a chatbot, then tally that up to understand why leaders in tech and politics are so keen to harness unprecedented levels of electricity to power AI and reshape our energy grids in the process.It was, of course, not so simple. After speaking with dozens of researchers, we realized that the common understanding of AI’s energy appetite is full of holes. I encourage you to read the full story, which has some incredible graphics to help you understand this topic. But here are three takeaways I have after the project. This story originally appeared in The Algorithm, our weekly newsletter on AI. 
To get it in your inbox first, sign up here, and check out the rest of our Power Hungry package about AI here.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 Elon Musk has turned on Trump
He called Trump’s domestic policy agenda a “disgusting abomination.” (NYT $)
+ House Speaker Mike Johnson has, naturally, hit back. (Insider $)

2 NASA is in crisis
Its budget has been cut by a quarter, and now its new leader has had his nomination revoked. (New Scientist $)
+ What’s next for NASA’s giant moon rocket? (MIT Technology Review)

3 Here’s how Big Tech plans to wield AI
To build ‘everything apps’ that keep you inside their ecosystem, forever. (The Atlantic $)
+ The trouble is, the experience isn’t always slick enough, as Google has discovered with its ‘Ask Photos’ feature. (The Verge $)
+ How to fight your instinct to blindly trust AI. (WP $)

4 Meta has signed a 20-year deal to buy nuclear power
It’s the latest in a race to try to keep up with AI’s surging energy demands. (ABC)
+ Can nuclear power really fuel the rise of AI? (MIT Technology Review)

5 Extreme heat takes a huge toll on people’s mental health
It’s yet another issue we’re failing to prepare for, as summers get hotter and hotter. (Scientific American $)
+ The quest to protect farmworkers from extreme heat. (MIT Technology Review)

6 China’s robotaxi companies are planning to expand in the Middle East
And they’re getting a warmer welcome than in the US or Europe. (WSJ $)
+ China’s EV giants are also betting big on humanoid robots. (MIT Technology Review)

7 AI will supercharge hackers
The full impact of new AI techniques is yet to be felt, but experts say it’s only a matter of time. (Wired $)
+ Five ways criminals are using AI. (MIT Technology Review)

8 It’s an exciting time to be working on Alzheimer’s treatments
12 of them are moving to the final phase of clinical trials this year.
(The Economist $)
+ The innovation that gets an Alzheimer’s drug through the blood-brain barrier. (MIT Technology Review)

9 Workers are being subjected to more and more surveillance
Not just in the gig economy either — ‘bossware’ is increasingly appearing in offices too. (Rest of World)

10 Noughties nostalgia is rife on TikTok
It was a pretty fun decade, to be fair. (The Guardian)

Quote of the day

“This is scientific heaven. Or it used to be.”

—Tom Rapoport, a 77-year-old Harvard Medical School professor from Germany, expresses his sadness about Trump’s cuts to US science funding to the New York Times.

One more thing

What’s next for the world’s fastest supercomputers

When the Frontier supercomputer came online in 2022, it marked the dawn of so-called exascale computing, with machines that can execute an exaflop, or a quintillion (10^18) floating-point operations a second.

Since then, scientists have geared up to make more of these blazingly fast computers: several exascale machines are due to come online in the US and Europe. But speed itself isn’t the endgame. Researchers hope to pursue previously unanswerable questions about nature, and to design new technologies in areas from transportation to medicine. Read the full story.

—Sophia Chen

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or skeet ’em at me.)

+ If tracking tube trains in London is your thing, you’ll love this live map.
+ Take a truly bonkers trip down memory lane, courtesy of these FBI artifacts.
+ Netflix’s Frankenstein looks pretty intense.
+ Why landlines are so darn spooky
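The exaflop figure above is hard to intuit, so here is a minimal Python sketch that puts it in context. The 10-teraflop "capable consumer machine" baseline is an assumption for illustration (roughly a modern GPU-equipped desktop), not a number from the story; only the 10^18 operations-per-second definition of an exaflop comes from the text.

```python
# An exaflop is a quintillion (10**18) floating-point operations per second,
# the threshold Frontier crossed in 2022.
EXAFLOP = 10**18  # FLOP/s

# Assumed baseline: ~10 teraflops for a capable consumer machine
# (an illustrative figure, not from the story).
CONSUMER_FLOPS = 10 * 10**12  # FLOP/s

speedup = EXAFLOP // CONSUMER_FLOPS
print(f"One exascale machine ~= {speedup:,} such consumer machines")

# One second of exascale work, run on that consumer machine instead:
seconds = EXAFLOP / CONSUMER_FLOPS
print(f"1 exascale-second ~= {seconds / 86_400:.1f} days of consumer compute")
```

The point of the ratio is scale: even against a generous consumer baseline, a single second of exascale computation corresponds to over a day of desktop work.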