• Black Death Bacterium Evolved to be Less Aggressive to Kill Victims Slowly

    Co-lead author Ravneet Sidhu examines an ancient human tooth at the McMaster Ancient DNA Centre. (Image Credit: McMaster University)

A new study in Science suggests that changes in a gene in Yersinia pestis, the bacterium that causes plague, could have added to the length of two plague pandemics, including the pandemic that started with the "Black Death."

"Ours is one of the first research studies to directly examine changes in an ancient pathogen, one we still see today, in an attempt to understand what drives the virulence, persistence, and eventual extinction of pandemics," said Hendrik Poinar, a study author and the director of the McMaster Ancient DNA Centre, according to a press release.

The study suggests that less virulent plague bacteria could have caused longer plague pandemics, thanks to the fact that infected rodents lived (and spread plague) for longer periods of time before dying from their infections.

Read More: Scientists Reveal the Black Death's Origin Story

The Three Plague Pandemics

The bacterium Y. pestis infects rodents and humans alike and has caused three main plague pandemics in humans, all of which continued for centuries after their initial outbreaks. The first began in the 500s; the second began in the 1300s; and the third started in the 1800s (and still continues in certain areas of Asia, Africa, and the Americas today).

Although all three pandemics were devastating at their outset, the second was by far the most severe. The Black Death, its initial outbreak, killed around 30 to 50 percent of the population of Europe between 1347 and 1352 and, to this day, represents the deadliest disease wave in recorded history.

To learn more about how these plague pandemics changed over time, scientists at McMaster University in Canada and the Institut Pasteur in France turned to a Y. pestis virulence gene known as pla. This gene is repeated many times throughout the Y. pestis genome, and it allows the bacterium to spread undetected throughout the bodies of infected individuals.

A Gene and the Plague

To investigate this gene, the scientists studied historical strains of Y. pestis from human remains and found that the number of repetitions of pla decreased over the course of the first and second plague pandemics. Then, the scientists tested Y. pestis bacteria from the third pandemic, infecting mice with three strains that had reduced repetitions of pla.

"These three samples enabled us to analyze the biological impact of these pla gene deletions," said Javier Pizarro-Cerdá, another study author and the director of the Yersinia Research Unit at the Institut Pasteur, according to the release.

The results revealed that pla depletion decreases the virulence and increases the length of plague infections in mice. According to the study authors, these changes could have caused rodents to live longer in the later stages of the first and second pandemics, allowing them to spread their infections for a longer period.

"It's important to remember that plague was an epidemic of rats, which were the drivers of epidemics and pandemics. Humans were accidental victims," Poinar added in another press release.

The Continued Threat of Y. Pestis

The pla depletion emerged around 100 years after the start of both the first and second pandemics; the scientists stress, however, that the two changes arose randomly and independently of each other.

"Our research sheds light on an interesting pattern in the evolutionary history of the plague. However, it is important to note that the majority of strains which continue to circulate today in Africa, the Americas, and Asia are highly virulent strains," said Ravneet Sidhu, another study author and a Ph.D. student at the McMaster Ancient DNA Centre.

Though still a threat to current populations, Y. pestis infections are much more manageable now as a result of modern diagnostics and treatments.

"Today, the plague is a rare disease, but one that remains a public health concern and serves as a model for gaining a broad understanding of how pandemics emerge and become extinct. This example illustrates the balance of virulence a pathogen can adopt in order to spread effectively," Pizarro-Cerdá said in the press release.

Article Sources: Science. Sam Walters is a journalist covering archaeology, paleontology, ecology, and evolution for Discover, along with an assortment of other topics. Before joining the Discover team as an assistant editor in 2022, Sam studied journalism at Northwestern University in Evanston, Illinois.
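The tradeoff the researchers describe, with lower virulence buying a longer transmission window, can be illustrated with a toy calculation. This is not the study's actual model; every number below is hypothetical, chosen only to make the tradeoff concrete.

```python
# Toy illustration of the virulence/transmission tradeoff described above.
# All numbers are hypothetical; this is NOT the study's model.

def expected_transmissions(virulence, contacts_per_day=2.0, p_transmit=0.05):
    """A host survives roughly 1/virulence days; a more virulent strain
    kills faster, leaving fewer days in which to pass the infection on."""
    days_infectious = 1.0 / virulence
    return contacts_per_day * p_transmit * days_infectious

high = expected_transmissions(virulence=0.5)   # host survives ~2 days
low = expected_transmissions(virulence=0.1)    # host survives ~10 days
print(high, low)  # the less virulent strain reaches more new hosts
```

Under these made-up parameters, the less virulent strain infects five times as many new hosts per infected rodent, which is the qualitative logic behind longer pandemics.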
  • Research roundup: 7 stories we almost missed

    Best of the rest

    Also: drumming chimpanzees, picking styles of two jazz greats, and an ancient underground city's soundscape

    Jennifer Ouellette | May 31, 2025 5:37 pm
    Time lapse photos show a new ping-pong-playing robot performing a top spin. Credit: David Nguyen, Kendrick Cancio and Sangbae Kim



    It's a regrettable reality that there is never time to cover all the interesting scientific stories we come across each month. In the past, we've featured year-end roundups of cool science stories we missed. This year, we're experimenting with a monthly collection. May's list includes a nifty experiment to make a predicted effect of special relativity visible; a ping-pong-playing robot that can return hits with 88 percent accuracy; and the discovery of the rare genetic mutation that makes orange cats orange, among other highlights.
    Special relativity made visible

    (Image credit: TU Wien)

    Perhaps the most well-known feature of Albert Einstein's special theory of relativity is time dilation and length contraction. In 1959, two physicists predicted another feature of relativistic motion: an object moving near the speed of light should also appear to be rotated. It's not been possible to demonstrate this experimentally, however—until now. Physicists at the Vienna University of Technology figured out how to reproduce this rotational effect in the lab using laser pulses and precision cameras, according to a paper published in the journal Communications Physics.
    They found their inspiration in art, specifically an earlier collaboration with an artist named Enar de Dios Rodriguez, who collaborated with VUT and the University of Vienna on a project involving ultra-fast photography and slow light. For this latest research, they used objects shaped like a cube and a sphere and moved them around the lab while zapping them with ultrashort laser pulses, recording the flashes with a high-speed camera.
    Getting the timing just right effectively mimics a light speed of just 2 meters per second. After photographing the objects many times using this method, the team combined the still images into a single image. The results: the cube looked twisted and the sphere's North Pole was in a different location, a demonstration of the rotational effect predicted back in 1959.
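The 1959 prediction, often called the Terrell-Penrose effect, has a compact geometric statement. For an object crossing the line of sight at speed v = βc, light from the trailing face must leave earlier to arrive at the camera simultaneously, so that face becomes visible with apparent width βℓ, while the side face is length-contracted:

```latex
% Terrell-Penrose rotation: apparent geometry of a cube of side \ell
% moving at speed v = \beta c across the line of sight.
\text{side face (length-contracted): } \ell\sqrt{1-\beta^2},
\qquad
\text{trailing face (light-travel delay): } \beta\ell
% Together these match the projection of a cube rotated by angle \theta:
\cos\theta = \sqrt{1-\beta^2}, \quad \sin\theta = \beta
\;\Longrightarrow\; \theta = \arcsin\beta .
```

So the object appears rotated rather than squashed; at β = 0.8, for instance, the apparent rotation is arcsin(0.8), roughly 53 degrees.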

    DOI: Communications Physics, 2025. 10.1038/s42005-025-02003-6
    Drumming chimpanzees

    A chimpanzee feeling the rhythm. Credit: Current Biology/Eleuteri et al., 2025.

    Chimpanzees are known to "drum" on the roots of trees as a means of communication, often combining that action with what are known as "pant-hoot" vocalizations. Scientists have found that the chimps' drumming exhibits key elements of musical rhythm much like humans do, according to a paper published in the journal Current Biology: specifically, non-random timing and isochrony. And chimps from different geographical regions have different drumming rhythms.
    Back in 2022, the same team observed that individual chimps had unique styles of "buttress drumming," which served as a kind of communication, letting others in the same group know their identity, location, and activity. This time around they wanted to know if this was also true of chimps living in different groups and whether their drumming was rhythmic in nature. So they collected video footage of the drumming behavior among 11 chimpanzee communities across six populations in East Africa and West Africa, amounting to 371 drumming bouts.
    Their analysis of the drum patterns confirmed their hypothesis. The western chimps drummed in regularly spaced hits, used faster tempos, and started drumming earlier during their pant-hoot vocalizations. Eastern chimps would alternate between shorter and longer spaced hits. Since this kind of rhythmic percussion is one of the earliest evolved forms of human musical expression and is ubiquitous across cultures, findings such as this could shed light on how our love of rhythm evolved.
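Isochrony simply means evenly spaced hits. A minimal way to quantify it, which is a generic sketch and not the authors' analysis pipeline, is the coefficient of variation of the inter-onset intervals; the timestamps below are invented for illustration:

```python
# Generic isochrony measure for drum-hit timestamps (in seconds).
# The onset times below are hypothetical, not data from the study.

def isochrony_cv(onsets):
    """Coefficient of variation of inter-onset intervals.
    Near 0 means evenly spaced (isochronous); larger means irregular."""
    intervals = [b - a for a, b in zip(onsets, onsets[1:])]
    mean = sum(intervals) / len(intervals)
    var = sum((x - mean) ** 2 for x in intervals) / len(intervals)
    return (var ** 0.5) / mean

western = [0.0, 0.25, 0.50, 0.75, 1.00]   # regularly spaced, faster tempo
eastern = [0.0, 0.20, 0.65, 0.85, 1.30]   # short/long alternation
print(isochrony_cv(western), isochrony_cv(eastern))
```

The regularly spaced sequence scores near zero while the alternating one scores well above it, mirroring the western/eastern contrast the paper reports.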
    DOI: Current Biology, 2025. 10.1016/j.cub.2025.04.019
    Distinctive styles of two jazz greats

    Jazz lovers likely need no introduction to Joe Pass and Wes Montgomery, 20th century guitarists who influenced generations of jazz musicians with their innovative techniques. Montgomery, for instance, didn't use a pick, preferring to pluck the strings with his thumb—a method he developed because he practiced at night after working all day as a machinist and didn't want to wake his children or neighbors. Pass developed his own range of picking techniques, including fingerpicking, hybrid picking, and "flat picking."
    Chirag Gokani and Preston Wilson, both with Applied Research Laboratories and the University of Texas, Austin, greatly admired both Pass and Montgomery and decided to explore the underlying acoustics of their distinctive playing, modeling the interactions of the thumb, fingers, and pick with a guitar string. They described their research during a meeting of the Acoustical Society of America in New Orleans, LA.
    Among their findings: Montgomery achieved his warm tone by playing closer to the bridge and mostly plucking at the string. Pass's rich tone arose from a combination of using a pick and playing closer to the guitar neck. There were also differences in how much a thumb, finger, and pick slip off the string: use of the thumb produced more of a "pluck" compared to the pick, which produced more of a "strike." Gokani and Wilson think their model could be used to synthesize digital guitars with a more realistic sound, as well as helping guitarists better emulate Pass and Montgomery.
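The pluck-versus-strike distinction comes down to how the string is excited. One standard way to hear that difference in a digital string is Karplus-Strong synthesis; this is a classic textbook algorithm, not the model Gokani and Wilson presented, and the excitation choices here are simplified stand-ins:

```python
# Karplus-Strong plucked-string synthesis (a standard algorithm, not the
# authors' model). The initial buffer is the excitation: broadband noise
# gives a bright "pluck"; a single impulse is closer to a "strike."
import random

def karplus_strong(freq, duration, sample_rate=44100, excitation="pluck"):
    n = int(sample_rate / freq)          # delay-line length sets the pitch
    if excitation == "pluck":
        buf = [random.uniform(-1.0, 1.0) for _ in range(n)]
    else:                                # "strike": a single impulse
        buf = [1.0] + [0.0] * (n - 1)
    out = []
    for _ in range(int(sample_rate * duration)):
        out.append(buf[0])
        # Averaging adjacent samples acts as a gentle low-pass filter,
        # so the tone decays and mellows the way a real string does.
        buf.append(0.5 * (buf[0] + buf[1]))
        buf.pop(0)
    return out

samples = karplus_strong(220.0, 0.5)     # half a second of A3
```

Swapping the excitation while keeping the same delay line changes the attack character without changing the pitch, which is the same kind of separation the researchers' string model exploits.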
    Sounds of an ancient underground city

    (Image credit: Sezin Nas)

    Turkey is home to the underground city Derinkuyu, originally carved out inside soft volcanic rock around the 8th century BCE. It was later expanded to include four main ventilation channels serving seven levels, which could be closed off from the inside with a large rolling stone. The city could hold up to 20,000 people and it was connected to another underground city, Kaymakli, via tunnels. Derinkuyu helped protect Arab Muslims during the Arab-Byzantine wars, served as a refuge from the Ottomans in the 14th century, and as a haven for Armenians escaping persecution in the early 20th century, among other functions.

    The tunnels were rediscovered in the 1960s and about half of the city has been open to visitors since 2016. The site is naturally of great archaeological interest, but there has been little to no research on the acoustics of the site, particularly the ventilation channels—one of Derinkuyu's most unique features, according to Sezin Nas, an architectural acoustician at Istanbul Galata University in Turkey.  She gave a talk at a meeting of the Acoustical Society of America in New Orleans, LA, about her work on the site's acoustic environment.
    Nas analyzed a church, a living area, and a kitchen, measuring sound sources and reverberation patterns, among other factors, to create a 3D virtual soundscape. The hope is that a better understanding of this aspect of Derinkuyu could improve the design of future underground urban spaces—as well as one day using her virtual soundscape to enable visitors to experience the sounds of the city themselves.
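Reverberation, one of the quantities such room-acoustics surveys measure, is commonly summarized with Sabine's formula, RT60 = 0.161 V / A. The sketch below uses that standard formula with entirely made-up room values, not Nas's measurements from Derinkuyu:

```python
# Sabine's reverberation-time estimate: RT60 = 0.161 * V / A, where
# V is room volume (m^3) and A is total absorption (m^2 sabins).
# The room values below are hypothetical, not Derinkuyu measurements.

def rt60_sabine(volume_m3, surface_m2, avg_absorption):
    absorption = surface_m2 * avg_absorption   # total absorption area
    return 0.161 * volume_m3 / absorption

# Rock-cut chambers have hard, reflective surfaces (low absorption
# coefficients), so even a modest room rings for a long time:
print(rt60_sabine(volume_m3=200.0, surface_m2=220.0, avg_absorption=0.05))
```

With those assumed values the estimate comes out near three seconds, which hints at why the acoustics of carved-stone spaces are distinctive enough to be worth mapping.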
    MIT's latest ping-pong robot
    Robots playing ping-pong have been a thing since the 1980s, of particular interest to scientists because it requires the robot to combine the slow, precise ability to grasp and pick up objects with dynamic, adaptable locomotion. Such robots need high-speed machine vision, fast motors and actuators, precise control, and the ability to make accurate predictions in real time, not to mention being able to develop a game strategy. More recent designs use AI techniques to allow the robots to "learn" from prior data to improve their performance.
    MIT researchers have built their own version of a ping-pong playing robot, incorporating a lightweight design and the ability to precisely return shots. They built on prior work developing the Humanoid, a small bipedal two-armed robot—specifically, modifying the Humanoid's arm by adding an extra degree of freedom to the wrist so the robot could control a ping-pong paddle. They tested their robot by mounting it on a ping-pong table and lobbing 150 balls at it from the other side of the table, capturing the action with high-speed cameras.

    The new bot can execute three different swing types, and during the trial runs it returned the ball with impressive accuracy across all three: 88.4 percent, 89.2 percent, and 87.5 percent, respectively. Subsequent tweaks to their system brought the robot's strike speed up to 19 meters per second, within the 12 to 25 meters per second range of advanced human players. The addition of control algorithms gave the robot the ability to aim. The robot still has limited mobility and reach because it has to be fixed to the ping-pong table, but the MIT researchers plan to rig it to a gantry or wheeled platform in the future to address that shortcoming.
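The three per-swing figures average out to the roughly 88 percent quoted in the introduction; since the per-swing ball counts aren't reported, this is an unweighted mean:

```python
# Unweighted mean of the three per-swing return accuracies reported for
# the MIT robot (per-swing ball counts are not given in the article).
rates = [88.4, 89.2, 87.5]
overall = sum(rates) / len(rates)
print(round(overall, 1))  # 88.4
```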
    Why orange cats are orange

    (Image credit: Astropulse/CC BY-SA 3.0)

    Cat lovers know orange cats are special for more than their unique coloring, but that's the quality that has intrigued scientists for almost a century. Sure, lots of animals have orange, ginger, or yellow hues, like tigers, orangutans, and golden retrievers. But in domestic cats that color is specifically linked to sex. Almost all orange cats are male. Scientists have now identified the genetic mutation responsible and it appears to be unique to cats, according to a paper published in the journal Current Biology.
    Prior work had narrowed down the region on the X chromosome most likely to contain the relevant mutation. The scientists knew that females usually have just one copy of the mutation and in that case have tortoiseshell coloring, although in rare cases, a female cat will be orange if both X chromosomes have the mutation. Over the last five to ten years, there has been an explosion in genome resources for cats, which greatly aided the team's research, along with taking additional DNA samples from cats at spay and neuter clinics.

    From an initial pool of 51 candidate variants, the scientists narrowed it down to three genes, only one of which was likely to play any role in gene regulation: Arhgap36. It wasn't known to play any role in pigment cells in humans, mice, or non-orange cats. But orange cats are special; their mutation turns on Arhgap36 expression in pigment cells, thereby interfering with the molecular pathway that controls coat color in other orange-shaded mammals. The scientists suggest that this is an example of how genes can acquire new functions, thereby enabling species to better adapt and evolve.
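The sex skew follows directly from X-linkage: a male (XY) needs only one mutant copy to be orange, while a female (XX) needs two. A back-of-the-envelope sketch, with an allele frequency chosen purely for illustration:

```python
# Why orange cats are mostly male: the mutation is X-linked.
# Males (XY) carry one X, so one mutant copy suffices; females (XX)
# need mutant copies on both Xs. The allele frequency q is hypothetical.
q = 0.2
orange_males = q                           # single X carries the mutation
orange_females = q * q                     # both Xs must carry it
tortoiseshell_females = 2 * q * (1 - q)    # one mutant X: patchy coat
print(orange_males, orange_females)        # males far outnumber females
```

With q = 0.2, one male in five is orange but only one female in twenty-five, matching the article's observation that almost all orange cats are male and that single-copy females are tortoiseshell instead.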
    DOI: Current Biology, 2025. 10.1016/j.cub.2025.03.075
    Not a Roman "massacre" after all

    (Image credit: Martin Smith)

    In 1936, archaeologists excavating the Iron Age hill fort Maiden Castle in the UK unearthed dozens of human skeletons, all showing signs of lethal injuries to the head and upper body—likely inflicted with weaponry. At the time, this was interpreted as evidence of a pitched battle between the Britons of the local Durotriges tribe and invading Romans. The Romans slaughtered the native inhabitants, thereby bringing a sudden violent end to the Iron Age. At least that's the popular narrative that has prevailed ever since in countless popular articles, books, and documentaries.
    But a paper published in the Oxford Journal of Archaeology calls that narrative into question. Archaeologists at Bournemouth University have re-analyzed those burials, incorporating radiocarbon dating into their efforts. They concluded that those individuals didn't die in a single brutal battle. Rather, it was Britons killing other Britons over multiple generations between the first century BCE and the first century CE—most likely in periodic localized outbursts of violence in the lead-up to the Roman conquest of Britain. It's possible there are still many human remains waiting to be discovered at the site, which could shed further light on what happened at Maiden Castle.
    DOI: Oxford Journal of Archaeology, 2025. 10.1111/ojoa.12324

    Jennifer Ouellette
    Senior Writer

    Jennifer is a senior writer at Ars Technica with a particular focus on where science meets culture, covering everything from physics and related interdisciplinary topics to her favorite films and TV series. Jennifer lives in Baltimore with her spouse, physicist Sean M. Carroll, and their two cats, Ariel and Caliban.

    #research #roundup #stories #almost #missed
    Research roundup: 7 stories we almost missed
    Best of the rest Research roundup: 7 stories we almost missed Also: drumming chimpanzees, picking styles of two jazz greats, and an ancient underground city's soundscape Jennifer Ouellette – May 31, 2025 5:37 pm | 4 Time lapse photos show a new ping-pong-playing robot performing a top spin. Credit: David Nguyen, Kendrick Cancio and Sangbae Kim Time lapse photos show a new ping-pong-playing robot performing a top spin. Credit: David Nguyen, Kendrick Cancio and Sangbae Kim Story text Size Small Standard Large Width * Standard Wide Links Standard Orange * Subscribers only   Learn more It's a regrettable reality that there is never time to cover all the interesting scientific stories we come across each month. In the past, we've featured year-end roundups of cool science stories wemissed. This year, we're experimenting with a monthly collection. May's list includes a nifty experiment to make a predicted effect of special relativity visible; a ping-pong playing robot that can return hits with 88 percent accuracy; and the discovery of the rare genetic mutation that makes orange cats orange, among other highlights. Special relativity made visible Credit: TU Wien Perhaps the most well-known feature of Albert Einstein's special theory of relativity is time dilation and length contraction. In 1959, two physicists predicted another feature of relativistic motion: an object moving near the speed of light should also appear to be rotated. It's not been possible to demonstrate this experimentally, however—until now. Physicists at the Vienna University of Technology figured out how to reproduce this rotational effect in the lab using laser pulses and precision cameras, according to a paper published in the journal Communications Physics. They found their inspiration in art, specifically an earlier collaboration with an artist named Enar de Dios Rodriguez, who collaborated with VUT and the University of Vienna on a project involving ultra-fast photography and slow light. 
For this latest research, they used objects shaped like a cube and a sphere and moved them around the lab while zapping them with ultrashort laser pulses, recording the flashes with a high-speed camera. Getting the timing just right effectively yields similar results to a light speed of 2 m/s. After photographing the objects many times using this method, the team then combined the still images into a single image. The results: the cube looked twisted and the sphere's North Pole was in a different location—a demonstration of the rotational effect predicted back in 1959. DOI: Communications Physics, 2025. 10.1038/s42005-025-02003-6  . Drumming chimpanzees A chimpanzee feeling the rhythm. Credit: Current Biology/Eleuteri et al., 2025. Chimpanzees are known to "drum" on the roots of trees as a means of communication, often combining that action with what are known as "pant-hoot" vocalizations. Scientists have found that the chimps' drumming exhibits key elements of musical rhythm much like humans, according to  a paper published in the journal Current Biology—specifically non-random timing and isochrony. And chimps from different geographical regions have different drumming rhythms. Back in 2022, the same team observed that individual chimps had unique styles of "buttress drumming," which served as a kind of communication, letting others in the same group know their identity, location, and activity. This time around they wanted to know if this was also true of chimps living in different groups and whether their drumming was rhythmic in nature. So they collected video footage of the drumming behavior among 11 chimpanzee communities across six populations in East Africaand West Africa, amounting to 371 drumming bouts. Their analysis of the drum patterns confirmed their hypothesis. The western chimps drummed in regularly spaced hits, used faster tempos, and started drumming earlier during their pant-hoot vocalizations. 
Eastern chimps would alternate between shorter and longer spaced hits. Since this kind of rhythmic percussion is one of the earliest evolved forms of human musical expression and is ubiquitous across cultures, findings like these could shed light on how our love of rhythm evolved. DOI: Current Biology, 2025. 10.1016/j.cub.2025.04.019.

Distinctive styles of two jazz greats

Jazz lovers likely need no introduction to Joe Pass and Wes Montgomery, 20th-century guitarists who influenced generations of jazz musicians with their innovative techniques. Montgomery, for instance, didn't use a pick, preferring to pluck the strings with his thumb—a method he developed because he practiced at night after working all day as a machinist and didn't want to wake his children or neighbors. Pass developed his own range of picking techniques, including fingerpicking, hybrid picking, and "flat picking."

Chirag Gokani and Preston Wilson, both with Applied Research Laboratories and the University of Texas at Austin, greatly admired Pass and Montgomery and decided to explore the underlying acoustics of their distinctive playing, modeling the interactions of the thumb, fingers, and pick with a guitar string. They described their research at a meeting of the Acoustical Society of America in New Orleans, LA. Among their findings: Montgomery achieved his warm tone by playing closer to the bridge and mostly plucking the string. Pass's rich tone arose from a combination of using a pick and playing closer to the guitar neck. There were also differences in how much a thumb, finger, and pick slip off the string: use of the thumb (Montgomery) produced more of a "pluck," compared to the pick (Pass), which produced more of a "strike." Gokani and Wilson think their model could be used to synthesize digital guitars with a more realistic sound, as well as to help guitarists better emulate Pass and Montgomery.
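A textbook idealization hints at why pluck position shapes tone: for an ideal string plucked at a fraction beta of its length, the nth harmonic's amplitude scales as sin(n*pi*beta)/n^2, so moving the pluck point rebalances the high harmonics. This sketch uses that standard physics model, not the Gokani/Wilson model itself:

```python
# Ideal plucked-string model (standard textbook result, not the
# researchers' own model): the nth harmonic of a string plucked at
# fractional position beta has amplitude proportional to
# sin(n * pi * beta) / n**2.
import math

def harmonic_amplitudes(beta, n_harmonics=8):
    """Relative amplitudes of the first n harmonics for pluck point beta."""
    return [abs(math.sin(n * math.pi * beta)) / n**2
            for n in range(1, n_harmonics + 1)]

def brightness(beta):
    """Crude tone proxy: energy in harmonics 4+ relative to the fundamental."""
    amps = harmonic_amplitudes(beta)
    return sum(a * a for a in amps[3:]) / (amps[0] ** 2)

print(brightness(0.10))  # plucked near one end of the string
print(brightness(0.45))  # plucked near the middle
```

In this idealization, a pluck near the string's end carries relatively more high-harmonic energy than a pluck near the middle; real tone, as the researchers note, also depends on what does the plucking and how it slips off the string.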
Sounds of an ancient underground city

Credit: Sezin Nas

Turkey is home to the underground city Derinkuyu, originally carved out of soft volcanic rock around the 8th century BCE. It was later expanded to include four main ventilation channels (and some 50,000 smaller shafts) serving seven levels, which could be closed off from the inside with a large rolling stone. The city could hold up to 20,000 people and was connected to another underground city, Kaymakli, via tunnels. Derinkuyu helped protect Arab Muslims during the Arab-Byzantine wars, served as a refuge from the Ottomans in the 14th century, and offered a haven for Armenians escaping persecution in the early 20th century, among other functions. The tunnels were rediscovered in the 1960s, and about half of the city has been open to visitors since 2016.

The site is naturally of great archaeological interest, but there has been little to no research on its acoustics, particularly in the ventilation channels—one of Derinkuyu's most unique features, according to Sezin Nas, an architectural acoustician at Istanbul Galata University in Turkey. She gave a talk at a meeting of the Acoustical Society of America in New Orleans, LA, about her work on the site's acoustic environment. Nas analyzed a church, a living area, and a kitchen, measuring sound sources and reverberation patterns, among other factors, to create a 3D virtual soundscape. The hope is that a better understanding of this aspect of Derinkuyu could improve the design of future underground urban spaces—and that her virtual soundscape could one day let visitors experience the sounds of the city themselves.

MIT's latest ping-pong robot

Robots playing ping-pong have been a thing since the 1980s, of particular interest to scientists because the game requires a robot to combine the slow, precise ability to grasp and pick up objects with dynamic, adaptable locomotion.
Such robots need high-speed machine vision, fast motors and actuators, precise control, and the ability to make accurate predictions in real time, not to mention the ability to develop a game strategy. More recent designs use AI techniques that allow the robots to "learn" from prior data to improve their performance.

MIT researchers have built their own version of a ping-pong-playing robot, incorporating a lightweight design and the ability to precisely return shots. They built on prior work developing the Humanoid, a small bipedal two-armed robot—specifically, modifying the Humanoid's arm by adding an extra degree of freedom to the wrist so the robot could control a ping-pong paddle. They tested their robot by mounting it on a ping-pong table and lobbing 150 balls at it from the other side of the table, capturing the action with high-speed cameras. The new bot can execute three different swing types (loop, drive, and chip), and during the trial runs it returned the ball with impressive accuracy across all three: 88.4 percent, 89.2 percent, and 87.5 percent, respectively. Subsequent tweaks to their system brought the robot's strike speed up to 19 meters per second (about 42 MPH), close to the 12 to 25 meters per second of advanced human players. The addition of control algorithms gave the robot the ability to aim.

The robot still has limited mobility and reach because it has to be fixed to the ping-pong table, but the MIT researchers plan to rig it to a gantry or wheeled platform in the future to address that shortcoming.

Why orange cats are orange

Credit: Astropulse/CC BY-SA 3.0

Cat lovers know orange cats are special for more than their unique coloring, but that coloring is the quality that has intrigued scientists for almost a century. Sure, lots of animals have orange, ginger, or yellow hues, like tigers, orangutans, and golden retrievers. But in domestic cats, that color is specifically linked to sex: almost all orange cats are male.
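The speed figures are easy to sanity-check; a quick conversion (illustrative only) confirms that 19 meters per second is indeed about 42 MPH:

```python
# Unit sanity check for the reported strike speed.
# 1 mile = 1609.344 meters, 1 hour = 3600 seconds.
def mps_to_mph(mps):
    return mps * 3600 / 1609.344

print(f"{mps_to_mph(19):.1f} MPH")   # the robot's strike speed
print(f"{mps_to_mph(12):.0f} to {mps_to_mph(25):.0f} MPH")  # human range
```

By the same conversion, the 12 to 25 m/s range of advanced human players works out to roughly 27 to 56 MPH.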
Scientists have now identified the genetic mutation responsible, and it appears to be unique to cats, according to a paper published in the journal Current Biology. Prior work had narrowed down the region on the X chromosome most likely to contain the relevant mutation. The scientists knew that females usually have just one copy of the mutation and in that case have tortoiseshell (partially orange) coloring, although in rare cases a female cat will be orange if both X chromosomes carry the mutation. Over the last five to ten years, there has been an explosion in genome resources (including complete sequenced genomes) for cats, which greatly aided the team's research, along with additional DNA samples taken from cats at spay and neuter clinics.

From an initial pool of 51 candidate variants, the scientists narrowed it down to three genes, only one of which was likely to play any role in gene regulation: Arhgap36. It wasn't known to play any role in pigment cells in humans, mice, or non-orange cats. But orange cats are special: their mutation (sex-linked orange) turns on Arhgap36 expression in pigment cells (and only pigment cells), thereby interfering with the molecular pathway that controls coat color in other orange-shaded mammals. The scientists suggest that this is an example of how genes can acquire new functions, thereby enabling species to better adapt and evolve. DOI: Current Biology, 2025. 10.1016/j.cub.2025.03.075.

Not a Roman "massacre" after all

Credit: Martin Smith

In 1936, archaeologists excavating the Iron Age hill fort Maiden Castle in the UK unearthed dozens of human skeletons, all showing signs of lethal injuries to the head and upper body—likely inflicted with weaponry. At the time, this was interpreted as evidence of a pitched battle between the Britons of the local Durotriges tribe and invading Romans: the Romans slaughtered the native inhabitants, thereby bringing a sudden, violent end to the Iron Age. At least, that's the popular narrative that has prevailed ever since in countless popular articles, books, and documentaries.
But a paper published in the Oxford Journal of Archaeology calls that narrative into question. Archaeologists at Bournemouth University have re-analyzed those burials, incorporating radiocarbon dating into their efforts. They concluded that those individuals didn't die in a single brutal battle. Rather, Britons were killing other Britons over multiple generations between the first century BCE and the first century CE—most likely in periodic, localized outbursts of violence in the lead-up to the Roman conquest of Britain. It's possible there are still many human remains waiting to be discovered at the site, which could shed further light on what happened at Maiden Castle. DOI: Oxford Journal of Archaeology, 2025. 10.1111/ojoa.12324.

Jennifer Ouellette, Senior Writer. Jennifer is a senior writer at Ars Technica with a particular focus on where science meets culture, covering everything from physics and related interdisciplinary topics to her favorite films and TV series. Jennifer lives in Baltimore with her spouse, physicist Sean M. Carroll, and their two cats, Ariel and Caliban.
  • 7,100-Year-Old Skeleton Reveals Unknown Human Lineage in China

Photo Credit: Yunnan Institute of Cultural Relics and Archaeology

    Ancient DNA from Yunnan Uncovers Ghost Lineage Linked to Tibetan Ancestry

    Highlights

    7,100-year-old Xingyi_EN carries DNA from a mysterious ghost lineage
    Study links ancient Yunnan DNA to modern Tibetan ancestry
    Central Yunnan ancestry tied to early Austroasiatic populations


A new study on a 7,100-year-old skeleton from China has revealed a "ghost" lineage that until now existed only in theory. The skeleton of the early Neolithic woman, known as Xingyi_EN, was unearthed at the Xingyi archaeological site in southwestern China's Yunnan province. Her DNA links her to a deeply divergent human population that may have contributed to the ancestry of modern Tibetans. The study also reveals a distinct Central Yunnan ancestry connected to early Austroasiatic-speaking groups. The discovery establishes Yunnan as a key region for understanding the ancient genetic history of East and Southeast Asia. The detailed analysis of 127 human genomes from southwestern China is published in a study in the journal Science.

Xingyi_EN: A Genetic Link to a Mysterious Past

According to the study, radiocarbon dating indicates Xingyi_EN lived around 7,100 years ago, and isotope analysis suggests she lived as a hunter-gatherer. Genetic sequencing revealed her ancestry from a deeply diverged human lineage—now named the Basal Asian Xingyi lineage. This lineage diverged from other modern human groups over 40,000 years ago and remained isolated for thousands of years without mixing with other populations. This "ghost" lineage does not match DNA from Neanderthals or Denisovans but appears to have later contributed to the ancestry of some modern Tibetans. Xingyi_EN represents the first physical evidence of this previously unknown population.

Yunnan's significance as a reservoir of deep human diversity

Most of the skeletons that the researchers sampled were dated to between 1,400 and 7,150 years ago and came from Yunnan province, which today has the highest ethnic and linguistic diversity in all of China. "Ancient humans that lived in this region may be key to addressing several remaining questions on the prehistoric populations of East and Southeast Asia," the researchers wrote in the study.
Those unanswered questions include the origins of people who live on the Tibetan Plateau, as previous studies have shown that Tibetans have northern East Asian ancestry.



    Gadgets 360 Staff

    The resident bot. If you email me, a human will respond.
  • RFK Jr. is looking in the wrong place for autism’s cause

Let’s start with one unambiguous fact: more children are diagnosed with autism today than in the early 1990s. According to a sweeping 2000 analysis by the Centers for Disease Control and Prevention, a range of 2–7 per 1,000 US children, or roughly 0.5 percent, were diagnosed with autism in the 1990s. That figure has since risen to 1 in 35 kids, or roughly 3 percent.

The apparent rapid increase caught the attention of people like Robert F. Kennedy Jr., who assumed that something in the environment had to be changing to drive it. In 2005, Kennedy, a lawyer and environmental activist at the time, authored an infamous essay in Rolling Stone that primarily placed the blame for the increased prevalence of autism on vaccines. More recently, he has theorized that a mysterious toxin introduced in the late 1980s must be responsible.

Now, as the nation’s top health official leading the Department of Health and Human Services, Kennedy has declared autism an “epidemic.” And in April, he launched a massive federal effort to find the culprit for the rise in autism rates, calling for researchers to examine a range of suspects: chemicals, molds, vaccines, and perhaps even ultrasounds given to pregnant mothers. “Genes don’t cause epidemics. You need an environmental toxin,” Kennedy said in April when announcing his department’s new autism research project. He argued that too much money had been put into genetic research — “a dead end,” in his words — and said his project would be a correction, focusing on environmental causes. “That’s where we’re going to find an answer.” But according to many autism scientists I spoke to for this story, Kennedy is looking in exactly the wrong place.
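The prevalence figures convert straightforwardly between formats; a quick check (illustrative only) shows how "2–7 per 1,000" and "1 in 35" map onto the quoted percentages:

```python
# Sanity-check the prevalence figures quoted above.
def per_thousand_to_percent(rate):
    """Convert an X-per-1,000 rate to a percentage."""
    return rate / 10

def one_in_n_to_percent(n):
    """Convert a 1-in-N rate to a percentage."""
    return 100 / n

print(per_thousand_to_percent(2), per_thousand_to_percent(7))  # 0.2 to 0.7
print(round(one_in_n_to_percent(35), 1))                       # roughly 3
```

So 2–7 per 1,000 spans 0.2 to 0.7 percent (hence "roughly 0.5 percent"), and 1 in 35 is about 2.9 percent, which the article rounds to 3.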
Three takeaways from this story

Experts say the increase in US autism rates is mostly explained by the expanding definition of the condition, as well as more awareness and more screening for it.

Scientists have identified hundreds of genes that are associated with autism, building a convincing case that genetics is the most important driver of autism’s development — not, as Health Secretary Robert F. Kennedy Jr. has argued, a single environmental toxin.

Researchers fear Kennedy’s fixation on outside toxins could distract from genetic research that has facilitated the development of exciting new therapies that could help those with profound autism.

Autism is a complex disorder with a range of manifestations that has long defied simple explanations, and it’s unlikely that we will ever identify a single “cause” of autism. But scientists have learned a lot in the past 50 years, including identifying some of the most important risk factors. They are not, as Kennedy suggests, out in our environment. They are written into our genetics. What appeared to be a massive increase in autism was actually a byproduct of better screening and more awareness.

“The way the HHS secretary has been talking about his plans, his goals, he starts out with this basic assumption that nothing worthwhile has been done,” said Helen Tager-Flusberg, a psychologist at Boston University who has worked with and studied children with autism for years. “Genes play a significant role. We know now that autism runs in families… There is no single underlying factor. Looking for that holy grail is not the best approach.”

Doctors who treat children with autism often talk about how they wish they could provide easy answers to the families. The answers being uncovered through genetics research may not be simple per se, but they are answers supported by science. Kennedy is muddying the story, pledging to find a silver-bullet answer where likely none exists.
It’s a false promise — one that could cause more anxiety and confusion for the very families Kennedy says he wants to help.

Robert F. Kennedy Jr. speaks during a news conference at the Department of Health and Human Services in mid-April to discuss the agency’s efforts to determine the cause of autism. Alex Wong/Getty Images

The autism “epidemic” that wasn’t

Autism was first described in 1911, and for many decades, researchers and clinicians mistook the social challenges and language development difficulties common among those with the condition for a psychological issue. Some child therapists even blamed the condition on bad parenting. But in 1977, a study discovered that identical twins, who share all of their DNA, were much more likely to both be autistic than fraternal twins, who share no more DNA than ordinary siblings. It marked a major breakthrough in autism research and pushed scientists to begin coalescing around a different theory: There was a biological factor.

At the time, this was just a theory — scientists lacked the technology to prove those suspicions at the genetic level. And clinicians were also still trying to work out an even more fundamental question: What exactly was autism? For a long time, the criteria for diagnosing a person with autism were strictly based on speech development. But clinicians were increasingly observing children who could acquire basic language skills but still struggled with social communication — things like misunderstanding nonverbal cues or taking figurative language literally. Psychologists gradually broadened their definition of autism from a strict and narrow focus on language, culminating in the 2013 criteria that included a wide range of social and emotional symptoms with three subtypes — the autism spectrum disorder we’re familiar with today. Along the way, autism evolved from a niche diagnosis for the severely impaired to something that encompassed far more children.
It makes sense, then, that as the broad criteria for autism expanded, more and more children would meet them, and autism rates would rise. That’s precisely what happened. And it means that the “epidemic” that Kennedy and other activists have been fixated on is mostly a diagnostic mirage.

Historical autism data is spotty and subject to these same historical biases, but if you look at the prevalence of profound autism alone — those who need the highest levels of support — a clearer picture emerges. In the ’80s and ’90s, low-support-needs individuals would have been less likely to receive an autism diagnosis, given the more restrictive criteria and less overall awareness of the disorder, meaning that people with severe autism likely represented most of the roughly 0.5 percent of children diagnosed with autism in the 1990s. By 2025, when about 3 percent of children are being diagnosed with autism, about one in four of those diagnosed are considered to have high-support-needs autism, the most severe manifestation of the condition. That would equal about 0.8 percent of all US children — a fairly marginal increase from autism rates 30 years ago.

Or look at it another way: In 2000, as many as 60 percent of the people being diagnosed with autism had an intellectual disability, one of the best indicators of high-support-needs autism. In 2022, that percentage was less than 40 percent.

As a recently published CDC report on autism prevalence among young children concluded, the increase in autism rates can largely be accounted for by stronger surveillance and more awareness among providers and parents, rather than a novel toxin or some other external factor driving an increase in cases. Other known risk factors — like more people now having babies later in life, given that parental age is linked to a higher likelihood of autism — are more likely to be a factor than anything Kennedy is pointing at, experts say.
“It’s very clear it’s not going to be one environmental toxin,” said Alison Singer, founder of the Autism Science Foundation and parent of a child with profound autism. “If there were a smoking gun, I think they would have found it.”

While Kennedy has fixated on vaccines and environmental influences, scientists have gained more precision in mapping human genetics and identifying the biological mechanisms that appear to be a primary cause of autism. And that not only helps us understand why autism develops, but potentially puts long-elusive therapies within reach.

It began with an accident in the 1990s. Steven Scherer, now director of the Center for Applied Genomics at the Hospital for Sick Children in Toronto, began his career in the late 1980s trying to identify the gene that caused cystic fibrosis — in collaboration with Francis Collins, who went on to lead the Human Genome Project that successfully sequenced all of the DNA in the human genome in the early 2000s. Scherer and Collins’s teams focused on chromosome 7, identified as a likely target by the primitive genetic research available at the time — a coincidence that would reorient Scherer’s career just a few years later, putting him on the trail of autism’s genetic roots. After four years, the researchers concluded that one gene within chromosome 7 caused cystic fibrosis.

Soon after Scherer helped crack the code on cystic fibrosis in the mid-1990s, two parents from California called him: He was the world’s leading expert on chromosome 7, and recent tests had revealed that their children with autism had a problem within that particular chromosome. That very same week, Scherer says, he read the findings of a study by a group at Oxford University, which had looked at the chromosomes of families with two or more kids with autism. They, too, had identified problems within chromosome 7. “So I said, ‘Okay, we’re going to work on autism,’” Scherer told me.
He helped coordinate a global research project, uniting his Canadian lab with the Oxford team and groups in the US to run a database that became the Autism Genome Project, still the world’s largest repository of genetic information on people with autism. They had a starting point — one chromosome — but a given chromosome contains hundreds of genes. And humans have, of course, 45 other chromosomes, any of which conceivably might play a role. So over the years, they collected DNA samples from thousands upon thousands of people with autism, sequenced their genes, and then searched for patterns. If the same gene is mutated or missing across a high percentage of autistic people, it goes on the list as potentially associated with the condition.

Scientists discovered that autism has not one genetic factor but many — further evidence that this is a condition of complex origin, in which multiple variables likely play a role in its development, rather than one caused by a single genetic error like sickle cell anemia. Here is one way to think about how far we have come: Joseph Buxbaum, the director of the Seaver Autism Center for Research and Treatment at the Icahn School of Medicine at Mount Sinai in New York, entered autism genetics research 35 years ago. He recalls scientists being hopeful that they might identify a half dozen or so genes linked to autism. They have now found 500 genes — and Buxbaum told me he believes they might find a thousand before they are through.

These genetic factors continue to prove their value in predicting the onset of autism: Scherer pointed to one recent study in which researchers identified people who all shared a mutation in the SHANK3 gene, one of the first to be associated with autism, but who were otherwise unalike: They were not related and came from different demographic backgrounds.
Nevertheless, they had all been diagnosed with autism.

Researchers analyze the brain activity of a 14-year-old boy with autism as part of a University of California San Francisco study that involves intensive brain imaging of kids and their parents who have a rare chromosome disruption connected to autism. The study, the Simons Variation in Individuals Project, is a genetics-first approach to studying autism spectrum and related neurodevelopmental disorders. Michael Macor/San Francisco Chronicle via The Associated Press

Precisely how much genetics contributes to the development of autism remains the subject of ongoing study. By analyzing millions of children with autism and their parents for patterns in diagnoses, multiple studies have attributed about 80 percent of a person’s risk of developing autism to their inherited genetic factors. But of course 80 percent is not 100 percent. We don’t yet have the full picture of how or why autism develops. Among identical twins, for example, studies have found that in most cases, if one twin has high-support-needs autism, the other does as well, affirming the genetic effect. But there are consistently a small minority of cases — 5 to 10 percent of twin pairs, Scherer told me — in which one twin has relatively low support needs while the other requires a high degree of support for their autism.

Kennedy is not wholly incorrect to look at environmental factors — researchers theorize that autism may be the result of a complex interaction between a person’s genetics and something they experience in utero. Scientists in autism research are exploring the possible influence when, for example, a person’s mother develops maternal diabetes, high blood sugar that persists throughout pregnancy.
And yet even if these other factors do play some role, the researchers I spoke to agree that genetics is, based on what we know now, far and away the most important driver. “We need to figure out how other types of genetics and also environmental factors affect autism’s development,” Scherer said. “There could be environmental changes…involved in some people, but it’s going to be based on their genetics and the pathways that lead them to be susceptible.”

While the precise contours of the Health Department’s new autism research project are still taking shape, Kennedy has said that researchers at the National Institutes of Health will collect data from federal programs such as Medicare and Medicaid and somehow use that information to identify possible environmental exposures that lead to autism. He initially pledged results by September, a timeline that, as outside experts pointed out, may be too fast to allow for a thorough and thoughtful review of the research literature. Kennedy has since backed off that deadline, promising some initial findings in the fall with more to come next year.

RFK Jr.’s autism commission research risks the accessibility of groundbreaking autism treatments

If Kennedy were serious about moving autism science forward, he would be talking more about genetics, not dismissing it. That’s because genetics is where all of the exciting drug development is currently happening. A biotech firm called Jaguar Gene Therapy has received FDA approval to conduct the first clinical trial of a gene therapy for autism, focused on SHANK3. The treatment, developed in part by one of Buxbaum’s colleagues, is a one-time injection that would replace a mutated or missing SHANK3 gene with a functional one.
The hope is that the therapy will improve speech and other symptoms among people with high-needs autism who have also been diagnosed with a rare chromosomal deletion disorder called Phelan-McDermid syndrome; many people with this condition also have autism spectrum disorder. The trial will begin this year with a few infant patients, 2 years old and younger, who have been diagnosed with autism. Jaguar eventually aims to test the therapy on adults over 18 with autism. Patients are supposed to start enrolling this year in the trial, which is focused on first establishing the treatment’s safety; if it proves safe, another round of trials would begin to rigorously evaluate its effectiveness.

“This is the stuff that three or four years ago sounded like science fiction,” Singer said. “The conversation has really changed from Is this possible? to What are the best methods to do it? And that’s based on genetics.”

Researchers at Mount Sinai have also experimented with delivering lithium to patients to see if it improves their SHANK3 function. Other gene therapies targeting other genes are in earlier stages of development. Some investigators are experimenting with CRISPR, the revolutionary gene-editing platform, to target the problematic genes that correspond to the onset of autism.

But these scientists fear that their work could be slowed by Kennedy’s insistence on hunting for environmental toxins if federal dollars are instead shifted into his new project. They are already trying to subsist amid deep budget cuts across the many funding streams that support the institutions where they work. “Now we have this massive disruption where instead of doing really key experiments, people are worrying about paying their bills and laying off their staff and things,” Scherer said. “It’s horrible.”

For the families of people with high-needs autism, Kennedy’s crusade has stirred conflicting emotions.
Alison Singer, the leader of the Autism Science Foundation, is also the parent of a child with profound autism. When I spoke with her, I was struck by the bind that Kennedy’s rhetoric has put people like her and her family in. Singer told me profound autism has not received enough federal support in the past, as more emphasis was placed on the low-support-needs individuals included in the expanding definitions of the disorder, and so she appreciates Kennedy giving voice to those families. She believes that he is sincerely empathetic toward their predicament and their feeling that the mainstream discussion about autism has for too long ignored their experiences in favor of patients with lower support needs. But she worries that his obsession with environmental factors will stymie the research that could yield breakthroughs for people like her child.

“He feels for those families and genuinely wants to help them,” Singer said. “The problem is he is a data denier. You can’t be so entrenched in your beliefs that you can’t see the data right in front of you. That’s not science.”
    #rfk #looking #wrong #place #autisms
    RFK Jr. is looking in the wrong place for autism’s cause
    Let’s start with one unambiguous fact: More children are diagnosed with autism today than in the early 1990s. According to a sweeping 2000 analysis by the Centers for Disease Control and Prevention, a range of 2–7 per 1,000, or roughly 0.5 percent of US children, were diagnosed with autism in the 1990s. That figure has risen to 1 in 35 kids, or roughly 3 percent.The apparent rapid increase caught the attention of people like Robert F. Kennedy Jr., who assumed that something had to be changing in the environment to drive it. In 2005, Kennedy, a lawyer and environmental activist at the time, authored an infamous essay in Rolling Stone that primarily placed the blame for the increased prevalence of autism on vaccines.More recently, he has theorized that a mysterious toxin introduced in the late 1980s must be responsible. Now, as the nation’s top health official leading the Department of Health and Human Services, Kennedy has declared autism an “epidemic.” And, in April, he launched a massive federal effort to find the culprit for the rise in autism rates, calling for researchers to examine a range of suspects: chemicals, molds, vaccines, and perhaps even ultrasounds given to pregnant mothers. “Genes don’t cause epidemics. You need an environmental toxin,” Kennedy said in April when announcing his department’s new autism research project. He argued that too much money had been put into genetic research — “a dead end,” in his words — and his project would be a correction to focus on environmental causes. “That’s where we’re going to find an answer.”But according to many autism scientists I spoke to for this story, Kennedy is looking in exactly the wrong place. 
Three takeaways from this storyExperts say the increase in US autism rates is mostly explained by the expanding definitions of the condition, as well as more awareness and more screening for it.Scientists have identified hundreds of genes that are associated with autism, building a convincing case that genetics are the most important driver of autism’s development — not, as Health Secretary Robert F. Kennedy Jr. has argued, a single environmental toxin.Researchers fear Kennedy’s fixation on outside toxins could distract from genetic research that has facilitated the development of exciting new therapies that could help those with profound autism.Autism is a complex disorder with a range of manifestations that has long defied simple explanations, and it’s unlikely that we will ever identify a single “cause” of autism.But scientists have learned a lot in the past 50 years, including identifying some of the most important risk factors. They are not, as Kennedy suggests, out in our environment. They are written into our genetics. What appeared to be a massive increase in autism was actually a byproduct of better screening and more awareness. “The way the HHS secretary has been walking about his plans, his goals, he starts out with this basic assumption that nothing worthwhile has been done,” Helen Tager-Flusberg, a psychologist at Boston University who has worked with and studied children with autism for years, said. “Genes play a significant role. We know now that autism runs in families… There is no single underlying factor. Looking for that holy grail is not the best approach.”Doctors who treat children with autism often talk about how they wish they could provide easy answers to the families. The answers being uncovered through genetics research may not be simple per se, but they are answers supported by science.Kennedy is muddying the story, pledging to find a silver-bullet answer where likely none exists. 
It’s a false promise — one that could cause more anxiety and confusion for the very families Kennedy says he wants to help. Robert F. Kennedy Jr. speaks during a news conference at the Department of Health and Human Services in mid-April to discuss this agency’s efforts to determine the cause of autism. Alex Wong/Getty ImagesThe autism “epidemic” that wasn’tAutism was first described in 1911, and for many decades, researchers and clinicians confused the social challenges and language development difficulties common among those with the condition for a psychological issue. Some child therapists even blamed the condition on bad parenting. But in 1977, a study discovered that identical twins, who share all of their DNA, were much more likely to both be autistic than fraternal twins, who share no more DNA than ordinary siblings. It marked a major breakthrough in autism research, and pushed scientists to begin coalescing around a different theory: There was a biological factor.At the time, this was just a theory — scientists lacked the technology to prove those suspicions at the genetic level. And clinicians were also still trying to work out an even more fundamental question: What exactly was autism? For a long time, the criteria for diagnosing a person with autism was strictly based on speech development. But clinicians were increasingly observing children who could acquire basic language skills but still struggled with social communication — things like misunderstanding nonverbal cues or taking figurative language literally. Psychologists gradually broadened their definition of autism from a strict and narrow focus on language, culminating in a 2013 criteria that included a wide range of social and emotional symptoms with three subtypes — the autism spectrum disorder we’re familiar with today.Along the way, autism had evolved from a niche diagnosis for the severely impaired to something that encompassed far more children. 
It makes sense then, that as the broad criteria for autism expanded, more and more children would meet it, and autism rates would rise. That’s precisely what happened. And it means that the “epidemic” that Kennedy and other activists have been fixated on is mostly a diagnostic mirage. Historical autism data is spotty and subject to these same historical biases, but if you look at the prevalence of profound autism alone — those who need the highest levels of support — a clearer picture emerges.In the ’80s and ’90s, low-support needs individuals would have been less likely to receive an autism diagnosis given the more restrictive criteria and less overall awareness of the disorder, meaning that people with severe autism likely represented most of the roughly 0.5 percent of children diagnosed with autism in the 1990s.By 2025, when about 3 percent of children are being diagnosed with autism, about one in four of those diagnosed are considered to have high-support needs autism, those with most severe manifestation of the condition. That would equal about 0.8 percent of all US children — which would be a fairly marginal increase from autism rates 30 years ago. Or look at it another way: In 2000, as many as 60 percent of the people being diagnosed with autism had an intellectual disability, one of the best indicators of high-support needs autism. In 2022, that percentage was less than 40 percent.As a recently published CDC report on autism prevalence among young children concluded, the increase in autism rates can largely be accounted for by stronger surveillance and more awareness among providers and parents, rather than a novel toxin or some other external factor driving an increase in cases.Other known risk factors — like more people now having babies later in their life, given that parental age is linked to a higher likelihood of autism — are more likely to be a factor than anything Kennedy is pointing at, experts say. 
“It’s very clear it’s not going to be one environmental toxin,” said Alison Singer, founder of the Autism Science Foundation and parent of a child with profound autism. “If there were a smoking gun, I think they would have found it.”While Kennedy has fixated on vaccines and environmental influences, scientists have gained more precision in mapping human genetics and identifying the biological mechanisms that appear to be a primary cause of autism. And that not only helps us understand why autism develops, but potentially puts long-elusive therapies within reach. It began with an accident in the 1990s. Steven Scherer, now director of the Center for Applied Genomics at the Hospital for Sick Children in Toronto, began his career in the late 1980s trying to identify the gene that caused cystic fibrosis — in collaboration with Francis Collins, who went on to lead the Human Genome Project that successfully sequenced all of the DNA in the human genome in the early 2000s. Scherer and Collins’s teams focused on chromosome 7, identified as a likely target by the primitive genetic research available at the time, a coincidence that would reorient Scherer’s career just a few years later, putting him on the trail of autism’s genetic roots.After four years, the researchers concluded that one gene within chromosome 7 caused cystic fibrosis. Soon after Scherer helped crack the code on cystic fibrosis in the mid-1990s, two parents from California called him: He was the world’s leading expert on chromosome 7, and recent tests had revealed that their children with autism had a problem within that particular chromosome.That very same week, Scherer says, he read the findings of a study by a group at Oxford University, which had looked at the chromosomes of families with two or more kids with autism. They, too, had identified problems within chromosome 7.“So I said, ‘Okay, we’re going to work on autism,’” Scherer told me. 
He helped coordinate a global research project, uniting his Canadian lab with the Oxford team and groups in the US to run a database that became the Autism Genome Project, still the world’s largest repository of genetic information of people with autism.They had a starting point — one chromosome — but a given chromosome contains hundreds of genes. And humans have, of course, 45 other chromosomes, any of which conceivably might play a role. So over the years, they collected DNA samples from thousands upon thousands of people with autism, sequenced their genes, and then searched for patterns. If the same gene is mutated or missing across a high percentage of autistic people, it goes on the list as potentially associated with the condition. Scientists discovered that autism has not one genetic factor, but many — further evidence that this is a condition of complex origin, in which multiple variables likely play a role in its development, rather than one caused by a single genetic error like sickle-cell anemia.Here is one way to think about how far we have come: Joseph Buxbaum, the director of the Seaver Autism Center for Research and Treatment at the Icahn School of Medicine at Mount Sinai in New York, entered autism genetics research 35 years ago. He recalls scientists being hopeful that they might identify a half dozen or so genes linked to autism.They have now found 500 genes — and Buxbaum told me he believed they might find a thousand before they are through. These genetic factors continue to prove their value in predicting the onset of autism: Scherer pointed to one recent study in which the researchers identified people who all shared a mutation in the SHANK3 gene, one of the first to be associated with autism, but who were otherwise unalike: They were not related and came from different demographic backgrounds. 
Nevertheless, they had all been diagnosed with autism.Researchers analyze the brain activity of a 14-year-old boy with autism as part of a University of California San Francisco study that involves intensive brain imaging of kids and their parents who have a rare chromosome disruption connected to autism. The study, the Simons Variation in Individuals Project, is a genetics-first approach to studying autism spectrum and related neurodevelopmental disorders. Michael Macor/San Francisco Chronicle via The Associated PressPrecisely how much genetics contributes to the development of autism remains the subject of ongoing study. By analyzing millions of children with autism and their parents for patterns in diagnoses, multiple studies have attributed about 80 percent of a person’s risk of developing autism to their inherited genetic factors. But of course 80 percent is not 100 percent. We don’t yet have the full picture of how or why autism develops. Among identical twins, for example, studies have found that in most cases, if one twin has high-support needs autism, the other does as well, affirming the genetic effect. But there are consistently a small minority of cases — 5 and 10 percent of twin pairs, Scherer told me — in which one twin has relatively low-support needs while the one requires a a high degree of support for their autism.Kennedy is not wholly incorrect to look at environmental factors — researchers theorize that autism may be the result of a complex interaction between a person’s genetics and something they experience in utero. Scientists in autism research are exploring the possible influence when, for example, a person’s mother develops maternal diabetes, high blood sugar that persists throughout pregnancy. 
    WWW.VOX.COM
    RFK Jr. is looking in the wrong place for autism’s cause
    Let’s start with one unambiguous fact: More children are diagnosed with autism today than in the early 1990s. According to a sweeping 2000 analysis by the Centers for Disease Control and Prevention, between 2 and 7 per 1,000 US children (roughly 0.5 percent) were diagnosed with autism in the 1990s. That figure has since risen to 1 in 35 kids, or roughly 3 percent.

The apparent rapid increase caught the attention of people like Robert F. Kennedy Jr., who assumed that something had to be changing in the environment to drive it. In 2005, Kennedy, a lawyer and environmental activist at the time, authored an infamous essay in Rolling Stone that primarily placed the blame for the increased prevalence of autism on vaccines. (The article was retracted in 2011 as more studies debunked the vaccine-autism connection.) More recently, he has theorized that a mysterious toxin introduced in the late 1980s must be responsible.

Now, as the nation’s top health official leading the Department of Health and Human Services, Kennedy has declared autism an “epidemic.” And, in April, he launched a massive federal effort to find the culprit for the rise in autism rates, calling for researchers to examine a range of suspects: chemicals, molds, vaccines, and perhaps even ultrasounds given to pregnant mothers.

“Genes don’t cause epidemics. You need an environmental toxin,” Kennedy said in April when announcing his department’s new autism research project. He argued that too much money had been put into genetic research — “a dead end,” in his words — and that his project would be a correction, refocusing on environmental causes. “That’s where we’re going to find an answer.”

But according to many autism scientists I spoke to for this story, Kennedy is looking in exactly the wrong place. 
Three takeaways from this story

• Experts say the increase in US autism rates is mostly explained by the expanding definitions of the condition, as well as more awareness and more screening for it.
• Scientists have identified hundreds of genes that are associated with autism, building a convincing case that genetics are the most important driver of autism’s development — not, as Health Secretary Robert F. Kennedy Jr. has argued, a single environmental toxin.
• Researchers fear Kennedy’s fixation on outside toxins could distract from genetic research that has facilitated the development of exciting new therapies that could help those with profound autism.

Autism is a complex disorder with a range of manifestations that has long defied simple explanations, and it’s unlikely that we will ever identify a single “cause” of autism. But scientists have learned a lot in the past 50 years, including identifying some of the most important risk factors. They are not, as Kennedy suggests, out in our environment. They are written into our genetics. What appeared to be a massive increase in autism was actually a byproduct of better screening and more awareness.

“The way the HHS secretary has been talking about his plans, his goals, he starts out with this basic assumption that nothing worthwhile has been done,” said Helen Tager-Flusberg, a psychologist at Boston University who has worked with and studied children with autism for years. “Genes play a significant role. We know now that autism runs in families… There is no single underlying factor. Looking for that holy grail is not the best approach.”

Doctors who treat children with autism often talk about how they wish they could provide easy answers to the families. The answers being uncovered through genetics research may not be simple per se, but they are answers supported by science. Kennedy is muddying the story, pledging to find a silver-bullet answer where likely none exists. 
It’s a false promise — one that could cause more anxiety and confusion for the very families Kennedy says he wants to help.

Robert F. Kennedy Jr. speaks during a news conference at the Department of Health and Human Services in mid-April to discuss the agency’s efforts to determine the cause of autism. Alex Wong/Getty Images

The autism “epidemic” that wasn’t

Autism was first described in 1911, and for many decades, researchers and clinicians mistook the social challenges and language-development difficulties common among those with the condition for a psychological issue. Some child therapists even blamed the condition on bad parenting. But in 1977, a study discovered that identical twins, who share all of their DNA, were much more likely to both be autistic than fraternal twins, who share no more DNA than ordinary siblings. It marked a major breakthrough in autism research and pushed scientists to begin coalescing around a different theory: There was a biological factor.

At the time, this was just a theory — scientists lacked the technology to prove those suspicions at the genetic level. And clinicians were also still trying to work out an even more fundamental question: What exactly was autism? For a long time, the criteria for diagnosing a person with autism were strictly based on speech development. But clinicians were increasingly observing children who could acquire basic language skills but still struggled with social communication — things like misunderstanding nonverbal cues or taking figurative language literally. Psychologists gradually broadened their definition of autism from a strict and narrow focus on language, culminating in 2013 criteria that included a wide range of social and emotional symptoms with three subtypes — the autism spectrum disorder we’re familiar with today.

Along the way, autism had evolved from a niche diagnosis for the severely impaired to something that encompassed far more children. 
It makes sense, then, that as the broad criteria for autism expanded, more and more children would meet it, and autism rates would rise. That’s precisely what happened. And it means that the “epidemic” that Kennedy and other activists have been fixated on is mostly a diagnostic mirage.

Historical autism data is spotty and subject to these same historical biases, but if you look at the prevalence of profound autism alone — those who need the highest levels of support — a clearer picture emerges. (There is an ongoing debate in the autism community about whether to use the terminology of “profound autism” or “high support needs” for those who have the most severe form of the condition.)

In the ’80s and ’90s, low-support needs individuals would have been less likely to receive an autism diagnosis, given the more restrictive criteria and less overall awareness of the disorder, meaning that people with severe autism likely represented most of the roughly 0.5 percent of children diagnosed with autism in the 1990s. (One large analysis from Atlanta examining data from 1996 found that 68 percent of kids ages 3 to 10 diagnosed with autism had an IQ below 70, the typical cutoff for intellectual disability.)

By 2025, when about 3 percent of children are being diagnosed with autism, about one in four of those diagnosed are considered to have high-support needs autism, the most severe manifestation of the condition. That would equal about 0.8 percent of all US children — a fairly marginal increase from autism rates 30 years ago. Or look at it another way: In 2000, as many as 60 percent of the people being diagnosed with autism had an intellectual disability, one of the best indicators of high-support needs autism. 
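Using the article’s rounded figures, the back-of-envelope arithmetic above can be checked in a few lines:

```python
# Quick check of the prevalence arithmetic, using the article's rounded figures.
prevalence_1990s = 0.005      # ~0.5% of US children diagnosed in the 1990s,
                              # most of whom likely had severe autism
prevalence_2025 = 0.03        # ~3% of US children diagnosed by 2025
profound_share_2025 = 0.25    # ~1 in 4 of today's diagnoses are high-support

profound_2025 = prevalence_2025 * profound_share_2025
print(f"High-support autism today: {profound_2025:.2%} of US children")  # 0.75%
print(f"Change vs. the 1990s baseline: {profound_2025 / prevalence_1990s:.1f}x")
```

That works out to roughly the article’s 0.8 percent figure: about a 1.5x rise over 30 years, rather than the 6x jump the headline prevalence numbers alone would suggest.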
In 2022, that percentage was less than 40 percent.

As a recently published CDC report on autism prevalence among young children concluded, the increase in autism rates can largely be accounted for by stronger surveillance and more awareness among providers and parents, rather than a novel toxin or some other external factor driving an increase in cases. Other known risk factors — like more people now having babies later in life, given that parental age is linked to a higher likelihood of autism — are more likely to be a factor than anything Kennedy is pointing at, experts say.

“It’s very clear it’s not going to be one environmental toxin,” said Alison Singer, founder of the Autism Science Foundation and parent of a child with profound autism. “If there were a smoking gun, I think they would have found it.”

While Kennedy has fixated on vaccines and environmental influences, scientists have gained more precision in mapping human genetics and identifying the biological mechanisms that appear to be a primary cause of autism. And that not only helps us understand why autism develops, but potentially puts long-elusive therapies within reach.

It began with an accident in the 1990s. Steven Scherer, now director of the Centre for Applied Genomics at the Hospital for Sick Children in Toronto, began his career in the late 1980s trying to identify the gene that caused cystic fibrosis — in collaboration with Francis Collins, who went on to lead the Human Genome Project that successfully sequenced all of the DNA in the human genome in the early 2000s. Scherer and Collins’s teams focused on chromosome 7, identified as a likely target by the primitive genetic research available at the time, a coincidence that would reorient Scherer’s career just a few years later, putting him on the trail of autism’s genetic roots. After four years, the researchers concluded that one gene within chromosome 7 caused cystic fibrosis. 
Soon after Scherer helped crack the code on cystic fibrosis in the mid-1990s, two parents from California called him: He was the world’s leading expert on chromosome 7, and recent tests had revealed that their children with autism had a problem within that particular chromosome. That very same week, Scherer says, he read the findings of a study by a group at Oxford University, which had looked at the chromosomes of families with two or more kids with autism. They, too, had identified problems within chromosome 7.

“So I said, ‘Okay, we’re going to work on autism,’” Scherer told me. He helped coordinate a global research project, uniting his Canadian lab with the Oxford team and groups in the US to run a database that became the Autism Genome Project, still the world’s largest repository of genetic information on people with autism.

They had a starting point — one chromosome — but a given chromosome contains hundreds of genes. And humans have, of course, 22 other pairs of chromosomes, any of which conceivably might play a role. So over the years, they collected DNA samples from thousands upon thousands of people with autism, sequenced their genes, and then searched for patterns. If the same gene is mutated or missing across a high percentage of autistic people, it goes on the list as potentially associated with the condition.

Scientists discovered that autism has not one genetic factor but many — further evidence that this is a condition of complex origin, in which multiple variables likely play a role in its development, rather than one caused by a single genetic error, like sickle-cell anemia.

Here is one way to think about how far we have come: Joseph Buxbaum, the director of the Seaver Autism Center for Research and Treatment at the Icahn School of Medicine at Mount Sinai in New York, entered autism genetics research 35 years ago. 
He recalls scientists being hopeful that they might identify a half dozen or so genes linked to autism. They have now found 500 genes — and Buxbaum told me he believed they might find a thousand before they are through.

These genetic factors continue to prove their value in predicting the onset of autism: Scherer pointed to one recent study in which the researchers identified people who all shared a mutation in the SHANK3 gene, one of the first to be associated with autism, but who were otherwise unalike: They were not related and came from different demographic backgrounds. Nevertheless, they had all been diagnosed with autism.

Researchers analyze the brain activity of a 14-year-old boy with autism as part of a University of California San Francisco study that involves intensive brain imaging of kids and their parents who have a rare chromosome disruption connected to autism. The study, the Simons Variation in Individuals Project, is a genetics-first approach to studying autism spectrum and related neurodevelopmental disorders. Michael Macor/San Francisco Chronicle via The Associated Press

Precisely how much genetics contributes to the development of autism remains the subject of ongoing study. By analyzing millions of children with autism and their parents for patterns in diagnoses, multiple studies have attributed about 80 percent of a person’s risk of developing autism to their inherited genetic factors. But of course 80 percent is not 100 percent. We don’t yet have the full picture of how or why autism develops.

Among identical twins, for example, studies have found that in most cases, if one twin has high-support needs autism, the other does as well, affirming the genetic effect. 
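To see how twin comparisons like these feed into heritability estimates, here is a rough sketch using Falconer’s classic approximation, h² ≈ 2(c_MZ − c_DZ). The concordance values below are hypothetical placeholders for illustration, not data from the studies cited in this article:

```python
# Falconer's approximation: heritability h^2 ~ 2 * (c_MZ - c_DZ), where c is
# the concordance (both twins affected) for identical (MZ) twins, who share
# ~100% of their DNA, vs. fraternal (DZ) twins, who share ~50%.
# These concordance values are illustrative placeholders only.
concordance_mz = 0.90   # hypothetical: identical-twin concordance
concordance_dz = 0.50   # hypothetical: fraternal-twin concordance

heritability = 2 * (concordance_mz - concordance_dz)
print(f"Rough heritability estimate: {heritability:.0%}")  # prints 80%
```

The intuition: the bigger the gap between identical-twin and fraternal-twin concordance, the larger the share of risk attributable to inherited genes.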
But there are consistently a small minority of cases — between 5 and 10 percent of twin pairs, Scherer told me — in which one twin has relatively low support needs while the other requires a high degree of support for their autism.

Kennedy is not wholly incorrect to look at environmental factors — researchers theorize that autism may be the result of a complex interaction between a person’s genetics and something they experience in utero. Scientists in autism research are exploring the possible influence of, for example, a mother developing maternal diabetes, high blood sugar that persists throughout pregnancy. And yet even if these other factors do play some role, the researchers I spoke to agree that genetics is, based on what we know now, far and away the most important driver.

“We need to figure out how other types of genetics and also environmental factors affect autism’s development,” Scherer said. “There could be environmental changes…involved in some people, but it’s going to be based on their genetics and the pathways that lead them to be susceptible.”

While the precise contours of the Health Department’s new autism research project are still taking shape, Kennedy has said that researchers at the National Institutes of Health will collect data from federal programs such as Medicare and Medicaid and somehow use that information to identify possible environmental exposures that lead to autism. He initially pledged results by September, a timeline that, as outside experts pointed out, may be too fast to allow for a thorough and thoughtful review of the research literature. Kennedy has since backed off that deadline, promising some initial findings in the fall but with more to come next year.

RFK Jr.’s autism commission research risks the accessibility of groundbreaking autism treatments

If Kennedy were serious about moving autism science forward, he would be talking more about genetics, not dismissing them. 
That’s because genetics is where all of the exciting drug development is currently happening.

A biotech firm called Jaguar Gene Therapy has received FDA approval to conduct the first clinical trial of a gene therapy for autism, focused on SHANK3. The treatment, developed in part by one of Buxbaum’s colleagues, is a one-time injection that would replace a mutated or missing SHANK3 gene with a functional one. The hope is that the therapy would improve speech and other symptoms among people with high-needs autism who have also been diagnosed with a rare chromosomal deletion disorder called Phelan-McDermid syndrome; many people with this condition also have autism spectrum disorder.

The trial will begin this year with a few young patients, 2 years old and younger, who have been diagnosed with autism; Jaguar aims to test the therapy on adults over 18 with autism in the future. The initial trial is focused on establishing the treatment’s safety; if it proves safe, another round of trials would start to rigorously evaluate its effectiveness.

“This is the stuff that three or four years ago sounded like science fiction,” Singer said. “The conversation has really changed from Is this possible? to What are the best methods to do it? And that’s based on genetics.”

Researchers at Mount Sinai have also experimented with delivering lithium to patients to see if it improves their SHANK3 function. Other gene therapies targeting other genes are in earlier stages of development. Some investigators are experimenting with CRISPR, the revolutionary gene-editing platform, to target the problematic genes that correspond to the onset of autism.

But these scientists fear that their work could be slowed by Kennedy’s insistence on hunting for environmental toxins if federal dollars are instead shifted into his new project. 
They are already trying to subsist amid deep budget cuts across the many funding streams that support the institutions where they work. “Now we have this massive disruption where instead of doing really key experiments, people are worrying about paying their bills and laying off their staff and things,” Scherer said. “It’s horrible.”

For the families of people with high-needs autism, Kennedy’s crusade has stirred conflicting emotions. Alison Singer, the leader of the Autism Science Foundation, is also the parent of a child with profound autism. When I spoke with her, I was struck by the bind that Kennedy’s rhetoric has put people like her and her family in. Singer told me profound autism has not received enough federal support in the past, as more emphasis was placed on the individuals with low support needs who were included in the expanding definitions of the disorder, and so she appreciates Kennedy giving voice to those families. She believes that he is sincerely empathetic toward their predicament and their feeling that the mainstream discussion about autism has for too long ignored their experiences in favor of patients with lower support needs. But she worries that his obsession with environmental factors will stymie the research that could yield breakthroughs for people like her child.

“He feels for those families and genuinely wants to help them,” Singer said. “The problem is he is a data denier. You can’t be so entrenched in your beliefs that you can’t see the data right in front of you. That’s not science.”
  • The Supercomputer Designed to Accelerate Nobel-Worthy Science

    Ready for a front-row seat to the next scientific revolution?
    That’s the idea behind Doudna — a groundbreaking supercomputer announced today at Lawrence Berkeley National Laboratory in Berkeley, California. The system represents a major national investment in advancing U.S. high-performance computing leadership, ensuring U.S. researchers have access to cutting-edge tools to address global challenges.
    “It will advance scientific discovery from chemistry to physics to biology and all powered by — unleashing this power — of artificial intelligence,” U.S. Energy Secretary Chris Wright said at today’s event.
    Also known as NERSC-10, Doudna is named for Nobel laureate and CRISPR pioneer Jennifer Doudna. The next-generation system announced today is designed not just for speed but for impact.
    Nobel laureate and CRISPR pioneer Jennifer Doudna speaking at today’s event in Berkeley, California. To her right, NVIDIA founder and CEO Jensen Huang and Paul Perez, senior vice president and senior technology fellow at Dell Technologies.
    Powered by Dell Technologies infrastructure with the NVIDIA Vera Rubin architecture, and set to launch in 2026, Doudna is tailored for real-time discovery across the U.S. Department of Energy’s most urgent scientific missions. It’s poised to catapult American researchers to the forefront of critical scientific breakthroughs, fostering innovation and securing the nation’s competitive edge in key technological fields.
    “I’m so proud that America continues to invest in this particular area,” said NVIDIA founder and CEO Jensen Huang. “It is the foundation of scientific discovery for our country. It is also the foundation for economic and technology leadership.”
    “It’s an incredible honor to be here,” Doudna said, adding she was “surprised and delighted” that a supercomputer would be named after her. “I think we’re standing at a really interesting moment in biology,” she added, with people with different skills coming together to address global issues.
    Designed to Accelerate Breakthroughs
    Unlike traditional systems that operate in silos, Doudna merges simulation, data and AI into a single seamless platform.
    “The Doudna supercomputer is designed to accelerate a broad set of scientific workflows,” said NERSC Director Sudip Dosanjh. “Doudna will be connected to DOE experimental and observational facilities through the Energy Sciences Network, allowing scientists to stream data seamlessly into the system from all parts of the country and to analyze it in near real time.”
    It’s engineered to empower over 11,000 researchers with almost instantaneous responsiveness and integrated workflows, helping scientists explore bigger questions and reach answers faster than ever.
    “We’re not just building a faster computer,” said Nick Wright, advanced technologies group lead and Doudna chief architect at NERSC. “We’re building a system that helps researchers think bigger and discover sooner.”
    Here’s what Wright expects Doudna to advance:

    Fusion energy: Breakthroughs in simulation that unlock clean fusion energy.
    Materials science: AI models that design new classes of superconducting materials.
    Drug discovery acceleration: Ultrarapid workflows that help biologists fold proteins fast enough to outpace a pandemic.
    Astronomy: Real-time processing of data from the Dark Energy Spectroscopic Instrument at Kitt Peak to help scientists map the universe.

    Doudna is expected to outperform its predecessor, Perlmutter, by more than 10x in scientific output, all while using just 2-3x the power.
    This translates to a 3-5x increase in performance per watt, a result of innovations in chip design, dynamic load balancing and system-level efficiencies.
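The performance-per-watt figure follows directly from the two numbers above, as a quick sanity check shows:

```python
# Sanity check: ~10x the scientific output at only ~2-3x the power draw
# implies the quoted 3-5x gain in performance per watt.
output_gain = 10.0
power_range = (2.0, 3.0)   # low and high ends of the power multiplier

perf_per_watt = [output_gain / p for p in power_range]
print(f"Perf/watt gain: {min(perf_per_watt):.1f}x to {max(perf_per_watt):.1f}x")
# 10/3 is about 3.3x and 10/2 is 5x, matching the quoted 3-5x range.
```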
    AI-Powered Discovery at Scale
    Doudna will power AI-driven breakthroughs across high-impact scientific fields nationwide. Highlights include:

    AI for protein design: David Baker, a 2024 Nobel laureate, used NERSC systems to support his work using AI to predict novel protein structures, addressing challenges across scientific disciplines.
    AI for fundamental physics: Researchers like Benjamin Nachman are using AI to “unfold” detector distortions in particle physics data and analyze proton data from electron-proton colliders.
    AI for materials science: A collaboration including Berkeley Lab and Meta created “Open Molecules 2025,” a massive dataset for using AI to accurately model complex molecular chemical reactions. Researchers involved also use NERSC for their AI models.

    Real-Time Science, Real-World Impact
    Doudna isn’t a standalone system. It’s an integral part of scientific workflows. DOE’s ESnet will stream data from telescopes, detectors and genome sequencers directly into the machine with low-latency, high-throughput NVIDIA Quantum-X800 InfiniBand networking.
    This critical data flow is prioritized by intelligent quality-of-service mechanisms, ensuring it stays fast and uninterrupted, from input to insight.
    This will make the system incredibly responsive. At the DIII-D National Fusion Facility, for example, data will stream control-room events directly into Doudna for rapid-response plasma modeling, so scientists can make adjustments in real time.
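The stream-and-analyze workflow described here can be illustrated in miniature with Python’s standard library. This is a toy sketch of the pattern only — the real pipeline runs over ESnet and InfiniBand at vastly larger scale:

```python
import queue
import threading

# Toy stream-and-analyze pattern: a "detector" thread pushes readings into a
# queue while an "analysis" thread consumes them in near real time, updating
# a running statistic per event instead of waiting for the full dataset.
readings = queue.Queue()
SENTINEL = None  # signals end of stream

def detector(n_events):
    for i in range(n_events):
        readings.put(float(i))  # stand-in for an instrument reading
    readings.put(SENTINEL)

results = []  # running mean after each event

def analysis():
    running_sum, count = 0.0, 0
    while True:
        value = readings.get()
        if value is SENTINEL:
            break
        running_sum += value
        count += 1
        results.append(running_sum / count)

producer = threading.Thread(target=detector, args=(5,))
consumer = threading.Thread(target=analysis)
producer.start(); consumer.start()
producer.join(); consumer.join()

print(f"Processed {len(results)} events; final running mean = {results[-1]}")
```

The key design point is that analysis overlaps with data arrival, which is what lets scientists adjust an experiment while it is still running.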
    “We used to think of the supercomputer as a passive participant in the corner,” Wright said. “Now it’s part of the entire workflow, connected to experiments, telescopes, detectors.”
    The Platform for What’s Next: Unlocking Quantum and HPC Workflows
    Doudna supports traditional HPC, cutting-edge AI, real-time streaming and even quantum workflows.
    The Mayall 4-Meter Telescope, which will be home to the Dark Energy Spectroscopic Instrument, seen at night at Kitt Peak National Observatory.
    This includes support for scalable quantum algorithm development and the codesign of future integrated quantum-HPC systems, using platforms like NVIDIA CUDA-Q.
    All of these workflows will run on the next-generation NVIDIA Vera Rubin platform, which will blend high-performance CPUs with coherent GPUs, meaning all processors can access and share data directly to support the most demanding scientific workloads.
    Researchers are already porting full pipelines using frameworks like PyTorch, the NVIDIA Holoscan software development kit, TensorFlow, NVIDIA cuDNN and NVIDIA CUDA-Q, all optimized for the system’s Rubin GPUs and NVIDIA NVLink architecture.
    Over 20 research teams are already porting full workflows to Doudna through the NERSC Science Acceleration Program, tackling everything from climate models to particle physics. This isn’t just about raw compute; it’s about discovery, integrated from idea to insight.
    Designed for Urgency
    Last year, AI-assisted science earned two Nobel Prizes. From climate research to pandemic response, the next breakthroughs won’t wait for better infrastructure.
    With deployment slated for 2026, Doudna is positioned to lead a new era of accelerated science. DOE facilities across the country, from Fermilab to the Joint Genome Institute, will rely on its capabilities to turn today’s questions into tomorrow’s breakthroughs.
    “This isn’t a system for one field,” Wright said. “It’s for discovery — across chemistry, physics and fields we haven’t imagined yet.”
    As Huang put it, Doudna is “a time machine for science.” It compresses years of discovery into days and gives the world’s toughest problems the power they’ve been waiting for.
    This post has been updated with comments from Thursday’s event at Lawrence Berkeley National Laboratory. 
    #supercomputer #designed #accelerate #nobelworthy #science
    The Supercomputer Designed to Accelerate Nobel-Worthy Science
    Ready for a front-row seat to the next scientific revolution? That’s the idea behind Doudna — a groundbreaking supercomputer announced today at Lawrence Berkeley National Laboratory in Berkeley, California. The system represents a major national investment in advancing U.S. high-performance computingleadership, ensuring U.S. researchers have access to cutting-edge tools to address global challenges. “It will advance scientific discovery from chemistry to physics to biology and all powered by — unleashing this power — of artificial intelligence,” U.S. Energy Secretary Chris Wrightsaid at today’s event. Also known as NERSC-10, Doudna is named for Nobel laureate and CRISPR pioneer Jennifer Doudna. The next-generation system announced today is designed not just for speed but for impact. Nobel laureate and CRISPR pioneer Jennifer Doudna speaking at today’s event in Berkeley, California. To her right, NVIDIA founder and CEO Jensen Huang and Paul Perez, senior vice president and senior technology fellow at Dell Technologies. Powered by Dell Technologies infrastructure with the NVIDIA Vera Rubin architecture, and set to launch in 2026, Doudna is tailored for real-time discovery across the U.S. Department of Energy’s most urgent scientific missions. It’s poised to catapult American researchers to the forefront of critical scientific breakthroughs, fostering innovation and securing the nation’s competitive edge in key technological fields. “I’m so proud that America continues to invest in this particular area,” said NVIDIA founder and CEO Jensen Huang. “It is the foundation of scientific discovery for our country. It is also the foundation for economic and technology leadership.” “It’s an incredible honor to be here,” Doudna said, adding she was “surprised and delighted” that a supercomputer would be named after her. “I think we’re standing at a really interesting moment in biology,” she added, with people with different skills coming together to address global issues. 
    BLOGS.NVIDIA.COM
    The Supercomputer Designed to Accelerate Nobel-Worthy Science
    Ready for a front-row seat to the next scientific revolution? That’s the idea behind Doudna — a groundbreaking supercomputer announced today at Lawrence Berkeley National Laboratory in Berkeley, California. The system represents a major national investment in advancing U.S. high-performance computing (HPC) leadership, ensuring U.S. researchers have access to cutting-edge tools to address global challenges. “It will advance scientific discovery from chemistry to physics to biology and all powered by — unleashing this power — of artificial intelligence,” U.S. Energy Secretary Chris Wright said at today’s event.
Also known as NERSC-10, Doudna is named for Nobel laureate and CRISPR pioneer Jennifer Doudna. The next-generation system announced today is designed not just for speed but for impact.
[Photo: Nobel laureate and CRISPR pioneer Jennifer Doudna speaking at today’s event in Berkeley, California. To her right, NVIDIA founder and CEO Jensen Huang and Paul Perez, senior vice president and senior technology fellow at Dell Technologies.]
Powered by Dell Technologies infrastructure with the NVIDIA Vera Rubin architecture, and set to launch in 2026, Doudna is tailored for real-time discovery across the U.S. Department of Energy’s most urgent scientific missions. It’s poised to catapult American researchers to the forefront of critical scientific breakthroughs, fostering innovation and securing the nation’s competitive edge in key technological fields.
“I’m so proud that America continues to invest in this particular area,” said NVIDIA founder and CEO Jensen Huang. “It is the foundation of scientific discovery for our country. It is also the foundation for economic and technology leadership.”
“It’s an incredible honor to be here,” Doudna said, adding she was “surprised and delighted” that a supercomputer would be named after her.
“I think we’re standing at a really interesting moment in biology,” she added, with people with different skills coming together to address global issues.
Designed to Accelerate Breakthroughs
Unlike traditional systems that operate in silos, Doudna merges simulation, data and AI into a single seamless platform. “The Doudna supercomputer is designed to accelerate a broad set of scientific workflows,” said NERSC Director Sudip Dosanjh. “Doudna will be connected to DOE experimental and observational facilities through the Energy Sciences Network (ESnet), allowing scientists to stream data seamlessly into the system from all parts of the country and to analyze it in near real time.”
It’s engineered to empower over 11,000 researchers with almost instantaneous responsiveness and integrated workflows, helping scientists explore bigger questions and reach answers faster than ever. “We’re not just building a faster computer,” said Nick Wright, advanced technologies group lead and Doudna chief architect at NERSC. “We’re building a system that helps researchers think bigger and discover sooner.”
Here’s what Wright expects Doudna to advance:
• Fusion energy: Breakthroughs in simulation that unlock clean fusion energy.
• Materials science: AI models that design new classes of superconducting materials.
• Drug discovery acceleration: Ultrarapid workflows that help biologists fold proteins fast enough to outpace a pandemic.
• Astronomy: Real-time processing of data from the Dark Energy Spectroscopic Instrument at Kitt Peak to help scientists map the universe.
Doudna is expected to outperform its predecessor, Perlmutter, by more than 10x in scientific output, all while using just 2-3x the power. This translates to a 3-5x increase in performance per watt, a result of innovations in chip design, dynamic load balancing and system-level efficiencies.
AI-Powered Discovery at Scale
Doudna will power AI-driven breakthroughs across high-impact scientific fields nationwide.
Highlights include:
• AI for protein design: David Baker, a 2024 Nobel laureate, used NERSC systems to support his work using AI to predict novel protein structures, addressing challenges across scientific disciplines.
• AI for fundamental physics: Researchers like Benjamin Nachman are using AI to “unfold” detector distortions in particle physics data and analyze proton data from electron-proton colliders.
• AI for materials science: A collaboration including Berkeley Lab and Meta created “Open Molecules 2025,” a massive dataset for using AI to accurately model complex molecular chemical reactions. Researchers involved also use NERSC for their AI models.
Real-Time Science, Real-World Impact
Doudna isn’t a standalone system. It’s an integral part of scientific workflows. DOE’s ESnet will stream data from telescopes, detectors and genome sequencers directly into the machine with low-latency, high-throughput NVIDIA Quantum-X800 InfiniBand networking. This critical data flow is prioritized by intelligent quality-of-service mechanisms, ensuring it stays fast and uninterrupted, from input to insight.
This will make the system incredibly responsive. At the DIII-D National Fusion Facility, for example, control-room events will stream data directly into Doudna for rapid-response plasma modeling, so scientists can make adjustments in real time. “We used to think of the supercomputer as a passive participant in the corner,” Wright said. “Now it’s part of the entire workflow, connected to experiments, telescopes, detectors.”
The Platform for What’s Next: Unlocking Quantum and HPC Workflows
Doudna supports traditional HPC, cutting-edge AI, real-time streaming and even quantum workflows.
[Photo: The Mayall 4-Meter Telescope, which will be home to the Dark Energy Spectroscopic Instrument, seen at night at Kitt Peak National Observatory.]
This includes support for scalable quantum algorithm development and the codesign of future integrated quantum-HPC systems, using platforms like NVIDIA CUDA-Q. All of these workflows will run on the next-generation NVIDIA Vera Rubin platform, which will blend high-performance CPUs with coherent GPUs, meaning all processors can access and share data directly to support the most demanding scientific workloads.
Researchers are already porting full pipelines using frameworks like PyTorch, the NVIDIA Holoscan software development kit, TensorFlow, NVIDIA cuDNN and NVIDIA CUDA-Q, all optimized for the system’s Rubin GPUs and NVIDIA NVLink architecture. Over 20 research teams are porting full workflows to Doudna through the NERSC Science Acceleration Program, tackling everything from climate models to particle physics. This isn’t just about raw compute; it’s about discovery, integrated from idea to insight.
Designed for Urgency
Last year, AI-assisted science earned two Nobel Prizes. From climate research to pandemic response, the next breakthroughs won’t wait for better infrastructure. With deployment slated for 2026, Doudna is positioned to lead a new era of accelerated science. DOE facilities across the country, from Fermilab to the Joint Genome Institute, will rely on its capabilities to turn today’s questions into tomorrow’s breakthroughs.
“This isn’t a system for one field,” Wright said. “It’s for discovery — across chemistry, physics and fields we haven’t imagined yet.” As Huang put it, Doudna is “a time machine for science.” It compresses years of discovery into days and gives the world’s toughest problems the power they’ve been waiting for.
This post has been updated with comments from Thursday’s event at Lawrence Berkeley National Laboratory.
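The efficiency figures quoted above (more than 10x Perlmutter's scientific output at roughly 2-3x the power) reduce to simple division. This short sketch is purely illustrative arithmetic, not NERSC methodology; it just checks that the implied range matches the stated 3-5x performance-per-watt claim:

```python
# Sanity-check the article's efficiency arithmetic:
# >10x the scientific output at only 2-3x the power draw.

def perf_per_watt_gain(output_gain: float, power_gain: float) -> float:
    """Relative performance per watt versus the predecessor system."""
    return output_gain / power_gain

# 10x output at 3x power gives the low end; 10x at 2x power the high end.
low = perf_per_watt_gain(10, 3)   # ~3.33x
high = perf_per_watt_gain(10, 2)  # 5.0x
print(f"{low:.2f}x to {high:.2f}x performance per watt")
```

With the ">10x" output figure taken at exactly 10, the bracket works out to roughly 3.3x-5x, consistent with the "3-5x" stated in the article.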
  • This Detailed Map of a Human Cell Could Help Us Understand How Cancer Develops

    It’s been more than two decades since scientists finished sequencing the human genome, providing a comprehensive map of human biology that has since accelerated progress in disease research and personalized medicine. Thanks to that endeavor, we know that each of us has about 20,000 protein-coding genes, which serve as blueprints for the diverse protein molecules that give shape to our cells and keep them functioning properly. Yet, we know relatively little about how those proteins are organized within cells and how they interact with each other, says Trey Ideker, a professor of medicine and bioengineering at the University of California San Diego. Without that knowledge, he says, trying to study and treat disease is “like trying to understand how to fix your car without the shop manual.”
Mapping the Human Cell
In a recent paper in the journal Nature, Ideker and his colleagues presented their latest attempt to fill this information gap: a fine-grained map of a human cell, showing the locations of more than 5,000 proteins and how they assemble into larger and larger structures. The researchers also created an interactive version of the map. It goes far beyond the simplified diagrams you may recall from high school biology class. Familiar objects like the nucleus appear at the highest level, but zooming in, you find the nucleoplasm, then the chromatin factors, then the transcription factor IID complex, which is home to five individual proteins better left nameless. This subcellular metropolis is unintelligible to non-specialists, but it offers a look at the extraordinary complexity within us all.
Surprising Cell Features
Even for specialists, there are some surprises. The team identified 275 protein assemblies, ranging in scale from large charismatic organelles like mitochondria, to smaller features like microtubules and ribosomes, down to the tiny protein complexes that constitute “the basic machinery” of the cell, as Ideker put it.
“Across all that,” he says, “about half of it was known, and about half of it, believe it or not, wasn't known.” In other words, 50 percent of the structures they found “just simply don't map to anything in the cell biology textbook.”
Multimodal Process for Cell Mapping
They achieved this level of detail by taking a “multimodal” approach. First, to figure out which molecules interact with each other, the researchers would line a tube with a particular protein, called the “bait” protein; then they would pour a blended mixture of other proteins through the tube to see what stuck, revealing which ones were neighbors.
Next, to get precise coordinates for the location of these proteins, they lit up individual molecules within a cell using glowing antibodies, the cellular defenders produced by the immune system to bind to and neutralize specific substances (often foreign invaders like viruses and bacteria, but in this case homegrown proteins). Once an antibody found its target, the illuminated protein could be visualized under a microscope and placed on the map.
Enhancing Cancer Research
There are many human cell types, and the one Ideker’s team chose for this study is called the U2OS cell. It’s commonly associated with pediatric bone tumors. Indeed, the researchers identified about 100 mutated proteins that are linked to this childhood cancer, enhancing our understanding of how the disease develops. Better yet, they located the assemblies those proteins belong to.
Typically, Ideker says, cancer research is focused on individual mutations, whereas it’s often more useful to think about the larger systems that cancer disrupts. Returning to the car analogy, he notes that a vehicle’s braking system can fail in various ways: You can tamper with the pedal, the calipers, the discs or the brake fluid, and all these mechanisms give the same outcome. Similarly, cancer can cause a biological system to malfunction in various ways, and Ideker argues that comprehensive cell maps provide an effective way to study those diverse mechanisms of disease.
“We've only understood the tip of the iceberg in terms of what gets mutated in cancer,” he says. “The problem is that we're not looking at the machines that actually matter, we're looking at the nuts and bolts.”
Mapping Cells for the Future
Beyond cancer, the researchers hope their map will serve as a model for scientists attempting to chart other kinds of cells. This map took more than three years to create, but technology and methodological improvements could speed up the process — as they did for genome sequencing throughout the late 20th century — allowing medical treatments to be tailored to a person’s unique protein profile. “We're going to have to turn Moore's law on this,” Ideker says, “to really scale it up and understand differences in cell biology […] between individuals.”
This article is not offering medical advice and should be used for informational purposes only.
Article Sources
Our writers at Discovermagazine.com use peer-reviewed studies and high-quality sources for our articles, and our editors review for scientific accuracy and editorial standards.
Cody Cottier is a contributing writer at Discover who loves exploring big questions about the universe and our home planet, the nature of consciousness, the ethical implications of science and more. He holds a bachelor's degree in journalism and media production from Washington State University.
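The zoom-in path the article describes (nucleus, then nucleoplasm, then chromatin factors, then the transcription factor IID complex) is essentially a tree of nested assemblies. The toy sketch below is an illustrative assumption about how such a hierarchy could be represented and queried, not the published data model; only the assembly names come from the article:

```python
# Toy nested-assembly hierarchy mirroring the article's zoom-in example.
# The tree layout and lookup helper are illustrative, not the real map.
CELL_MAP = {
    "nucleus": {
        "nucleoplasm": {
            "chromatin factors": {
                # leaf assembly; in the real map it holds ~5 proteins
                "transcription factor IID complex": {},
            }
        }
    }
}

def path_to(assembly, tree, trail=()):
    """Depth-first search for `assembly`; returns the chain of enclosing assemblies."""
    for name, children in tree.items():
        here = trail + (name,)
        if name == assembly:
            return here
        found = path_to(assembly, children, here)
        if found:
            return found
    return None  # assembly not present in this subtree

print(path_to("transcription factor IID complex", CELL_MAP))
# → ('nucleus', 'nucleoplasm', 'chromatin factors', 'transcription factor IID complex')
```

A full map of 275 assemblies and 5,000+ proteins would be the same structure at scale: each of the ~100 cancer-linked proteins the team flagged could then be traced up to the larger system it belongs to.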
    WWW.DISCOVERMAGAZINE.COM
  • 23andMe Found a Buyer for Our Genetic Data, and I’m Kind of Optimistic

    The last time we talked about 23andMe’s bankruptcy sale, we suggested you might want to delete your genetic data from the site, since we didn’t know who would end up owning it. Now, a buyer has been announced, and they’re promising to “prioritize the privacy, security and ethical use” of customer data and to keep offering the company’s services uninterrupted. This is all—probably—good news. I'm feeling cautiously optimistic, anyway.
Regeneron will be 23andMe’s new owner
23andMe’s new buyer, paying $256 million for the company’s assets, is Regeneron. Regeneron is a biotech company perhaps best known for developing an antibody treatment for COVID early in the pandemic. (Donald Trump was given a dose when he first came down with the virus.) That treatment never made it all the way to market, but the company does market other antibody- and protein-based treatments for conditions like Ebola virus, genetic disorders, and cancers. Regeneron’s website states that they “are shaping the next frontier of medicine with data-powered insights from the Regeneron Genetics Center® and pioneering genetic medicine platforms, enabling us to identify innovative targets and complementary approaches to potentially treat or cure diseases.”
That explains why they’re interested in 23andMe, since it provides a trove of genetic data. Many 23andMe users had also signed up to provide more of their personal medical information for research purposes (this was a separate thing that you would have had to opt in to provide). Regeneron says they plan to “continue all consumer genome services uninterrupted,” rather than shut down the company. Lemonaid Health, also owned by 23andMe, is not included in the sale.
What this means for your data and privacy
Importantly, Regeneron says they will respect the company’s privacy policy (“and applicable laws”), and the 23andMe press release also says that Regeneron will not be making any changes to the privacy policy. The sale, which still needs to be approved by a bankruptcy court, will also include a court-appointed “consumer privacy ombudsman” whose job is to make sure that everyone’s data is treated appropriately.
Regeneron says that they’re ready to work with this ombudsman and will detail all their privacy-related plans. While we don’t yet know what the future holds, this all has me tentatively optimistic. Yes, a corporation has your data and intends to profit from it; but that was also true of 23andMe in its heyday. The policies about privacy and consent that you agreed to when contributing your data will still be in effect.
The 23andMe community seems to be cautiously optimistic. In an r/23andme Reddit thread about the sale, one redditor, who identifies themselves as an academic biomedical researcher, says “I would rather […] have my data than an insurance provider or just random […].” Another redditor says, “So there really is not […] best case scenario here, there's just ‘wait and see’ and bad. And this is more of a ‘wait and see’ than a bad.” And another says “I know people side eye big pharma […] but this is a much better outcome than many other situations.”
    LIFEHACKER.COM
    23andMe Found a Buyer for Our Genetic Data, and I’m Kind of Optimistic
  • The case against summer

    Close your eyes and think of the word “summer.” What comes to mind? Is it long days at the beach, a drink in one hand and a book in the other, letting the sun fall on your face and the waves tickle your toes? Two weeks of vacation in some remote destination, piling up memories to keep yourself warm through the rest of the year? The endless freedom you remember in those July and August weeks of childhood, set loose from the confines of the classroom? Hot dogs and ice cream and roller coasters and ballgames? John Travolta’s falsetto at the end of “Summer Love”?

    Well, I have bad news for you, my friend. You are yet another victim of the summer industrial complex, that travel industry-concocted collection of lies designed to convince you that June, July, and August are the three best months of the year. The beach? That sun will literally kill you. Vacation? Just don’t look up how much plane tickets cost, and don’t even think of leaving the country with the way the dollar is dropping. Freedom? Unless you are an actual child, a schoolteacher, or an NBA player, you’re going to spend most of your time in summer working as hard as you do the rest of the year. Hot dogs are honestly the worst way to eat meat. Your ice cream is already ice soup. Roller coasters kill an average of four people per year (you can look it up). If you want to drink beer, you don’t need to sit through a baseball game while doing it. Grease is fine, but its success led to John Travolta one day being allowed to make Battlefield Earth, a film so bad that as of this writing, it has a 3 percent rating on Rotten Tomatoes.

    Summer is the triumph of hope over experience. Every Memorial Day weekend, we begin our summers full of expectation, sure that this will be the season we create the summer to remember. And every Labor Day weekend, we emerge, sweaty and mosquito-bitten, wondering what precisely happened over the past three months.
    Then next year we do it all over again, fruitlessly chasing that evanescent summer high — even though deep down inside, you know it’s probably going to be a disappointment, and secretly you’re counting the days until September. If you were able to control those hopes, you might be able to control that disappointment.

    But don’t you dare air those feelings out loud. When I suggested this essay to my fellow Vox editors, they reacted as though I were about to commit a war crime on paper. Doesn’t everyone love summer? Isn’t summer the best? How dare you look askance at the gift that is the three months when our hemisphere happens to be tilted toward the rays of our life-giving sun? What kind of monster are you? As it turns out, I am precisely that kind of monster. So what follows is why this is our most overrated season — and unlike summer itself, which really is getting longer year by year, I’m going to be brief.

    It’s hot

    You will not be surprised to learn that I don’t like the heat. Maybe it’s genes — my ancestors come from Ireland, a small, charming, rainy island where for most of the year, the sun is little more than a rumor. I realize this makes me unusual. The US county that has added the most people in recent years is Maricopa County, Arizona, home to Phoenix. Phoenix has a lot of things going for it: relatively inexpensive housing, a fairly robust labor market, and a vibrant population of wild parrots, which is absolutely something I knew before researching this article.

    Phoenix also has sun — lots and lots of sun. Just look at what they named their NBA team. And with that sun comes unfathomable summer heat. Across the full 2024 calendar year, the city logged a record-breaking 70 days of temperatures over 110 degrees, obliterating the previous record of 55 days set in 2020.
    It also set a record for the most consecutive days with temperatures in the triple digits, with an unfathomable 113 days in a row. Yet every year, apparently tens of thousands of Americans take a look at those numbers and think, “Yes, please, I would like to see if they have any available lots left on the surface of the sun.”

    Look, I get it. The tremendous growth of the Sun Belt in recent decades is one big piece of evidence that, if given the choice, most Americans would rather boil than freeze. Or even be slightly cold. And sure, historically cold temperatures have had a bad habit of killing large numbers of human beings. No one in Game of Thrones was warning that “summer is coming.” But while it’s still true that extreme cold kills far more people globally than extreme heat, heat is catching up. And there’s one thing you can count on with climate change: It will continue to get hotter. Summer — that season you love so much — is where we’re going to feel it. You may have heard the line: “This could be the coolest summer of the rest of your life.” It’s true! Just to take one example: A study found that by 2053, 107 million people in the US — 13 times as many as today — will be living in an extreme heat belt where they could experience heat indexes above 125 degrees.

    So sure, Americans like the heat just like they like summer, though I can’t help wondering if that has to do with the documented connection between extreme heat and cognitive impairment. (Summer! It makes you dumber!) But I doubt you will like it when your body is no longer able to cool itself through sweating and you begin suffering multiple organ failures.

    It’s boring

    Let’s flip through the major events of autumn. You have your Halloween — everyone loves candy. Thanksgiving — by far the best American holiday, even if we have all collectively decided to eat a bird we wouldn’t otherwise touch the rest of the year.
    Christmas and Hanukkah — presents and several days off. Spring has Easter, a festival of renewal and chocolate. Winter has…okay, to be clear, this is an argument against summer, not a defense of winter. Summer has Memorial Day (cookouts, beaches); Fourth of July (cookouts, beaches, and ooh, a chance to blow off my finger with fireworks); and then two utterly endless months before Labor Day, where we also have cookouts and beaches. And in between, there are just…days.

    This is the secret problem with summer. After school has let out and Independence Day has passed, we enter a tepid sea of indistinguishable days, with few to no events to break them up. July 12? July 27? August 13? I challenge you to tell the difference. Time becomes a desert that stretches out to every horizon, without even the false hope of a mirage to break it up. The Catholic Church, which I grew up in, calls the entirety of summer “Ordinary Time” in its liturgical calendar, which always seemed fitting to me. Nothing special, nothing to wait for — just all the Ordinary Time you can take.

    And while the calendar is no help, there’s also what I call the collective action problem of summer. Everything slows down and even shuts down, either because people go off on vacation or because they haven’t but almost everyone else has, so what’s the point of doing anything. All the big cultural events — the books, the (actually good) movies, most of the good TV — won’t arrive until the fall. (Except The Bear. The Bear is great.) The sports landscape is as barren as your office, and this summer we don’t even have the Olympics.

    I’m sure someone will tell me I’m missing the point of summer, when the very formlessness of the days reminds us to slow down and appreciate these moments out of time. Sure, great, whatever. Personally, I can either be hot or I can be bored — not both.

    It has August

    Technically this should be a subcategory of the previous section, but even Auxo, the Greek goddess of summer, would get impatient with August. Why does it have 31 days? Who voted for that?
    August is the worst parts of summer concentrated and then wrung out over the course of more than four sweaty, sticky weeks. I am positive that I have experienced August days where time begins to move backward. Slate had it right back in 2008: Let’s get rid of August. We’ve gone to the moon, we’ve mastered the genome, we’ve somehow made Glen Powell a movie star. If we can do all that, we can remove one measly month from the calendar. Or we could, except that August is the month when all motivation goes to die.

    It has vacations…in August

    I’ve got a great idea. Let’s have most of the country all go on vacation during the same few weeks. And then let’s ensure that those few weeks are set during one of the hottest, muggiest months of the year. What could go wrong (other than ridiculous travel costs, heat stroke amid the capitals of Europe, and the better-than-average chance of getting hit by a tropical storm)?

    It has FOMO

    It’s probably not true that everyone is having more fun than you this summer, all evidence on social media notwithstanding. But it will feel that way.

    It’s become a verb

    Let me give you one last piece of advice. If you encounter someone who uses the term “summering” in a sentence, get far, far away. You are dangerously close to getting into a conversation about the best way to clean linen pants.

    I realize I’m not going to change a lot of minds here. There’s something deep in our biological clocks that can’t seem to help but welcome the days when the sun stays up past 8 pm and the air temperature reaches equilibrium with our bodies. Add that to the enforced summer love that comes from all the industries that capitalize on this seasonal affliction. We summer haters are few and rarely invited to parties, but at least we see the truth. The truth is that you might actually enjoy your summer more if you lower your expectations. It’s not the summer of your life — it’s just three months in the middle of the year. And please, put on some sunscreen. That big thing in the sky really is trying to kill you.
    Update, May 26, 9 am ET: This story was originally published on July 8, 2024, and has been updated with new data on heat waves in Phoenix.
    WWW.VOX.COM
    The case against summer
    Close your eyes and think of the word “summer.” What comes to mind?Is it long days at the beach, a drink in one hand and a book in the other, letting the sun fall on your face and the waves tickle your toes? Two weeks of vacation in some remote destination, piling up memories to keep yourself warm through the rest of the year? The endless freedom you remember in those July and August weeks of childhood, set loose from the confines of the classroom? Hot dogs and ice cream and roller coasters and ballgames? John Travolta’s falsetto at the end of “Summer Love”?Well, I have bad news for you, my friend. You are yet another victim of the summer industrial complex, that travel industry-concocted collection of lies designed to convince you that June, July, and August are the three best months of the year. The beach? That sun will literally kill you. Vacation? Just don’t look up how much plane tickets cost, and don’t even think of leaving the country with the way the dollar is dropping. Freedom? Unless you are an actual child, a schoolteacher, or an NBA player, you’re going to spend most of your time in summer working as hard as you do the rest of the year. Hot dogs are honestly the worst way to eat meat. Your ice cream is already ice soup. Roller coasters kill an average of four people per year (you can look it up). If you want to drink beer, you don’t need to sit through a baseball game while doing it. Grease is fine, but its success led to John Travolta one day being allowed to make Battlefield Earth, a film so bad that as of this writing, it has a 3 percent rating on Rotten Tomatoes. Summer is the triumph of hope over experience. Every Memorial Day weekend, we begin our summers full of expectation, sure that this will be the season we create the summer to remember. And every Labor Day weekend, we emerge, sweaty and mosquito-bitten, wondering what precisely happened over the past three months. 
Then next year we do it all over again, fruitlessly chasing that evanescent summer high — even though deep down inside, you know it’s probably going to be a disappointment, and secretly you’re counting the days until September. If you were able to control those hopes, you might be able to control that disappointment.But don’t you dare air those feelings out loud. When I suggested this essay to my fellow Vox editors, they reacted as though I were about to commit a war crime on paper. Doesn’t everyone love summer? Isn’t summer the best? How dare you look askance at the gift that is the three months when our hemisphere happens to be titled toward the rays of our life-giving sun? What kind of monster are you?As it turns out, I am precisely that kind of monster. So what follows is why this is our most overrated season — and unlike summer itself, which really is getting longer year by year, I’m going to be brief. It’s hotYou will not be surprised to learn that I don’t like the heat. Maybe it’s genes — my ancestors come from Ireland, a small, charming, rainy island where for most of the year, the sun is little more than a rumor. I realize this makes me unusual. The US county that has added the most people in recent years is Maricopa, Arizona, home to Phoenix. Phoenix has a lot of things going for it: relatively inexpensive housing, a fairly robust labor market, and a vibrant population of wild parrots, which is absolutely something I knew before researching this article.Phoenix also has sun — lots and lots of sun. Just look at what they named their NBA team. And with that sun comes unfathomable summer heat. Across the full 2024 calendar year, the city logged a record-breaking 70 days of temperatures over 110 degrees, obliterating the previous record of 55 days set in 2020. 
It also set a record for the most days straight with temperatures in the triple digits, with an unfathomable 113 days in a row.Yet every year, apparently tens of thousands of Americans take a look at those numbers and think, “Yes, please, I would like to see if they have any available lots left on the surface of the sun.” Look, I get it. The tremendous growth of the Sun Belt in recent decades is one big piece of evidence that, if given the choice, most Americans would rather boil than freeze. Or even be slightly cold. And sure, historically cold temperatures have had a bad habit of killing large numbers of human beings. No one in Game of Thrones was warning that “summer is coming.”But while it’s still true that extreme cold kills significantly more people globally than extreme heat by a large magnitude, heat is catching up. And there’s one thing you can count on with climate change: It will continue to get hotter. Summer — that season you love so much — is where we’re going to feel it. You may have heard the line: “This could be the coolest summer of the rest of your life”? It’s true! Just to take one example: A study found that by 2053, 107 million people in the US — 13 times as many as today — will be living in an extreme heat belt where they could experience heat indexes above 125 degrees. So sure, Americans like the heat just like they like summer, though I can’t help wondering if that has to do with the documented connection between extreme heat and cognitive impairment. (Summer! It makes you dumber!) But I doubt you will like it when your body is no longer able to cool itself through sweating and you begin suffering multiple organ failures. It’s boringLet’s flip through the major events of autumn. You have your Halloween — everyone loves candy. Thanksgiving — by far the best American holiday, even if we have all collectively decided to eat a bird we wouldn’t otherwise touch the rest of the year. 
Christmas and Hanukkah — presents and several days off.Spring has Easter, a festival of renewal and chocolate. Winter has…okay, to be clear, this is an argument against summer, not a defense of winter. Summer has Memorial Day (cookouts, beaches); Fourth of July (cookouts, beaches, and ooh, a chance to blow off my finger with fireworks); and then two utterly endless months before Labor Day, where we also have cookouts and beaches. And in between, there are just…days.This is the secret problem with summer. After school has let out and Independence Day has passed, we enter a tepid sea of indistinguishable days, with little to no events to break them up. July 12? July 27? August 13? I challenge you to tell the difference. Time becomes a desert that stretches out to every horizon, without even the false hope of a mirage to break it up. The Catholic Church, which I grew up in, calls the entirety of summer “Ordinary Time” in its liturgical calendar, which always seemed fitting to me. Nothing special, nothing to wait for — just all the Ordinary Time you can take.And while the calendar is no help, there’s also what I call the collective action problem of summer. Everything slows down and even shuts down, either because people go off on vacation or because they haven’t but almost everyone else has so what’s the point of doing anything. All the big cultural events — the books, the (actually good) movies, most of the good TV — won’t arrive until the fall. (Except The Bear. The Bear is great.) The sports landscape is as barren as your office, and this summer we don’t even have the Olympics.I’m sure someone will tell me I’m missing the point of summer, when the very formlessness of the days reminds us to slow down and appreciate these moments out of time. Sure, great, whatever. 
Personally, I can either be hot or I can be bored — not both.It has AugustTechnically this should be a subcategory of the previous section, but even Auxo, the Greek goddess of summer, would get impatient with August. Why does it have 31 days? Who voted for that? August is the worst parts of summer concentrated and then wrung out over the course of more than four sweaty, sticky weeks. I am positive that I have experienced August days where time begins to move backward.Slate had it right back in 2008: Let’s get rid of August. We’ve gone to the moon, we’ve mastered the genome, we’ve somehow made Glen Powell a movie star. If we can do all that, we can remove one measly month from the calendar. Or we could, except that August is the month when all motivation goes to die.It has vacations…in AugustI’ve got a great idea. Let’s have most of the country all go on vacation during the same few weeks. And then let’s ensure that those few weeks are set during one of the hottest, muggiest months of the year. What could go wrong (other than ridiculous travel costs, heat stroke amid the capitals of Europe, and the better-than-average chance of getting hit by a tropical storm)?It has FOMOIt’s probably not true that everyone is having more fun than you this summer, all evidence on social media notwithstanding. But it will feel that way.It’s become a verbLet me give you one last piece of advice. If you encounter someone who uses the term “summering” in a sentence, get far, far away. You are dangerously close to getting into a conversation about the best way to clean linen pants.I realize I’m not going to change a lot of minds here. There’s something deep in our biological clocks that can’t seem to help but welcome the days when the sun stays up past 8 pm and the air temperature reaches equilibrium with our bodies. Add that to the enforced summer love that comes from all the industries that capitalize on this seasonal affliction. 
We summer haters are few and rarely invited to parties, but at least we see the truth: you might actually enjoy your summer more if you lower your expectations. It’s not the summer of your life — it’s just three months in the middle of the year. And please, put on some sunscreen. That big thing in the sky really is trying to kill you.

Update, May 26, 9 am ET: This story was originally published on July 8, 2024, and has been updated with new data on heat waves in Phoenix.
  • Is Science Slowing Down?

    Basic scientific research is a key contributor to economic productivity. (Getty)
    Is science running out of steam? A growing body of research suggests that disruptive breakthroughs—the kind that fundamentally redefine entire fields—may be occurring less frequently. A 2023 article in Nature reported that scientific papers and patents are, on average, less “disruptive” than they were in the mid-20th century. The study sparked intense interest and considerable controversy, covered in a recent news feature provocatively titled “Are Groundbreaking Science Discoveries Becoming Harder To Find?”

    Before weighing in, however, it is worth interrogating a more fundamental question: What do we mean when we call science “disruptive”? And is that, in fact, the appropriate benchmark for progress?

    The study in question, led by entrepreneurship scholar Russell Funk, employs a citation-based metric known as the Consolidation–Disruption (CD) index. The tool attempts to quantify whether new research displaces prior work—a signal of disruption—or builds directly upon it, thereby reinforcing existing paradigms. It represents a noteworthy contribution to our understanding of scientific change. Their conclusion, that disruption has declined across disciplines even as the volume of scientific output has expanded, has ignited debate among scientists, scholars, and policymakers.
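    The scoring logic behind the CD index can be sketched in a few lines of Python. This is a rough illustration of the idea, not the authors' implementation; the function name and data layout here are hypothetical. For each later paper in the focal paper's citation neighborhood, citing the focal paper without its references counts as disruptive (+1), citing both counts as consolidating (−1), and citing only the references counts as 0 but still enters the denominator.

```python
def cd_index(focal, references, citing_papers):
    """Simplified Consolidation-Disruption (CD) index sketch.

    focal: id of the focal paper
    references: set of ids the focal paper cites
    citing_papers: iterable of (paper_id, cited_ids) pairs for papers
        published after the focal paper
    Returns a value in [-1, 1]: positive means later work tends to cite
    the focal paper while bypassing its references (disruptive);
    negative means the focal paper is cited alongside its sources
    (consolidating).
    """
    score, n = 0, 0
    for _, cited in citing_papers:
        cites_focal = focal in cited
        cites_refs = bool(references & cited)
        if not (cites_focal or cites_refs):
            continue  # outside the focal paper's citation neighborhood
        n += 1
        if cites_focal and not cites_refs:
            score += 1  # disruptive: focal work displaces its predecessors
        elif cites_focal and cites_refs:
            score -= 1  # consolidating: focal work used with its sources
        # cites only the references: contributes 0 but is counted in n
    return score / n if n else 0.0

# Toy example: three later papers cite focal paper "F" (whose refs are A, B)
later = [("p1", {"F"}), ("p2", {"F", "A"}), ("p3", {"F"})]
print(cd_index("F", {"A", "B"}, later))  # (1 - 1 + 1) / 3 ≈ 0.33
```

    Real analyses compute this over millions of papers within a fixed citation window (commonly five years), which is where the empirical heavy lifting lies.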

    Innovation May Be Getting Harder—But Also Deeper
    At a structural level, science becomes more complex as it matures. In some sense it has to slow down. The simplest questions are often the first to be answered, and what remains are challenges that are more subtle, more interdependent, and more difficult to resolve. The law of diminishing marginal returns, long familiar in economics, finds a natural corollary in research: at some point the intellectual “low-hanging fruit” has largely been harvested.

    Yet this does not necessarily imply stagnation. In fact, science itself is evolving. I think that apparent declines in disruption reflect not an impoverishment of ideas, but a transformation in the conduct and culture of research itself. Citation practices have shifted. Publication incentives have changed. The sheer availability of data and digital resources has exploded. Comparing contemporary citation behavior to that of earlier decades is not simply apples to oranges; it’s more like comparing ecosystems separated by tectonic time.
    More profoundly, we might ask whether paradigm shifts—particularly those in the Kuhnian sense—are truly the milestones we should prize above all others. Much of the innovation that drives societal progress and economic productivity does not emerge from revolutions in thought, but from the subtle extension and application of existing knowledge. In fields as varied as biomedicine, agriculture, and climate science, incremental refinement has yielded results of transformative impact.

    Brighter green hybrid rice plants (left) help increase yields at this Filipino farm. (Photo by Dick Swanson/Getty Images)

    Science Today Is More Sophisticated—And More Efficient
    Scientists are publishing more today than ever. Critics of contemporary science attribute this to a metric-driven culture of “salami slicing,” in which ideas are fragmented into the “minimum publishable unit” so that scientists can accrue an ever-growing publication count to secure career viability in a publish-or-perish environment. But such critiques overlook the extraordinary gains in research efficiency that have occurred in the past few decades, which I think are a far more compelling explanation for the massive output of scientific research today.
    Since the 1980s, personal computing has transformed nearly every dimension of the scientific process. Manuscript preparation, once the province of typewriters and retyped drafts, has become seamless. Data acquisition now involves automated sensors and real-time monitoring. Analytical tools like Python and R allow researchers to conduct sophisticated modeling and statistics with unprecedented speed. Communication is instantaneous. Knowledge-sharing platforms and open-access journals have dismantled many of the old barriers to entry.

    Advances in microcomputer technology in the 1980s and 1990s dramatically accelerated scientific research. (Denver Post via Getty Images)
    Indeed, one wonders whether critics have recently read a research paper from the 1930s or 1970s. The methodological rigor, analytical depth, and interdisciplinary scope of modern research are, by nearly any standard, vastly more advanced.
    The Horizon Has Expanded
    In biology alone, high-throughput technologies—part of the broader “omics” revolution catalyzed by innovations like the polymerase chain reaction (PCR), which enabled rapid DNA amplification and supported the eventual success of the Human Genome Project—continue to propel discovery at an astonishing pace.

    Nobel Prize laureate James D. Watson speaks at a press conference announcing that a six-country consortium has successfully drawn up a complete map of the human genome, completing one of the most ambitious scientific projects ever and offering a major opportunity for medical advances, April 14, 2003, at the National Institutes of Health in Bethesda, Maryland. The announcement coincided with the 50th anniversary of the landmark paper by Watson and Francis Crick describing DNA's double helix. (Robyn Beck/AFP via Getty Images)
    When critics lament the apparent decline of Nobel-caliber “blockbusters,” they overlook that the frontier of science has expanded—not narrowed. If we consider scientific knowledge as a volume, then it is bounded by an outer edge where discovery occurs. In Euclidean geometry, as the radius of a sphere increases, the surface area (which scales with the square of the radius) grows more slowly than the volume (which scales with the cube). While the volume of knowledge grows more rapidly—encompassing established theories and tools that continue to yield applications—the surface area also expands, and it is along this widening frontier, where the known meets the unknown, that innovation arises.
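    To make the analogy concrete, for a sphere of radius r:

```latex
A = 4\pi r^{2}, \qquad V = \tfrac{4}{3}\pi r^{3}, \qquad \frac{A}{V} = \frac{3}{r}
```

    The ratio A/V shrinks as r grows, yet A itself keeps increasing: the frontier of discovery keeps widening in absolute terms even as it becomes a smaller fraction of accumulated knowledge.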
    Rethinking Returns on Investment
    The modern belief that science must deliver measurable economic returns is, historically speaking, a relatively recent development. Before the Second World War, scientific research was not broadly viewed as a driver of productivity. Economist Daniel Susskind has argued that even the concept of economic growth as a central policy goal is a mid-20th century invention.
    After the war, that changed dramatically. Governments began to see research as critical to national development, security, and public health. Yet even as expectations have grown, relative public investment in science has, paradoxically, diminished, despite the fact that basic scientific research is a massive accelerant of economic productivity and effectively self-financing. While absolute funding has increased, government spending on science as a share of GDP has declined in the US and many other countries. Given the scale and complexity of the challenges we now face, we may be underinvesting in the very enterprise that could deliver solutions. Recent proposals to cut funding for NIH and NSF could, by some estimates, cost the U.S. tens of billions in lost productivity.
    There is compelling evidence to suggest that significantly increasing R&D expenditures—doubling or even tripling them—would yield strong and sustained returns.
    AI and the Next Wave of Scientific Efficiency
    Looking to the future, artificial intelligence offers the potential to not only streamline research but also to augment the process of innovation itself. AI tools—from large language models like ChatGPT to specialized engines for data mining and synthesis—enable researchers to traverse disciplines, identify patterns, and generate new hypotheses with remarkable speed.
    The ability to navigate vast bodies of scientific literature—once reserved for those with access to elite research libraries and ample time for reading—has been radically democratized. Scientists today can access digitized repositories, annotate papers with precision tools, manage bibliographies with software, and instantly trace the intellectual lineage of ideas. AI-powered tools support researchers in sifting through and synthesizing material across disciplines, helping to identify patterns, highlight connections, and bring under-explored ideas into view. For researchers like myself—an ecologist who often draws inspiration from nonlinear dynamics, statistical physics, and cognitive psychology—these technologies function as accelerators of thought rather than substitutes for it. They support the process of discovering latent analogies and assembling novel constellations of insight, the kind of cognitive recombination that underlies true creativity. While deep understanding still demands sustained intellectual engagement—reading, interpretation, and critical analysis—these tools lower the barrier to discovery and expand the range of intellectual possibilities.
    By enhancing cross-disciplinary thinking and reducing the latency between idea and investigation, AI may well reignite the kind of scientific innovation that some believe is slipping from reach.
    Science as a Cultural Endeavor
    Finally, it bears emphasizing that the value of science is not solely, or even primarily, economic. Like the arts, literature, or philosophy, science is a cultural and intellectual enterprise. It is an expression of curiosity, a vehicle for collective self-understanding, and a means of situating ourselves within the universe.
    From my vantage point, and that of many colleagues, the current landscape of discovery feels more fertile than ever. The questions we pose are more ambitious, the tools at our disposal more refined, and the connections we are able to make more multidimensional.
    If the signal of disruption appears to be dimming, perhaps it is only because the spectrum of science has grown too broad for any single wavelength to dominate. Rather than lament an apparent slowdown, we might ask a more constructive question: Are we measuring the right things? And are we creating the conditions that allow the most vital forms of science—creative, integrative, and with the potential to transform human society for the better—to flourish?
    WWW.FORBES.COM
CGShares https://cgshares.com