• Tech billionaires are making a risky bet with humanity’s future

    “The best way to predict the future is to invent it,” the famed computer scientist Alan Kay once said. Uttered more out of exasperation than as inspiration, his remark has nevertheless attained gospel-like status among Silicon Valley entrepreneurs, in particular a handful of tech billionaires who fancy themselves the chief architects of humanity’s future. 

    Sam Altman, Jeff Bezos, Elon Musk, and others may have slightly different goals and ambitions in the near term, but their grand visions for the next decade and beyond are remarkably similar. Framed less as technological objectives and more as existential imperatives, they include aligning AI with the interests of humanity; creating an artificial superintelligence that will solve all the world’s most pressing problems; merging with that superintelligence to achieve immortality (or something close to it); establishing a permanent, self-sustaining colony on Mars; and, ultimately, spreading out across the cosmos.

    While there’s a sprawling patchwork of ideas and philosophies powering these visions, three features play a central role, says Adam Becker, a science writer and astrophysicist: an unshakable certainty that technology can solve any problem, a belief in the necessity of perpetual growth, and a quasi-religious obsession with transcending our physical and biological limits. In his timely new book, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley’s Crusade to Control the Fate of Humanity, Becker calls this triumvirate of beliefs the “ideology of technological salvation” and warns that tech titans are using it to steer humanity in a dangerous direction. 

    “In most of these isms you’ll find the idea of escape and transcendence, as well as the promise of an amazing future, full of unimaginable wonders—so long as we don’t get in the way of technological progress.”

    “The credence that tech billionaires give to these specific science-fictional futures validates their pursuit of more—to portray the growth of their businesses as a moral imperative, to reduce the complex problems of the world to simple questions of technology, [and] to justify nearly any action they might want to take,” he writes. Becker argues that the only way to break free of these visions is to see them for what they are: a convenient excuse to continue destroying the environment, skirt regulations, amass more power and control, and dismiss the very real problems of today to focus on the imagined ones of tomorrow.

    A lot of critics, academics, and journalists have tried to define or distill the Silicon Valley ethos over the years. There was the “Californian Ideology” in the mid-’90s, the “Move fast and break things” era of the early 2000s, and more recently the “Libertarianism for me, feudalism for thee” or “techno-authoritarian” views. How do you see the “ideology of technological salvation” fitting in?

    I’d say it’s very much of a piece with those earlier attempts to describe the Silicon Valley mindset. I mean, you can draw a pretty straight line from Max More’s principles of transhumanism in the ’90s to the Californian Ideology [a mashup of countercultural, libertarian, and neoliberal values] and through to what I call the ideology of technological salvation. The fact is, many of the ideas that define or animate Silicon Valley thinking have never been much of a mystery—libertarianism, an antipathy toward the government and regulation, the boundless faith in technology, the obsession with optimization.

    What can be difficult is to parse where all these ideas come from and how they fit together—or if they fit together at all. I came up with the ideology of technological salvation as a way to name and give shape to a group of interrelated concepts and philosophies that can seem sprawling and ill-defined at first, but that actually sit at the center of a worldview shared by venture capitalists, executives, and other thought leaders in the tech industry. 

    Readers will likely be familiar with the tech billionaires featured in your book and at least some of their ambitions. I’m guessing they’ll be less familiar with the various “isms” that you argue have influenced or guided their thinking. Effective altruism, rationalism, longtermism, extropianism, effective accelerationism, futurism, singularitarianism, transhumanism—there are a lot of them. Is there something that they all share?

    They’re definitely connected. In a sense, you could say they’re all versions or instantiations of the ideology of technological salvation, but there are also some very deep historical connections between the people in these groups and their aims and beliefs. The Extropians in the late ’80s believed in self-transformation through technology and freedom from limitations of any kind—ideas that Ray Kurzweil eventually helped popularize and legitimize for a larger audience with the Singularity.

    In most of these isms you’ll find the idea of escape and transcendence, as well as the promise of an amazing future, full of unimaginable wonders—so long as we don’t get in the way of technological progress. I should say that AI researcher Timnit Gebru and philosopher Émile Torres have also done a lot of great work linking these ideologies to one another and showing how they all have ties to racism, misogyny, and eugenics.

    You argue that the Singularity is the purest expression of the ideology of technological salvation. How so?

    Well, for one thing, it’s just this very simple, straightforward idea—the Singularity is coming and will occur when we merge our brains with the cloud and expand our intelligence a millionfold. This will then deepen our awareness and consciousness and everything will be amazing. In many ways, it’s a fantastical vision of a perfect technological utopia. We’re all going to live as long as we want in an eternal paradise, watched over by machines of loving grace, and everything will just get exponentially better forever. The end.

    The other isms I talk about in the book have a little more … heft isn’t the right word—they just have more stuff going on. There’s more to them, right? The rationalists and the effective altruists and the longtermists—they think that something like a singularity will happen, or could happen, but that there’s this really big danger between where we are now and that potential event. We have to address the fact that an all-powerful AI might destroy humanity—the so-called alignment problem—before any singularity can happen. 

    Then you’ve got the effective accelerationists, who are more like Kurzweil, but they’ve got more of a tech-bro spin on things. They’ve taken some of the older transhumanist ideas from the Singularity and updated them for startup culture. Marc Andreessen’s “Techno-Optimist Manifesto” [from 2023] is a good example. You could argue that all of these other philosophies that have gained purchase in Silicon Valley are just twists on Kurzweil’s Singularity, each one building on top of the core ideas of transcendence, techno-optimism, and exponential growth.

    Early on in the book you take aim at that idea of exponential growth—specifically, Kurzweil’s “Law of Accelerating Returns.” Could you explain what that is and why you think it’s flawed?

    Kurzweil thinks there’s this immutable “Law of Accelerating Returns” at work in the affairs of the universe, especially when it comes to technology. It’s the idea that technological progress isn’t linear but exponential. Advancements in one technology fuel even more rapid advancements in the future, which in turn lead to greater complexity and greater technological power, and on and on. This is just a mistake. Kurzweil uses the Law of Accelerating Returns to explain why the Singularity is inevitable, but to be clear, he’s far from the only one who believes in this so-called law.

    “I really believe that when you get as rich as some of these guys are, you can just do things that seem like thinking and no one is really going to correct you or tell you things you don’t want to hear.”

    My sense is that it’s an idea that comes from staring at Moore’s Law for too long. Moore’s Law is of course the famous prediction that the number of transistors on a chip will double roughly every two years, with a minimal increase in cost. Now, that has in fact happened for the last 50 years or so, but not because of some fundamental law in the universe. It’s because the tech industry made a choice and some very sizable investments to make it happen. Moore’s Law was ultimately this really interesting observation or projection of a historical trend, but even Gordon Moore [who first articulated it] knew that it wouldn’t and couldn’t last forever. In fact, some think it’s already over.

    These ideologies take inspiration from some pretty unsavory characters. Transhumanism, you say, was first popularized by the eugenicist Julian Huxley in a speech in 1951. Marc Andreessen’s “Techno-Optimist Manifesto” name-checks the noted fascist Filippo Tommaso Marinetti and his futurist manifesto. Did you get the sense while researching the book that the tech titans who champion these ideas understand their dangerous origins?

    You’re assuming in the framing of that question that there’s any rigorous thought going on here at all. As I say in the book, Andreessen’s manifesto runs almost entirely on vibes, not logic. I think someone may have told him about the futurist manifesto at some point, and he just sort of liked the general vibe, which is why he paraphrases a part of it. Maybe he learned something about Marinetti and forgot it. Maybe he didn’t care. 

    I really believe that when you get as rich as some of these guys are, you can just do things that seem like thinking and no one is really going to correct you or tell you things you don’t want to hear. For many of these billionaires, the vibes of fascism, authoritarianism, and colonialism are attractive because they’re fundamentally about creating a fantasy of control. 

    You argue that these visions of the future are being used to hasten environmental destruction, increase authoritarianism, and exacerbate inequalities. You also admit that they appeal to lots of people who aren’t billionaires. Why do you think that is? 

    I think a lot of us are also attracted to these ideas for the same reasons the tech billionaires are—they offer this fantasy of knowing what the future holds, of transcending death, and a sense that someone or something out there is in control. It’s hard to overstate how comforting a simple, coherent narrative can be in an increasingly complex and fast-moving world. This is of course what religion offers for many of us, and I don’t think it’s an accident that a sizable number of people in the rationalist and effective altruist communities are actually ex-evangelicals.

    More than any one specific technology, it seems like the most consequential thing these billionaires have invented is a sense of inevitability—that their visions for the future are somehow predestined. How does one fight against that?

    It’s a difficult question. For me, the answer was to write this book. I guess I’d also say this: Silicon Valley enjoyed well over a decade with little to no pushback on anything. That’s definitely a big part of how we ended up in this mess. There was no regulation, very little critical coverage in the press, and a lot of self-mythologizing going on. Things have started to change, especially as the social and environmental damage that tech companies and industry leaders have helped facilitate has become more clear. That understanding is an essential part of deflating the power of these tech billionaires and breaking free of their visions. When we understand that these dreams of the future are actually nightmares for the rest of us, I think you’ll see that sense of inevitability vanish pretty fast.

    This interview was edited for length and clarity.

    Bryan Gardiner is a writer based in Oakland, California. 
  • Suspect in Minnesota Shooting Linked to Security Company, Evangelical Ministry

    The alleged shooter is a 57-year-old white male; according to his ministry's website, he “sought out militant Islamists in order to share the gospel and tell them that violence wasn't the answer.”
  • Christian nationalists decided empathy is a sin. Now it’s gone mainstream.

    It’s a provocative idea: that empathy — that is, putting yourself in another person’s proverbial shoes, and feeling what they feel — is a sin. The Bible contains repeated invocations from Jesus to show deep empathy and compassion for others, including complete strangers. He’s very clear on this point. Moreover, Christianity is built around a fundamental act of empathy so radical — Jesus dying for our sins — that it’s difficult to spin as harmful. Yet as stunning as it may sound, “empathy is a sin” is a claim that’s been growing in recent years across the Christian right. It was first articulated six years ago by controversial pastor and theologian Joe Rigney, now author of the recently published book The Sin of Empathy, which has drawn plenty of debate among religious commentators. In this construction, empathy is a cudgel that progressives and liberals use to berate and/or guilt-trip Christians into showing empathy to the “wrong” people. Had it stayed within the realm of far-right evangelicals, we likely wouldn’t be discussing this strange view of empathy at all. Yet we are living in an age when the Christian right has gained unprecedented power, both sociocultural and political. The increasing overlap between conservative culture and right-leaning tech spaces means that many disparate public figures are all drinking from the same well of ideas — and so a broader, secular version of the belief that empathy is a tool of manipulation has bubbled into the mainstream through influential figures like Elon Musk.

    What “empathy is a sin” actually means

    The proposition that too much empathy is a bad thing is far from an idea that belongs to the right. On Reddit, which tends to be relatively left-wing, one popular mantra is that you can’t set yourself on fire to keep someone else warm. That is, too much empathy for someone else can erode your own sense of self, leaving you codependent or open to emotional abuse and manipulation. That’s a pretty standard part of most relationship and self-help advice — even from some Christian advice authors. But in recent months, the idea that empathy is inherently destructive has not only become a major source of debate among Christians but has also escaped containment and barreled into the mainstream by way of major media outlets, political figures, and influencers.

    The conversation began with an incendiary 2019 essay by Rigney, then a longtime teacher and pastor at a Baptist seminary, in which he introduced “the enticing sin of empathy” and argued that Satan manipulates people through the intense cultural pressure to feel others’ pain and suffering. Rigney’s ideas were met with ideological pushback, with one Christian blogger saying it “may be the most unwise piece of pastoral theology I’ve seen in my lifetime.” As his essay incited national debate, Rigney himself grew more controversial, facing allegations of dismissing women and telling one now-former Black congregant at his Minneapolis church that “it wouldn’t be sinful for him to own me & my family today.” (In an email to Vox, Rigney denied the congregant’s version of events.) Rigney also has a longtime affiliation with Doug Wilson, the leader of the Reformed Christian Christ Church in Moscow, Idaho.
    Wilson is now well-known for spreading Christian nationalism and for allegedly fostering a culture of abuse; his infamy also stems from his co-authored 1996 essay “Southern Slavery: As It Was,” in which he claimed that “Slavery produced in the South a genuine affection between the races that we believe we can say has never existed in any nation before the War or since.” Rigney appeared on Wilson’s 2019 podcast series Man Rampant to discuss empathy; their conversation quickly devolved into decrying fake rape allegations and musing that victims of police violence might have “deserved to be shot.” In an email, Rigney told me that both he and Wilson developed their similar views on empathy from the therapist and Rabbi Edwin Friedman, whose posthumously published 1999 book, A Failure of Nerve: Leadership in the Age of the Quick Fix, has influenced not only family therapy but conservative church leadership and thought. In the book, Friedman argues that American society has devalued the self, leading to an emotional regression and a “low pain threshold.” Alongside this he compares “political correctness” to the Inquisition, and frames a “chronically anxious America” as one that is “organize[d] around its most dysfunctional elements,” in which leaders have difficulty making tough decisions. This correlation of emotional weakness with societal excess paved the way for Rigney to frame empathy itself as a dangerous weapon.

    Despite using the incendiary generalization “empathy is sin,” Rigney told me that it is not all empathy that is sinful, but specifically “untethered empathy.” He describes this as “empathy that is detached or unmoored from reality, from what is good and right.” “Just as ‘the sin of anger’ refers to unrighteous or ungoverned anger, so the sin of empathy refers to ungoverned, excessive, and untethered empathy,” Rigney told me. This kind of unrestrained empathy, he writes, is a recipe for cultural mayhem. In theory, Rigney argues that one should be “tethered” to God’s will and not to Satan. In practice, what Rigney is typically decrying is any empathy for a liberal perspective or for someone who’s part of a marginalized community. When I asked him for a general reconciliation of his views with the Golden Rule, he sent me a response in which he brought up trans identity in order to label it a “fantasy” that contradicts “God-given biological reality,” while misgendering a hypothetical trans person.

    The demonization of empathy moves into the mainstream

    Despite receiving firm pushback from most religious leaders who hear about it, Rigney’s argument has been spreading through the Christian right at large. Last year, conservative personality and author Allie Stuckey published Toxic Empathy: How Progressives Exploit Christian Compassion, in which she argues that “toxic empathy is a dangerous guide for our decisions, behavior, and public policy” while condemning queer people and feminists. “Empathy almost needs to be struck from the Christian vocabulary,” Josh McPherson, host of the Christian-centered Stronger Man Nation podcast and an adherent of Wilson and Rigney’s ideas, said in January, in a clip that garnered an outsize amount of attention relative to the podcast episode itself.
    That same month, Vice President JD Vance struck a nerve with a controversial appearance on Fox News in which he seemed to reference both the empathy conversation and the archaic Catholic concept of “ordo amoris,” meaning “the order of love.” As Vance put it, it’s the idea that one’s family should come before anyone else: “You love your family, and then you love your neighbor, and then you love your community, and then you love your fellow citizens in your own country,” he said. “And then after that, you can focus and prioritize the rest of the world.” In a follow-up on X, he posted, “the idea that there isn’t a hierarchy of obligations violates basic common sense.” Vance’s statements received backlash from many people, including both the late Pope Francis and then-future Pope Leo XIV — but the controversy just drove the idea further into the mainstream.

    As part of the odd crossover between far-right religion and online reactionaries, the idea picked up surprising allies along the way, including the evolutionary biologist turned far-right gadfly Gad Saad. In January, Saad, applying a survival-of-the-fittest approach to our emotions, argued against “suicidal empathy,” which he described as “the inability to implement optimal decisions when our emotional system is tricked into an orgiastic hyperactive form of empathy, deployed on the wrong targets.” In a February appearance on Joe Rogan’s podcast, Elon Musk explicitly referenced Saad but went even further, stating, “The fundamental weakness of Western civilization is empathy — the empathy exploit. They’re exploiting a bug in Western civilization” — the “they” here being the left wing. “I think empathy is good,” Musk added, “but you need to think it through, and not just be programmed like a robot.”

    By March, mainstream media had noticed the conversation. David French observed the “strange spectacle” of the Christian turn against empathy in a column for the New York Times. In April, a deep dive in the Guardian followed. That same month, a broad-ranging conversation in the New Yorker with Albert Mohler, president of the Southern Baptist Theological Seminary, led to interviewer Isaac Chotiner pressing him on why empathy is bad. The discussion, about deported Venezuelan immigrants wrongfully suspected of having gang tattoos, ended with Mohler saying that “there’s no reason anyone other than a gang member should have that tattoo.”

    The pro-empathy backlash is fierce

    The connective tissue across all these disparate anti-empathy voices is twofold, according to Christian scholar Karen Swallow Prior. Prior, an anti-abortion ethicist and former longtime Liberty University professor, singled out the argument’s outsize emphasis on attacking very small, very vulnerable groups — as well as the moment in which it’s all happening. “The entire discourse around empathy is backlash against those who are questioning the authority of those in power,” she told me, “not coincidentally emerging in a period where we have a rise in recognition of overly controlling and narcissistic leaders, both in and outside the church.” Those people “understand and appreciate empathy the least.”

    “Trump made it okay to not be okay with culture,” Peter Bell, co-creator and producer of the Sons of Patriarchy podcast, which explores longstanding allegations of emotional and sexual abuse against Doug Wilson’s Christ Church, told me.
    “He made it kind of cool for Christians to be jerks,” Bell said. “He made the unspoken things spoken, the whispered things shouted out loud.”

    Prior believes that the argument won’t have a long shelf life because Rigney’s idea is so convoluted. Yet she added that it’s born out of toxic masculinity, in an age where stoicism, traditionally male-coded, is increasingly part of the regular cultural diet of men via figures like Jordan Peterson. That hypermasculinity goes hand in hand with evangelical culture, and with the ideas Rigney borrowed from Friedman about too many emotions being a weakness. In this framing, emotion becomes non-masculine by default — i.e., feminine.

    “Everybody’s supposed to have sympathy for the white male, but when you show empathy to anyone else, suddenly empathy is a sin.” — Karen Swallow Prior, Christian scholar

    That leads us to the grimmest part of Rigney’s “untethered empathy” claims: the way he explicitly genders it — and demonizes it — as feminine. Throughout his book, he argues that women are more empathetic than men, and that as a result, they are more prone to giving in to it as a sin. It’s an inherently misogynistic view that undermines women’s decision-making and leadership abilities. Though Rigney pushed back against this characterization in an email to me, arguing that critics have distorted what he views as merely “gendered tendencies and susceptibility to particular temptations,” he also couldn’t help reinforcing it. “[F]emale tendencies, like male tendencies, have particular dangers, temptations, and weaknesses,” he wrote. Women thus should recognize this and “take deliberate, Spirit-wrought action to resist the impulse to become a devouring HR department that wants to run the world.”

    As Prior explains, though, Rigney’s just fine with a mythic national human resources department, as long as it supports the status quo. “Everybody’s supposed to have sympathy for the white male,” she said, “but when you show empathy to anyone else, suddenly empathy is a sin.”

    What’s heartening is that, whether or not they realize what kind of dangerous extremism undergirds it, most people aren’t buying Rigney’s “empathy is sin” claim. Across the nation, in response to Rigney’s assertion, the catchphrase “If empathy is a sin, then sin boldly” has arisen, heard in pulpits, seen on church marquees, and worn on T-shirts — a reminder that it takes much more than the semantic whims of a few extremists to shake something most people hold in their hearts.
    #christian #nationalists #decided #empathy #sin
    Christian nationalists decided empathy is a sin. Now it’s gone mainstream.
    It’s a provocative idea: that empathy — that is, putting yourself in another person’s proverbial shoes, and feeling what they feel — is a sin. The Bible contains repeated invocations from Jesus to show deep empathy and compassion for others, including complete strangers. He’s very clear on this point. Moreover, Christianity is built around a fundamental act of empathy so radical — Jesus dying for our sins — that it’s difficult to spin as harmful. Yet as stunning as it may sound, “empathy is a sin” is a claim that’s been growing in recent years across the Christian right. It was first articulated six years ago by controversial pastor and theologian Joe Rigney, now author of the recently published book, The Sin of Empathy, which has drawn plenty of debate among religious commentators. In this construction, empathy is a cudgel that progressives and liberals use to berate and/or guilt-trip Christians into showing empathy to the “wrong” people. Had it stayed within the realm of far-right evangelicals, we likely wouldn’t be discussing this strange view of empathy at all. Yet we are living in an age when the Christian right has gained unprecedented power, both sociocultural and political. The increasing overlap between conservative culture and right-leaning tech spaces means that many disparate public figures are all drinking from the same well of ideas — and so a broader, secular version of the belief that empathy is a tool of manipulation has bubbled into the mainstream through influential figures like Elon Musk.What “empathy is a sin” actually meansThe proposition that too much empathy is a bad thing is far from an idea that belongs to the right. On Reddit, which tends to be relatively left-wing, one popular mantra is that you can’t set yourself on fire to keep someone else warm. That is, too much empathy for someone else can erode your own sense of self, leaving you codependent or open to emotional abuse and manipulation. That’s a pretty standard part of most relationship and self-help advice — even from some Christian advice authors. But in recent months, the idea that empathy is inherently destructive has not only become a major source of debate among Christians, it’s escaped containment and barreled into the mainstream by way of major media outlets, political figures, and influencers.The conversation began with an incendiary 2019 essay by Rigney, then a longtime teacher and pastor at a Baptist seminary, in which he introduced “the enticing sin of empathy” and argued that Satan manipulates people through the intense cultural pressure to feel others’ pain and suffering. Rigney’s ideas were met with ideological pushback, with one Christian blogger saying it “may be the most unwise piece of pastoral theology I’ve seen in my lifetime.” As his essay incited national debate, Rigney himself grew more controversial, facing allegations of dismissing women and telling one now-former Black congregant at his Minneapolis church that “it wouldn’t be sinful for him to own me & my family today.”Rigney also has a longtime affiliation with Doug Wilson, the leader of the Reformed Christian Christ Church in Moscow, Idaho. 
    Christian nationalists decided empathy is a sin. Now it’s gone mainstream.
    www.vox.com
    It’s a provocative idea: that empathy — that is, putting yourself in another person’s proverbial shoes and feeling what they feel — is a sin. The Bible contains repeated invocations from Jesus to show deep empathy and compassion for others, including complete strangers. He’s very clear on this point. Moreover, Christianity is built around a fundamental act of empathy so radical — Jesus dying for our sins — that it’s difficult to spin as harmful. Yet as stunning as it may sound, “empathy is a sin” is a claim that’s been growing in recent years across the Christian right. It was first articulated six years ago by the controversial pastor and theologian Joe Rigney, now the author of the recently published book The Sin of Empathy, which has drawn plenty of debate among religious commentators. In this construction, empathy is a cudgel that progressives and liberals use to berate or guilt-trip Christians into showing empathy to the “wrong” people.

    Had it stayed within the realm of far-right evangelicals, we likely wouldn’t be discussing this strange view of empathy at all. Yet we are living in an age when the Christian right has gained unprecedented power, both sociocultural and political. The increasing overlap between conservative culture and right-leaning tech spaces means that many disparate public figures are all drinking from the same well of ideas — and so a broader, secular version of the belief that empathy is a tool of manipulation has bubbled into the mainstream through influential figures like Elon Musk.

    What “empathy is a sin” actually means

    The proposition that too much empathy is a bad thing is far from an idea that belongs to the right. On Reddit, which tends to be relatively left-wing, one popular mantra is that you can’t set yourself on fire to keep someone else warm. That is, too much empathy for someone else can erode your own sense of self, leaving you codependent or open to emotional abuse and manipulation. That’s a pretty standard part of most relationship and self-help advice — even from some Christian advice authors. But in recent months, the idea that empathy is inherently destructive has not only become a major source of debate among Christians but has also escaped containment and barreled into the mainstream by way of major media outlets, political figures, and influencers.

    The conversation began with an incendiary 2019 essay by Rigney, then a longtime teacher and pastor at a Baptist seminary, in which he introduced “the enticing sin of empathy” and argued that Satan manipulates people through the intense cultural pressure to feel others’ pain and suffering. Rigney’s ideas were met with ideological pushback, with one Christian blogger saying the essay “may be the most unwise piece of pastoral theology I’ve seen in my lifetime.” As his essay incited national debate, Rigney himself grew more controversial, facing allegations of dismissing women and of telling one now-former Black congregant at his Minneapolis church that “it wouldn’t be sinful for him to own me & my family today.” (In an email to Vox, Rigney denied the congregant’s version of events.) Rigney also has a longtime affiliation with Doug Wilson, the leader of the Reformed Christian Christ Church in Moscow, Idaho.
    Wilson, now well known for spreading Christian nationalism and for allegedly fostering a culture of abuse (allegations he has denied), is also infamous for his co-authored 1996 essay “Southern Slavery: As It Was,” in which he claimed that “Slavery produced in the South a genuine affection between the races that we believe we can say has never existed in any nation before the War or since.” (“My defense of the South does not make me a racist,” Wilson said in 2003.) Rigney appeared on Wilson’s 2019 podcast series Man Rampant to discuss empathy; their conversation quickly devolved into decrying fake rape allegations and musing that victims of police violence might have “deserved to be shot.”

    In an email, Rigney told me that both he and Wilson developed their similar views on empathy from the therapist and rabbi Edwin Friedman, whose posthumously published 1999 book, A Failure of Nerve: Leadership in the Age of the Quick Fix, has influenced not only family therapy but also conservative church leadership and thought. In the book, Friedman argues that American society has devalued the self, leading to emotional regression and a “low pain threshold.” Alongside this, he compares “political correctness” to the Inquisition and frames a “chronically anxious America” as one that is “organize[d] around its most dysfunctional elements,” in which leaders have difficulty making tough decisions. This correlation of emotional weakness with societal excess paved the way for Rigney to frame empathy itself as a dangerous weapon.

    Despite using the incendiary generalization “empathy is sin,” Rigney told me that it is not all empathy that is sinful, but specifically “untethered empathy.” He describes this as “empathy that is detached or unmoored from reality, from what is good and right.” (An explanation that raises the question of how “reality,” “good,” and “right” are defined.)

    “Just as ‘the sin of anger’ refers to unrighteous or ungoverned anger, so the sin of empathy refers to ungoverned, excessive, and untethered empathy,” Rigney told me. This kind of unrestrained empathy, he writes, is a recipe for cultural mayhem.

    In theory, Rigney argues that one should be “tethered” to God’s will and not to Satan. In practice, what Rigney is typically decrying is any empathy for a liberal perspective or for someone who’s part of a marginalized community. When I asked him for a general reconciliation of his views with the Golden Rule, he sent me a response in which he brought up trans identity in order to label it a “fantasy” that contradicts “God-given biological reality,” while misgendering a hypothetical trans person.

    The demonization of empathy moves into the mainstream

    Despite receiving firm pushback from most religious leaders (and indeed most people) who hear about it, Rigney’s argument has been spreading through the Christian right at large. Last year, the conservative personality and author Allie Stuckey published Toxic Empathy: How Progressives Exploit Christian Compassion, in which she argues that “toxic empathy is a dangerous guide for our decisions, behavior, and public policy” while condemning queer people and feminists.
    “Empathy almost needs to be struck from the Christian vocabulary,” Josh McPherson, host of the Christian-centered Stronger Man Nation podcast and an adherent of Wilson and Rigney’s ideas, said in January, in a clip that garnered an outsize amount of attention relative to the podcast episode itself.

    That same month, Vice President JD Vance struck a nerve with a controversial appearance on Fox News in which he seemed to reference both the empathy conversation and the archaic Catholic concept of “ordo amoris,” meaning “the order of love.” As Vance put it, it’s the idea that one’s family should come before anyone else: “You love your family, and then you love your neighbor, and then you love your community, and then you love your fellow citizens in your own country,” he said. “And then after that, you can focus and prioritize the rest of the world.” In a follow-up on X, he posted, “the idea that there isn’t a hierarchy of obligations violates basic common sense.” Vance’s statements received backlash from many people, including both the late Pope Francis and the then-future Pope Leo XIV — but the controversy just drove the idea further into the mainstream.

    As part of the odd crossover between far-right religion and online reactionaries, the idea picked up surprising allies along the way, including the evolutionary biologist turned far-right gadfly Gad Saad. In January, Saad, applying a survival-of-the-fittest approach to our emotions, argued against “suicidal empathy,” which he described as “the inability to implement optimal decisions when our emotional system is tricked into an orgiastic hyperactive form of empathy, deployed on the wrong targets.” (Who are the wrong targets, according to Saad? Trans women and immigrants.)

    In a February appearance on Joe Rogan’s podcast, Elon Musk explicitly referenced Saad but went even further, stating, “The fundamental weakness of Western civilization is empathy — the empathy exploit. They’re exploiting a bug in Western civilization” — the “they” here being the left wing. “I think empathy is good,” Musk added, “but you need to think it through, and not just be programmed like a robot.”

    By March, the mainstream media had noticed the conversation. David French observed the “strange spectacle” of the Christian turn against empathy in a column for the New York Times. In April, a deep dive in the Guardian followed. That same month, a broad-ranging conversation in the New Yorker with Albert Mohler, president of the Southern Baptist Theological Seminary, led to interviewer Isaac Chotiner pressing him on why empathy is bad. A discussion of deported Venezuelan immigrants wrongfully suspected of having gang tattoos led to Mohler saying that “there’s no reason anyone other than a gang member should have that tattoo.” (Among the tattoos wrongly flagged as gang symbols were the words “Mom” and “Dad” on the wrists of one detainee.)

    The pro-empathy backlash is fierce

    The connective tissue across all these disparate anti-empathy voices is twofold, according to the Christian scholar Karen Swallow Prior.
    Prior, an anti-abortion ethicist and former longtime Liberty University professor, singled out the argument’s outsize emphasis on attacking very small, very vulnerable groups — as well as the moment in which it’s all happening. “The entire discourse around empathy is backlash against those who are questioning the authority of those in power,” she told me, “not coincidentally emerging in a period where we have a rise in recognition of overly controlling and narcissistic leaders, both in and outside the church.” Those people “understand and appreciate empathy the least.”

    “Trump made it okay to not be okay with culture,” Peter Bell, co-creator and producer of the Sons of Patriarchy podcast, which explores longstanding allegations of emotional and sexual abuse against Doug Wilson’s Christ Church, told me. (Wilson has denied that the church has a culture of abuse or coercion.) “He made it kind of cool for Christians to be jerks,” Bell said. “He made the unspoken things spoken, the whispered things shouted out loud.”

    Prior believes that the argument won’t have a long shelf life because Rigney’s idea is so convoluted. Yet she added that it’s born out of toxic masculinity, in an age when stoicism, traditionally male-coded, is increasingly part of the regular cultural diet of men via figures like Jordan Peterson. That hypermasculinity goes hand in hand with evangelical culture, and with the ideas Rigney borrowed from Friedman about too many emotions being a weakness. In this framing, emotion becomes non-masculine by default — that is, feminine.

    That leads us to the grimmest part of Rigney’s “untethered empathy” claims: the way he explicitly genders it — and demonizes it — as feminine. Throughout his book, he argues that women are more empathetic than men and that, as a result, they are more prone to giving in to it as a sin. It’s an inherently misogynistic view that undermines women’s decision-making and leadership abilities. Though Rigney pushed back against this characterization in an email to me, arguing that critics have distorted what he views as merely “gendered tendencies and susceptibility to particular temptations,” he also couldn’t help reinforcing it. “[F]emale tendencies, like male tendencies, have particular dangers, temptations, and weaknesses,” he wrote. Women thus should recognize this and “take deliberate, Spirit-wrought action to resist the impulse to become a devouring HR department that wants to run the world.”

    As Prior explains, though, Rigney is just fine with a mythic national human resources department, as long as it supports the status quo. “Everybody’s supposed to have sympathy for the white male,” she said, “but when you show empathy to anyone else, suddenly empathy is a sin.”

    What’s heartening is that, whether or not they realize what kind of dangerous extremism undergirds it, most people aren’t buying Rigney’s “empathy is sin” claim. Across the nation, in response to Rigney’s assertion, the catchphrase “If empathy is a sin, then sin boldly” has arisen, heard in pulpits, seen on church marquees, and worn on T-shirts — a reminder that it takes much more than the semantic whims of a few extremists to shake something most people hold in their hearts.
  • With Letter to Trump, Evangelical Leaders Join the AI Debate

    time.com
    Two Evangelical Christian leaders sent an open letter to President Trump on Wednesday, warning of the dangers of out-of-control artificial intelligence and of automating human labor.

    The letter comes just weeks after the new Pope, Leo XIV, declared he was concerned with the “defense of human dignity, justice and labor” amid what he described as the “new industrial revolution” spurred by advances in AI.

    “As people of faith, we believe we should rapidly develop powerful AI tools that help cure diseases and solve practical problems, but not autonomous smarter-than-human machines that nobody knows how to control,” reads the open letter, signed by the Reverends Johnnie Moore and Samuel Rodriguez. “The world is grappling with a new reality because of the pace of the development of this technology, which represents an opportunity of great promise but also of potential peril especially as we approach artificial general intelligence.”

    Rodriguez, the president of the National Hispanic Christian Leadership Conference, spoke at Trump’s first presidential inauguration in 2017. Moore, who is also the founder of the public relations firm Kairos, served on Trump’s Evangelical executive board during his first presidential candidacy.

    The letter is a sign of growing ties between religious and AI safety groups, which share some of the same worries. It was shared with journalists by representatives of the Future of Life Institute—an AI safety organization that campaigns to reduce what it sees as the existential risk posed by advanced AI systems.

    The world’s biggest tech companies now all believe that it is possible to create so-called “artificial general intelligence”—a form of AI that can do any task better than a human expert. Some researchers have even invoked this technology in religious terms—for example, OpenAI’s former chief scientist Ilya Sutskever, a mystical figure who famously encouraged colleagues to chant “feel the AGI” at company gatherings. The emerging possibility of AGI presents, in one sense, a profound challenge to many theologies. If we are in a universe where a God-like machine is possible, what space does that leave for God himself?

    “The spiritual implications of creating intelligence that may one day surpass human capabilities raises profound theological and ethical questions that must be thoughtfully considered with wisdom,” the two Reverends wrote in their open letter to President Trump. “Virtually all religious traditions warn against a world where work is no longer necessary or where human beings can live their lives without any guardrails.”

    Though couched in adulatory language, the letter presents a vision of AI governance that differs from Trump’s current approach. The president has embraced the framing of the U.S. as being in a race with China to get to AGI first, and his AI czar, David Sacks, has warned that regulating the technology would threaten the U.S.’s position in that race. The White House AI team is stacked with advisors who take a dismissive view of alignment risks—or the idea that a smarter-than-human AI might be hostile to humans, escape their control, and cause some kind of catastrophe.

    “We believe you are the world’s leader now by Divine Providence to also guide AI,” the letter says, addressing Trump, before urging him to consider convening an ethical council to consider not only “what AI can do but also what it should do.”

    “To be clear: we are not encouraging the United States, and our friends, to do anything but win the AI race,” the letter says. “There is no alternative. We must win. However, we are advising that this victory simply must not be a victory at any cost.”

    The letter echoes some themes that have increasingly been explored inside the Vatican, not just by Pope Leo XIV but also by his predecessor, Pope Francis. Last year, in remarks at an event held at the Vatican about AI, Francis argued that AI must be used to improve, not degrade, human dignity. “Does it serve to satisfy the needs of humanity, to improve the well-being and integral development of people?” he asked. Or does it “serve to enrich and increase the already high power of the few technological giants despite the dangers to humanity?”

    To some Catholic theologians, AGI is simply the newest incarnation of a long-standing threat to the Church: false idols. “The presumption of substituting God for an artifact of human making is idolatry, a practice Scripture explicitly warns against,” reads a lengthy missive on AI published by the Vatican in January. “AI may prove even more seductive than traditional idols for, unlike idols that ‘have mouths but do not speak; eyes, but do not see; ears, but do not hear’, AI can ‘speak,’ or at least gives the illusion of doing so. Yet, it is vital to remember that AI is but a pale reflection of humanity—it is crafted by human minds, trained on human-generated material, responsive to human input, and sustained through human labor.”