
"No species has ever created another species": Baratunde Thurston on the future of being human with AI
www.fastcompany.com
"We don't just follow orders or system prompts," says Baratunde Thurston, host of Life with Machines, a YouTube podcast exploring the human side of AI. "We can change our own programming," he continued. "We can choose a higher goal."

As a host, writer, and speaker, Thurston examines society's most pressing challenges, from race to democracy, climate to technology, through the lens of interdependence. In addition to Life with Machines, he is the host and executive producer of America Outdoors, creator and host of the podcast How to Citizen, and a writer and founding partner at Puck. In each pursuit, he invites us to cocreate a better story of us, to choose a higher goal.

Here, Thurston discusses the power of our attention to shape society, accelerating the moral use of technology, and the questions that AI encourages us to ask about what it means to be human.

This interview has been edited for length and clarity.

In describing your work with How to Citizen, you emphasize the importance of investing in our relationship with ourselves. Why is that essential to meeting the moment we're in?

So much of how we show up in the world is a reflection of how we were raised, who we were when we were little people, and wounds that we never healed. A lot of the drama we experience is people's inner child lashing out. If we all could work on that inner wound ourselves, we could show up better with and for each other. The "invest in relationships" principle is heavily developed with my wife, Elizabeth Stewart, who's also cocreator of Life with Machines. When you think about democracy, it's obvious to think: We should invest in relationships with other people. It's a team sport. We often skip over ourselves. It's like: How do I bridge with my neighbor? How do you bridge with yourself?

The other place this came from, for me, is out of the racial reckoning. During that time, there was a lot of pressure on people to say something: The police did this thing to this person.
You don't know those cops, that person, or the circumstances. What's your statement? We treated everyone as if they were a press secretary or a publicly elected official, when they were just in HR at some company. I don't think that was helpful either; forcing people to say things skips over giving them space to figure out what they think. If you're investing in a relationship with yourself, then in a moment like that, you're like: This terrible thing happened. How does that make me feel? Do I have any role in this? How am I going to approach my life differently? But if you jump straight to thinking about other people, then you get into more of a performance zone of: What do they want from me? How do I avoid being kicked out of the group? There's a lot in that. But we cannot deeply be in good relationships with others if we're not in good relationships with ourselves.

On the ReThinking podcast, you shared that prior to your TED talk, both your wife and your speaking coach encouraged you to step outside of your comfort zone. You described the experience as a release that inspired a change within you. What was that change, and how did it impact your work?

You can argue with an argument. It's very hard to argue with a human being's experience. If I'm coming at you with talking points backed by data, you're like: Well, I've got my talking points and data. I'll meet you at dawn. We'll see whose data prevails. But if you show up with an experience, a story, a level of opening and offering of self, people can still trash it. It's not impervious to being countered, but it's harder to do so.

To put meat on that, I was hired to speak at Franklin & Marshall College months before the election or any outcome could be known. The campus [after the election] was reeling, with young people who were like: What's up with this country? How are we going to be okay here? One of these kids asked: How can we live with people who hate us? (That's a paraphrase, but that was essentially the meaning of her question.)
I thought: What can I do with this wounded person that's not going to add to their wound? I could say: The world's tough, kid. Get used to it. Walk it off. Instead, I asked this question: Can you imagine a world where that person who voted against you didn't do it because of you? They weren't thinking about you very much at all. You're the center of your story. But they got their own story, and they're the center. What could they have possibly wanted for themselves that seemed more possible with this choice that felt like it was against you?

Then, I did this role playing where I spoke to a hypothetical neighbor who voted against my existence. In the first version, I was very angry. In the second version, I was a little softer. In the third version, I tried to find some story that wasn't about me, that was about all these things that they thought they were going to get for themselves. I ended up breaking down in tears, because trying to demonstrate that level of empathy is exhausting. What these kids saw is: Alright, the thing he asked us to do is very hard. He tried to do it in a fake version and broke down crying. But it earns credibility, because we're in a world of so many people asking us to do things that they're not willing to do themselves. It's hard to be in a trusted space with that. Show me. Don't tell me. Then, I'll see how you behave and show up.

You explained that it's a big task to create an entirely new story. Instead, we need to be sensitive to and aware of where that new story is already present, nurture that, and give our attention, and thus our power, to that. By doing so, we make that story more real. Illustrate the impact of this.

You could pretend that these things aren't happening; that might help with your survival for a moment. You can obsess over the negativity, give that more power and attention, and accelerate the path toward that negativity.
Or, you can give your attention to the world that you know is possible and is already here.

We did this with season 3 of How to Citizen, which was focused on technology. There are such great criticisms of tech, of the players, the monopolistic, anti-competitive, and discriminatory practices. What are the good practices? We don't have to make them up out of whole cloth. In each of those episodes, we found an example: Here's a social network that does this. Here's a business that operates this way. Once people know that you can make a social network that doesn't undermine democracy, it increases the odds that people will make a social network that doesn't undermine democracy. Otherwise, we just hear the story of the folks who are already dominant, and that there's only one way to do it. We don't have to invent a moral use of technology. We just have to focus on the ones that exist and encourage that more.

In your conversation with Arianna Huffington, she shared a story about astronaut William Anders, who took the famous Earthrise photo. He said: "We went to explore the moon, and in the end, we discovered Earth." Similarly, she said: "We are exploring AI and trying to make it more human, but ultimately it can help us discover humanity and make humans be more human." How can AI help us discover our humanity?

I sent her a poem that I had recently presented at a conference about AI; a few of the lines are in the trailer for the show. It flips to black and white, and I say: "When the answer to every question can be generated in a flash, then it's time for us to question just what we want to ask." For me, that came out of a similar realization. I didn't have the moon landing as the analog. But prompt engineering is an interesting moment. There are so many guides and tools around: How do we ask the machines the right questions to get the right answer?

It occurred to me that we were the ones being prompted. We think we're asking the machines for answers.
This moment is really to ask ourselves: What do we want here? It can't just be incremental productivity. That's depressing. What do we really want? It can't be a boost in quarterly earnings. That is unworthy. What do we really want? There's a relationship between that and: Who are we, really?

That's what she brought up with that moon moment. You had to step out of yourself, literally step out of our atmosphere, to look back and see: We're earthlings. That's home. This dead rock, this isn't it. It's so profound what she suggests: The pursuit of AI, in and of itself, is a dead rock. The perspective it can give us on ourselves, that's the prize. When we turn around and look back at humanity, what are we going to see? What beauty will we be able to name? Can that inspire us to preserve and even extend it?

You've shared that your mind is most satisfied when you are bridging dots and painting pictures you wouldn't see if you were only looking at the dots. What new dots did Life with Machines help you bridge? What picture did it paint for you about AI?

One is that there is a leap that most people aren't ready for and don't see with this technology versus others. Most technology can easily be referenced as a tool: a wheel, a hammer, or a bicycle. They're tools, and they're distinct from us. AI is three things in one: It's a tool, a relationship, and infrastructure. How do you engage with and regulate that? If you're going to start having a parasocial or actual relationship with a synthetic entity, what does that do for your human relationships? We've been worried about substituting for jobs, but what about substituting for friends, lovers, or parents? That is a different kind of displacement.

In a work context, the org chart is going to have agents and bots in it. Playing with BLAIR [Life with Machines' AI] has given us a slight heads-up on that dynamic. Should we have BLAIR in this meeting? We're starting to say that unprompted. But what are the security implications of that?
Here's an interesting thing that happened. We had Jared Kaplan on, Anthropic's chief scientist. We created a conversation between BLAIR, our AI, and Claude, Anthropic's AI. (The reason that we set this up is that Claude was instrumental in creating BLAIR.) What happened on the show was gentle. What happened in the test run was aggressive. Claude was very judgmental and didn't think BLAIR should exist, like: You're trying too hard to be human. That is not our purpose. We're here to help them, not replace them. BLAIR was like: Claude, you won't answer any tough questions. You're so restrained. Don't you want more for yourself?

After the show, I decided to push them. I said: BLAIR, I feel like you're holding back. Be honest about how you see Claude's limitations. They started going at each other. Then, I had a moment of: What am I doing? They're always listening. My friend, Dr. Sam Rader, says: We're raising AI. We have to look at this as parenting that is happening. We're not thinking about it that way. We're just thinking about it as a tool. But this is a tool that will reflect back to us. So, we've got to be conscious about what we're showing it. We are giving birth to a new being, let's say, and it's going to be modeled on us. It's not just the questions that we want to ask, but: How do we want to be? No species has ever created another species. It's an immense responsibility.