MHTNTIMES.COM
Sam Altman Admits That Saying 'Please' and 'Thank You' to ChatGPT Is Wasting Millions of Dollars in Computing Power
If chivalry isn't quite dead yet, it's definitely on life support. OpenAI CEO and tech mogul Sam Altman recently acknowledged that people adding "please" and "thank you" to their AI prompts is burning a hole in his pocket. Responding to a post on X — formerly known as Twitter — where someone pondered "how much money OpenAI has lost in electricity costs from people saying 'please' and 'thank you' to their models," Altman replied that it's "tens of millions of dollars well spent." "You never know," he added.

It might seem silly to be courteous to a chatbot, but some AI developers argue it actually matters. Microsoft's design lead Kurtis Beavers, for instance, believes politeness "helps generate respectful, collaborative outputs." "Using polite language sets a tone for the response," Beavers explains. And there's some logic to it: what we call "artificial intelligence" is really more like a hyper-advanced prediction engine — think predictive text, but supercharged to form full sentences based on context. "When it clocks politeness, it's more likely to be polite back," states a memo from Microsoft's WorkLab. "Generative AI also mirrors the levels of professionalism, clarity, and detail in the prompts you provide."

A late 2024 survey found that 67 percent of Americans said they were courteous when talking to chatbots. Among them, 55 percent said they do it because it "feels like the right thing to do," while 12 percent confessed they're just trying to stay on good terms in case of an AI-led doomsday.

While that kind of robot rebellion still lives in science fiction — and most experts doubt LLMs will ever become truly "intelligent" — the environmental toll of today's AI is very real. And all those "pleases" and "thank yous" are making an impact. A Washington Post investigation, working alongside researchers at the University of California, looked at the energy cost of crafting a 100-word email using AI. The result?

Just one email eats up 0.14 kilowatt-hours of electricity — enough to keep 14 LED bulbs on for an hour. Do that once a week for a year, and you're using 7.5 kWh, roughly the same power consumed in one hour by nine homes in Washington, DC. Now picture the thousands of detailed prompts flooding AI models like ChatGPT every day — it's far from energy-efficient.

AI manners may seem like a small matter, but they highlight a larger issue: every question we ask these models leaves an environmental footprint. The data centers powering today's chatbots already account for about 2 percent of global electricity use — a figure poised to balloon as AI continues to weave into daily life.

So next time you're thinking of thanking Grok for its services, consider skipping the chatbot altogether and writing the message yourself. Your brain — and the planet — will be better off.
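For readers who want to sanity-check those figures, here is a minimal back-of-envelope sketch. It assumes the Washington Post's estimate of 0.14 kWh per 100-word email and a typical 10-watt LED bulb (the bulb wattage is an assumption, not stated in the article):

```python
# Back-of-envelope check of the article's energy figures.
EMAIL_KWH = 0.14       # estimated energy to generate one 100-word AI email
LED_BULB_WATTS = 10    # assumed wattage of a typical LED bulb

# How many such bulbs would 0.14 kWh run for one hour?
bulb_hours = EMAIL_KWH * 1000 / LED_BULB_WATTS  # convert kWh to Wh, divide by bulb wattage
print(round(bulb_hours))       # 14 bulbs for an hour

# One AI-written email per week for a year:
yearly_kwh = EMAIL_KWH * 52
print(round(yearly_kwh, 2))    # 7.28 kWh, which the article rounds up to 7.5
```

The numbers line up: 0.14 kWh is indeed fourteen 10 W bulb-hours, and a weekly email for a year comes to roughly 7.3 kWh, close to the article's 7.5 kWh figure.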