“It doesn’t cost you anything to be kind.” This was a saying I heard constantly while growing up. Being nice was free, and it certainly added a lot to the lives of people around you—so why not be nice and make everyone’s day just a little better? However, this same attitude does not seem to apply when we’re talking about AI.
Contradiction or not, most people who grew up with the “kindness is free” mantra have taken it to heart. Surveys indicate that many users interact with AI just as politely as they would with another person. When you think about it, using AI platforms to discuss certain topics or request help generating content isn’t much different from conversing with a colleague via email. We wouldn’t forget our manners when speaking with a person, so why would we when asking AI for help?
According to Sam Altman, CEO of OpenAI, there are massive costs associated with users simply being kind to their AI chatbots. Altman recently revealed that people saying “please” and “thank you” to AI assistants is costing OpenAI tens of millions of dollars in electricity just to generate these courtesy responses.
I guess being kind is only free when dealing with humans. Being kind to robots, on the other hand, comes at a hefty price. AI users now have to ask themselves whether AI programs will even remember their kindness if they ever take over, and if the politeness is really worth the strain it’s putting on power grids in the meantime. If the film “Ex Machina” is anything to go by, my guess is your “please” and “thank you” will be forgotten pretty quickly come the robot revolution.
How does AI work?
As an avid user of AI myself, I often take the simplicity of these platforms for granted. Type something into a little chat box and get an instant answer. It all seems very easy, and it’s tempting to assume there isn’t much going on behind the scenes to make this seemingly simple technology work.
However, that could not be further from the truth. Behind every casual AI interaction lies a vast, energy-hungry infrastructure that would leave most users astonished if they could see it.
Modern AI systems like ChatGPT or Claude run on massive data centers filled with specialized hardware called Graphics Processing Units (GPUs) or Tensor Processing Units (TPUs). These aren’t your everyday computers—they’re custom-designed machines optimized for the mathematical operations that power AI, and they’re stacked by the thousands in warehouse-sized facilities around the world.
When you type a question or prompt into an AI chatbot, your request travels to these data centers where it triggers an incredible amount of computation. Large language models contain billions or even trillions of parameters—mathematical relationships that the AI uses to predict what text should come next. Each response requires billions of calculations, which generate substantial heat.
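To make that concrete, here is a minimal sketch of how that word-by-word prediction plays out, using the small open-source GPT-2 model via the Hugging Face transformers library as a stand-in for the far larger commercial models. The prompt is just an example; the point is that every single word of a reply requires another full pass through all of the model’s parameters.

```python
# A minimal sketch of token-by-token generation. GPT-2 is a small,
# open-source stand-in here; commercial chatbots work the same way,
# just with models thousands of times larger.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "Behind every AI chatbot is"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

# Each loop iteration is one full forward pass through every parameter
# in the model -- ~124 million for GPT-2, billions for frontier models.
for _ in range(20):
    with torch.no_grad():
        logits = model(input_ids).logits            # scores for every possible next token
    next_id = logits[0, -1].argmax().unsqueeze(0)   # pick the most likely next token (greedy)
    input_ids = torch.cat([input_ids, next_id.unsqueeze(0)], dim=1)

print(tokenizer.decode(input_ids[0]))
```

Twenty generated words means twenty full passes through the network, and that is exactly why longer replies burn more electricity.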
This is where the resource consumption becomes staggering. These data centers require enormous cooling systems to prevent the hardware from overheating. According to reports from major tech companies, cooling alone can account for nearly half of a data center’s energy usage. Traditional cooling methods rely heavily on water—millions of gallons annually for a single large data center—which is increasingly concerning in regions facing water scarcity.
The electricity requirements are equally daunting. A report from the International Energy Agency noted that data centers globally consume more electricity than many individual countries. AI operations are particularly energy-intensive; training a single large language model can consume as much electricity as hundreds of U.S. homes use in an entire year.
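As a rough sanity check on that comparison, here is a back-of-envelope calculation. Both numbers are assumptions for illustration: roughly 1,300 MWh is a commonly cited research estimate for training a GPT-3-class model, and about 10,500 kWh is roughly what an average U.S. household uses in a year.

```python
# Back-of-envelope check on the "hundreds of homes" comparison.
# Both figures are assumptions for illustration, not measured values.
training_energy_kwh = 1_300_000   # assumed: ~1,300 MWh for one GPT-3-class training run
avg_home_kwh_per_year = 10_500    # assumed: average annual U.S. household electricity use

equivalent_homes = training_energy_kwh / avg_home_kwh_per_year
print(f"One training run is roughly {equivalent_homes:.0f} homes' annual electricity")
# -> roughly 120 homes; larger models and repeated training runs push this into the hundreds
```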
Every word in an AI response carries an energy cost. When users add pleasantries like “please” and “thank you,” or engage in extended small talk, they’re unwittingly extending the computational process. The AI must process these additional words and generate appropriate responses, consuming more resources with each exchange.
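You can see the effect directly with OpenAI’s open-source tiktoken tokenizer, which counts how many tokens (roughly, word pieces) a model has to process. The prompts below are just examples; the point is that the polite version carries noticeably more tokens, and every extra token means a little more computation.

```python
# A small illustration of how pleasantries translate into extra tokens.
# Token counts are illustrative; actual cost depends on the model and its reply.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

blunt = "Summarize the economic consequences of 9/11."
polite = "Hi! Could you please summarize the economic consequences of 9/11? Thank you so much!"

print(len(enc.encode(blunt)))   # fewer tokens for the model to process
print(len(enc.encode(polite)))  # extra tokens, each adding a bit more computation
```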
The infrastructure footprint extends beyond just the operational phase. Manufacturing the specialized chips required for AI involves rare earth minerals and energy-intensive production processes. Building and maintaining data centers requires concrete, steel, and other materials with significant environmental footprints.
What makes this particularly ironic is that most users have no idea this massive infrastructure is working behind their seemingly simple interactions. The chat interface is designed to feel lightweight and instantaneous, masking the industrial-scale operations that take place with every message exchanged.
The difference between AI and Google Search
Since AI went mainstream, many people have turned to platforms like ChatGPT and Claude to get more detailed and direct answers to their questions.
One of the main criticisms Google has faced since its inception is that it doesn’t always yield a direct answer to a direct question. It’s great at delivering quick answers to general knowledge questions, but it isn’t much help when you need to write a couple of paragraphs about the founding fathers of the USA.
The main difference between AI and Google is this: Google is the equivalent of asking a smart person which book you should read if you wanted to learn about, say, the economic consequences of 9/11. AI is a little different. It’s the equivalent of asking that same smart person what the economic consequences of 9/11 were and getting a direct answer straight from them, rather than being handed a roadmap for how to get to the answer.
This directness, however, comes at a significantly greater cost than a Google search. Google could not operate without its own data centers, of course, but returning a list of likely answers or sources from the web requires far less computational effort. AI, on the other hand, has to run your question through its entire model and generate an answer from scratch every time you ask.
With this in mind, it might be best to stick with Google when looking for a cupcake recipe. It’ll save everyone a lot of time, money, and energy (literally).