5 Ways to Tame AI's Energy Consumption
- Susan Guest
- Jul 15
- 2 min read
Updated: Oct 13

The IEA's special report Energy and AI, released on April 10, 2025, offers the most comprehensive, data-driven global analysis to date of the growing connections between energy and AI. It projects that electricity demand from data centers worldwide will more than double by 2030 to around 945 terawatt-hours (TWh), slightly more than the entire electricity consumption of Japan today. AI will be the most significant driver of this increase, with electricity demand from AI-optimized data centers projected to more than quadruple by 2030.
While corporate and governmental initiatives must address the bulk of AI's environmental impact, small decisions at the individual level, scaled across millions of users, can also help reduce energy demand. Here are five changes you can make in how you use AI to help shrink its environmental footprint.
Use the Right Tool for the Job. Before reaching for a powerful generative AI, ask yourself whether a simpler, less energy-hungry tool can get the work done. For general fact-finding, a routine search engine query is far less energy-intensive. Save generative AI for the work it does best, such as summarizing long documents, suggesting ideas, generating content, and handling highly specific research and analysis, rather than tasks a search engine can perform.
Choose Text-Based Activities Over Generating Images. Producing an image with AI takes far more energy than producing text. If you need an image, consider using a stock photo or one already available online before asking an AI to generate a new one from scratch. This choice alone can significantly lower the energy consumption of your work.
Adjust Your Prompts for Shorter Responses. One of the primary drivers of energy consumption when working with AI is not how sophisticated your prompt is, but how long the response is. Longer responses require more computation, and more computation means more energy. You can mitigate this by being specific in your request: instead of asking for a general summary, frame your prompt to ask for a short summary, a bulleted response, or a set number of paragraphs.
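If you reach a model through an API rather than a chat window, the same idea can be applied in code. Below is a minimal sketch, assuming the OpenAI Python client and an illustrative model name; the key point is that a bounded request ("three bullet points") plus an explicit output cap (max_tokens) limits how much text, and therefore how much computation, the model produces.

```python
# A minimal sketch of asking for a shorter response via the API.
# Assumes the OpenAI Python client (openai>=1.0) with an API key in the
# OPENAI_API_KEY environment variable; the model name is illustrative.
from openai import OpenAI

client = OpenAI()

notes = "(paste the text you want summarized here)"

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[
        {
            "role": "user",
            # Asking for a bounded format ("three bullet points") instead of
            # an open-ended summary keeps the generated text short.
            "content": f"Summarize the following notes in three bullet points:\n{notes}",
        }
    ],
    max_tokens=150,  # hard cap on output length: fewer tokens generated, less compute used
)

print(response.choices[0].message.content)
```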
Select Smaller, More Efficient Models When Possible. Not all AI models are equally demanding; larger, more complex models take more power to run. Many AI systems offer multiple versions of their models (e.g., a "light" or "nano" version vs. a "pro" or "ultra" version). For tasks that do not need maximum reasoning power, selecting a smaller, less power-hungry model lowers energy consumption considerably while still producing a strong result.
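For scripted use, model choice can be a one-line decision. The sketch below, again assuming the OpenAI Python client and using illustrative model names, defaults to a lighter model and reserves the larger one for work that genuinely needs deep reasoning.

```python
# A minimal sketch of defaulting to a smaller model.
# Assumes the OpenAI Python client; both model names are illustrative.
from openai import OpenAI

client = OpenAI()

LIGHT_MODEL = "gpt-4o-mini"  # smaller, less power-hungry
HEAVY_MODEL = "gpt-4o"       # reserved for demanding tasks

def ask(prompt: str, needs_deep_reasoning: bool = False) -> str:
    """Send a prompt, picking the smaller model unless deep reasoning is requested."""
    model = HEAVY_MODEL if needs_deep_reasoning else LIGHT_MODEL
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        max_tokens=300,  # keep responses bounded regardless of model
    )
    return response.choices[0].message.content

# Routine task: the light model is more than enough.
print(ask("Rewrite this sentence in plain English: 'Utilize the aforementioned apparatus.'"))
```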
Advocate for Transparency and Change. While personal habits are helpful, their greatest power is in signaling a demand for systemic change. You can contribute to this larger effort by:
- supporting companies that disclose their environmental impact and invest in reducing their energy consumption and emissions
- using your voice as a citizen to support policies that mandate clear, standardized reporting on the energy use, water consumption, and carbon emissions of AI services, so that all users can make informed, environmentally conscious choices.
This article was written by Susan Guest in collaboration with ChatGPT.
