Using AI Responsibly: 6 Ways to Make Your AI Usage More Sustainable
The idea that something as simple as saying “please” and “thank you” to ChatGPT could be wasting millions of dollars’ worth of computing power recently made headlines. But for us at Indeed Innovation, this isn’t a new conversation.
We’ve been exploring the intersection of AI and sustainability for some time now—advocating for more conscious, efficient, and responsible use of machine intelligence. And with AI adoption accelerating across industries, that conversation has never felt more urgent, or more practical.
Artificial intelligence now shapes how we communicate, code, analyze data, and even think. But behind every prompt and prediction lies an energy-hungry infrastructure: massive cloud servers, cooling systems, and high-volume data transfers.
And it’s growing—fast.
To align AI innovation with sustainability, it’s time we make smarter choices.
Local AI vs Cloud AI: A Sustainability Perspective
Cloud-based AI, particularly large language models, demands substantial computational power, both during training and for every query after the system is deployed. Training a model like GPT-3, for instance, consumed as much energy as 128 U.S. households use in a year. And that is not the end of it: operating the data centers requires more than electricity. Each interaction with these models can also require water for cooling, up to 925 milliliters per query in some cases.
What is the alternative? Local AI, running either on a company’s own servers or on nearby devices, can offer a more energy-efficient option. By processing data close to the source, it reduces the need for energy-intensive data transfers and large-scale cooling infrastructure. It also brings added benefits in privacy and performance, especially when sensitive data should remain on-site.
A ChatGPT query uses ten times more energy than a standard Google query.
David Porter, Vice President at the Electric Power Research Institute
Anyone who uses software will notice that AI is increasingly being offered as a feature: in web browsers, in Acrobat when reading PDFs, and in standard office software, for search, analysis, translation, and as a sparring partner. To use it responsibly, however, it is worth asking: do I really need AI for this task, or is a search function sufficient?
So, what can we do to use AI as sustainably as possible? Based on our experience in developing AI assistants for internal work processes and for our clients, we have compiled the following six recommendations.
6 Practical Ways to Reduce AI’s Carbon Footprint
1. Use Local AI When Possible
Run LLMs locally to get outputs like the kinds you see from ChatGPT, especially if you’re processing sensitive data or using AI frequently. This reduces energy-hungry data transfers and cooling demands in cloud data centers.
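If you want to try this, one low-effort route is a local runtime such as Ollama. The snippet below is a minimal sketch, assuming Ollama is installed, serving on its default port, and already has a small model pulled; swap in whichever model you actually use.

```python
# Minimal sketch: querying a locally hosted model through Ollama's HTTP API.
# Assumes Ollama is running on its default port (11434) and a small model
# such as "llama3.2" has already been pulled with `ollama pull`.
import requests

def ask_local_llm(prompt: str, model: str = "llama3.2") -> str:
    """Send a prompt to the local Ollama server and return the full response."""
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

print(ask_local_llm("Summarize this note: the kickoff meeting moved to Thursday."))
```

Because the request never leaves your machine or network, there is no round trip to a hyperscale data center, and sensitive data stays on-site.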
2. Match the Model to the Task
Not every prompt needs a massive model. For everyday tasks (e.g., writing a short email), use smaller, efficient models that can do the job with less energy. Think of it like choosing a bike instead of a truck for a 2-minute errand.
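In practice this can be as simple as a routing rule. The sketch below is illustrative only; the model names are placeholders for whatever small and large models your own stack provides.

```python
# Illustrative sketch of "right-sizing" the model per task.
# The model identifiers are placeholders, not real model names.
SMALL_MODEL = "small-efficient-model"   # e.g. a compact local model for routine text
LARGE_MODEL = "large-cloud-model"       # reserved for genuinely complex work

ROUTINE_TASKS = {"short_email", "summary", "translation", "rewording"}

def pick_model(task_type: str) -> str:
    """Route everyday tasks to the small model; escalate only when needed."""
    return SMALL_MODEL if task_type in ROUTINE_TASKS else LARGE_MODEL

print(pick_model("short_email"))      # -> small-efficient-model
print(pick_model("contract_review"))  # -> large-cloud-model
```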
3. Reduce Query Frequency and Batch Tasks
Group related AI tasks together to reduce total processing time. Each query sent to the cloud consumes energy; batching reduces round trips and increases efficiency. Think of baking one baguette at a time versus baking many of them in parallel overnight, when electricity is cheap. Batch processing with AI could mean collecting bills and analyzing 20 of them in a single task, and it is available, for example, when using OpenAI’s API instead of ChatGPT (see the sketch below).
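As a sketch of what this looks like in code, the example below sends 20 bills in one request instead of 20 separate round trips. It assumes the official openai Python package (version 1.x) with an API key set in the environment; the model name is just an example.

```python
# Minimal batching sketch: analyze 20 bills in a single request.
# Assumes the openai 1.x package and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

# In practice, load the raw text of each bill; these are placeholders.
bills = [f"Bill {i}: <raw text of bill {i}>" for i in range(1, 21)]

prompt = (
    "For each bill below, extract the vendor, date, and total amount:\n\n"
    + "\n\n".join(bills)
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model; a smaller model is often enough here
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```

For larger workloads, OpenAI also offers a dedicated Batch API that processes uploaded jobs asynchronously within a set completion window, which maps nicely onto the “bake overnight” idea.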
4. Audit and Optimize AI Usage
Track how, when, and why AI tools are being used. Are there workflows where traditional automation (scripts, rules engines) would be more efficient? Not everything needs AI.
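A lightweight way to start is simply logging every model call. The sketch below writes one CSV row per call; it is an assumption about how you might structure such a log, not a prescribed tool.

```python
# Minimal usage-audit sketch: append one row per AI call to a CSV file,
# so teams can later review where AI is used and where a script would do.
import csv
import time
from pathlib import Path

LOG_FILE = Path("ai_usage_log.csv")

def log_ai_call(team: str, purpose: str, model: str, prompt: str) -> None:
    """Record who called which model, for what, and how large the prompt was."""
    is_new = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["timestamp", "team", "purpose", "model", "prompt_chars"])
        writer.writerow([int(time.time()), team, purpose, model, len(prompt)])

# Example: call this alongside every model request.
log_ai_call("finance", "invoice summary", "local-llm", "Summarize bill 42 ...")
```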
5. Encourage Responsible AI Habits in Teams
Educate teams on sustainable AI practices. Promote awareness of AI’s environmental impact just like you would with recycling or power-saving.
6. Align AI Strategy with Sustainability Goals
If you’re already tracking Scope 3 emissions or have circularity goals, include AI infrastructure in those assessments. Opt for cloud providers with renewable energy commitments, or better yet, deploy AI where you have local green energy available.
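As a starting point, AI usage can be folded into existing carbon accounting with a simple estimate: number of queries, times energy per query, times the carbon intensity of the electricity used. The sketch below deliberately leaves both factors as inputs; the example values are placeholders, not measured figures, and should come from your provider and your grid data (see the EPA equivalencies calculator in the sources).

```python
# Back-of-envelope sketch for including AI usage in emissions reporting.
# Both factors are inputs on purpose: take energy-per-query from your provider
# or published estimates, and grid carbon intensity from your electricity supplier.
def ai_emissions_kg_co2e(queries: int,
                         kwh_per_query: float,
                         grid_kg_co2e_per_kwh: float) -> float:
    """Estimate emissions as queries x energy per query x grid carbon intensity."""
    return queries * kwh_per_query * grid_kg_co2e_per_kwh

# Example with placeholder inputs only; replace with your own measured figures.
monthly_queries = 10_000
print(ai_emissions_kg_co2e(monthly_queries,
                           kwh_per_query=0.003,
                           grid_kg_co2e_per_kwh=0.4))
```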
Closing Words
Optimizing your AI usage doesn’t mean abandoning innovation. It means asking better questions about how, when, and why you use these tools and making intentional decisions that balance performance with impact.
We’re continuing to explore this space: not just where AI is going, but how to build with it responsibly.
Sources:
https://www.rgare.com/knowledge-center/article/5-ways-to-reduce-genai-s-carbon-footprint
https://www.knapsack.ai/blog/local-ai-vs-cloud-ai-which-is-more-sustainable/
https://www.eweek.com/artificial-intelligence/ai-energy-consumption/
https://github.com/mlco2/impact/blob/master/data/impact.csv
https://www.epa.gov/energy/greenhouse-gas-equivalencies-calculator
Jeongwoo Jang