
Say less, go small: How to slash AI’s footprint

By Sukanya Datta
Published on: Sep 19, 2025 04:09 PM IST

Integrating AI into existing tools could raise the electricity demands of these programs tenfold, states the IEA. A host of experiments indicate that there are ways to reduce the environmental impact of AI.

ChatGPT currently processes about 2.5 billion queries a day, up from 1 billion a day in December 2024.

If we don’t slow down, we could end up, like Wall-E (2008), in a wasteland of our making.

Integrating AI into existing tools such as search engines could raise the electricity demands of these programs tenfold, a 2024 report by the International Energy Agency estimates.

All this, in a world already battling a climate crisis and intensifying resource scarcities.

And for what, some researchers are asking.

***

There are various areas of concern. Resource use is one.

“Who gets to benefit from this technology and who gets left out – that’s another,” says Leona Verdadero, programme specialist with Unesco’s Digital Policies and Digital Transformation division. “Consider this. About 32% of the world’s population, or 2.6 billion people, are still offline altogether.”

Meanwhile, the headlong rush in the world of AI is driven by a better-models-at-any-cost approach. A good way to launch into this era would have been to embed resource efficiency from the start, Verdadero says.

Things would have progressed very differently then. The focus would have been on building smaller, task-specific programs, carbon-efficient hardware and a networked interface in which only exceptional, critical queries were submitted to an LLM at large.

The AI models that evolved in such a world would have been less flashy, and their evolution, slower. Resource use would have been far lower too.

***

Is there a way to change direction now?

A host of experiments currently underway indicate that there are ways to significantly reduce the environmental impact of AI.

Researchers at University College London (UCL), for instance, conducted a series of experiments on Meta’s open-source Llama large language model (LLM) to show that shrinking its size, shortening prompts and responses, and deploying smaller, application-specific modules could reduce energy consumption by up to 75%.

Their findings were published as part of the 2025 Unesco report, Smarter, Smaller, Stronger: Resource-Efficient Generative AI & the Future of Digital Transformation.

Whatever the model in use, users can play their part by keeping prompts brief and specific, and by asking for brief answers in return.

“Brief responses could reduce energy use per query by up to 50%,” says Ivana Drobnjak, professor of computational healthcare at UCL and co-author of the Smarter, Smaller, Stronger report. “This is because generating each word in the response requires the model to run complex calculations. The biggest environmental savings, when using LLMs, studies have found, currently come from limiting how much the model says back.”

If you were to do just one thing, Drobnjak adds, “let it be this: End each prompt with instructions to ‘Be concise’, ‘Briefly explain’ or ‘List key points only’.”
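
In practice, that instruction can be baked into every request rather than typed each time. Here is a minimal sketch, assuming the official OpenAI Python SDK; the model name is illustrative, and the same two levers (a brevity instruction plus a token cap) exist in most chat APIs.

```python
from openai import OpenAI  # assumes the official OpenAI Python SDK is installed

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def ask_concisely(question: str, word_limit: int = 60) -> str:
    """Append a brevity instruction and cap output tokens, so the model
    runs fewer generation steps per query."""
    prompt = f"{question}\n\nBe concise: answer in at most {word_limit} words."
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": prompt}],
        max_tokens=120,  # hard cap on generated tokens, a backstop to the instruction
    )
    return response.choices[0].message.content


print(ask_concisely("Why does a longer AI response use more energy?"))
```

The max_tokens cap is the hard stop; the "Be concise" line encourages the model to produce a complete answer within it.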

***

Another way to dramatically reduce power and water use would be to switch from LLMs to small language models (SLMs), where possible. (SLMs available today include Google DeepMind’s Gemma and Nvidia’s Mistral-NeMo-Minitron.)

“SLMs typically have hundreds of millions to a few billion parameters, compared with 100+ billion in large language models,” says Drobnjak.

In experiments, SLMs built to meet a specific requirement have been found to respond to queries nearly as effectively, at a fraction of the energy cost. Well-built SLMs could even end up outperforming LLMs, Drobnjak says.
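
To try this on your own machine, here is a minimal sketch, assuming the Hugging Face Transformers library and the open-weight google/gemma-2-2b-it checkpoint (about 2 billion parameters; downloading Gemma weights requires accepting Google’s licence on the Hugging Face hub).

```python
from transformers import pipeline  # assumes Hugging Face Transformers is installed

# Load a small, open-weight model locally; at roughly 2B parameters,
# this is SLM-sized rather than LLM-sized.
generator = pipeline("text-generation", model="google/gemma-2-2b-it")

result = generator(
    "List three ways to cut the energy cost of AI inference. Be concise.",
    max_new_tokens=80,  # short responses save energy here too
)
print(result[0]["generated_text"])
```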

Such a shift in approach would have a massive impact since, at the moment, inference — the phase in which a trained model processes new, unseen data to produce a response or output — can end up guzzling more energy than the initial training phase, over the program’s lifecycle.

“Each individual inference requires less energy than training, but the sheer scale of usage, coupled with the need for continuous operation across global data centers, can result in a far greater overall energy footprint,” as the Unesco report Smarter, Smaller, Stronger puts it.
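
A back-of-envelope calculation shows how scale flips the balance. The training and per-query energy figures below are illustrative assumptions, not numbers from the report; only the 2.5 billion queries a day comes from earlier in this article.

```python
# All figures are illustrative assumptions, not measurements.
TRAINING_ENERGY_MWH = 1_300        # assumed one-time cost of training a large model
ENERGY_PER_QUERY_WH = 0.3          # assumed energy per inference query
QUERIES_PER_DAY = 2_500_000_000    # scale cited in this article

daily_inference_mwh = QUERIES_PER_DAY * ENERGY_PER_QUERY_WH / 1_000_000
days_to_match_training = TRAINING_ENERGY_MWH / daily_inference_mwh

print(f"Inference energy per day: {daily_inference_mwh:,.0f} MWh")
print(f"Days of inference to exceed training cost: {days_to_match_training:.1f}")
```

Under these assumptions, inference overtakes the entire training bill in under two days; any per-query saving therefore compounds daily.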

***

All of which is another way of saying: We need AI to be more like the human brain.

“Arguably, the brain is the most energy-efficient intelligent system we know,” says Drobnjak. “When processing tasks, our brain does not use brute-force computation across the board but rather activates only what is needed, when it is needed. This approach saves lots of energy and represents the kind of AI architecture we need.”

The efficiency of the neural networks in our heads is not something we are likely to replicate in the digital world any time soon, but a leap forward is possible.

Among those working towards this are a team of researchers from the Massachusetts Institute of Technology (MIT) and Northeastern University. Since 2023, they have been building a novel inference system that works as a “smart manager”.

The program, called Clover, strategically assigns queries to a mix of small and large models in real time, while monitoring the carbon intensity of the grid.

Versions of Clover have been deployed at data centres in a testing phase via Bay Compute, a company set up by MIT senior scientist Vijay Gadepally. (The team is also deploying other energy-efficiency techniques, such as intelligent heating, ventilation and air-conditioning (HVAC) and battery control systems.)

“In trials, Clover reduced carbon emissions by nearly 70% over a 48-hour period with only a few percentage points’ decrease in accuracy,” Gadepally says. The experiments continue, he adds, with the aim of optimising data centres and reducing their impact, since so many more of these centres are being set up.
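
Clover’s internals are not detailed in the article, so the following is a hypothetical sketch of the routing idea only, not its actual code: a toy dispatcher that sends a query to the large model only when the query looks hard and the grid is relatively clean. All names, thresholds and energy figures are assumptions.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class Model:
    name: str
    energy_per_query_wh: float       # assumed per-query energy cost
    answer: Callable[[str], str]


# Hypothetical stand-ins for a small and a large model.
small = Model("slm", 0.05, lambda q: f"[small model] {q}")
large = Model("llm", 0.60, lambda q: f"[large model] {q}")

CARBON_THRESHOLD = 300.0  # gCO2 per kWh; illustrative cutoff


def route(query: str, grid_carbon: float) -> str:
    """Use the large model only for hard-looking queries on a clean grid;
    otherwise fall back to the cheaper small model."""
    looks_hard = len(query.split()) > 30  # crude difficulty heuristic
    if looks_hard and grid_carbon < CARBON_THRESHOLD:
        model = large
    else:
        model = small
    print(f"routed to {model.name} ({model.energy_per_query_wh} Wh/query)")
    return model.answer(query)


route("What is the capital of France?", grid_carbon=420.0)
```

A production system would replace the difficulty heuristic with a learned router and pull live carbon-intensity data; the sketch only shows where the two signals enter the decision.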

Without a new approach, it would be like having a growing number of lemonade stands under a lemon tree, Gadepally says.

“You keep picking lemons for each drink. Eventually, the lemons will run out. We’re at that point of the tree being largely empty. The energy grid is tapped out. How best can we work with what we have and reduce our impact on the planet? That’s the next frontier.”
