
Responsible AI: Why Less Can Deliver More


[Image: A man standing in a dark datacentre with parallel rows of blue-lit server racks extending into the distance.]


In 2025, Large Language Models (LLMs) dominate, but as we look to 2026, the real opportunity may lie in smaller, task-focused models.


It’s no secret that AI’s consumption of water, energy, and land is raising serious concerns in many quarters, with some questioning whether we really need vast LLMs, trained on enormous and often “polluted” datasets, to perform relatively simple tasks.


Smaller, task-specific models, built on highly curated datasets, are capable of delivering powerful results with far less environmental and social cost.

Crucially, smaller models also mean lower computational demands.


This could help level the playing field, enabling universities, governments, and NGOs (organisations that cannot match Big Tech’s resources) to participate meaningfully in AI innovation. It could also pave the way for AI to be deployed where humanity most needs solutions: health, education, climate change, and beyond.


The Hidden Costs of Bigger Models

Without even surveying the entire AI ecosystem, the case for smaller models is compelling once we examine just some of the resources consumed in the quest for ever bigger and bolder AI models. Take the resource footprint of AI data centres, for example:


  • Water – Data centre cooling systems consume vast amounts of drinking-quality water, straining supplies, especially in water-scarce regions. Seawater is not a practical alternative because salt water is corrosive and accelerates thermal degradation in GPUs, and a significant portion of cooling water is also lost through evaporation. What really inflates a data centre’s water footprint, however, is the water needed to generate its electricity, particularly from sources such as nuclear, natural gas, coal, and hydropower. According to the EESI (Environmental and Energy Study Institute), large data centres can consume up to five million gallons of water per day, equivalent to the needs of a town of 10,000 to 50,000 people. Will these levels of usage be acceptable in locations where residents already face water restrictions, or where water quality has deteriorated following the development of data centres nearby?

  • Energy – Data centres consume a lot of energy, with computational power and cooling driving enormous demands. While not all of the increase in energy consumption can be attributed to AI (cloud computing and crypto mining have also played a part), the widespread adoption of GenAI has significantly increased demand for data centres, in both size and number, triggering an unprecedented rise in energy consumption. On sustainability, in the USA for example, around 56% of data centre electricity comes from fossil fuels, increasing carbon emissions and further contributing to climate change. Organisations using AI would therefore do well to prioritise sustainability (rather than just efficiency) or face the kind of backlash and pressure that the oil and fashion industries have faced over sustainability and accusations of “greenwashing”.

  • Land – Land acquired for data centres cannot be used for essential purposes such as farming, conservation, or housing. While a few data centres are located undersea, hundreds of others are built on land, affecting communities through price increases, a lack of transparency in land purchases (meaning people are often unable to object in time), and, for those who remain, the constant hum of data centre operations degrading quality of life.

  • Job creation – Promises of job creation and opportunity rarely materialise, and even where jobs are created, the vast majority are not permanent on-site roles. While data centres do create jobs and opportunities elsewhere in the economy, many operators nonetheless frame them as good for the local community because of the opportunities they supposedly bring.


    However, data centres are not like the car or steel industries, which at their height provided permanent, diverse jobs for thousands of local people. The problem with data centres is that, once completed (which can take years), they don’t require many people to operate them. So when it comes to data centres and job creation, given the tax and other incentives they receive, do they really deliver value for money, or ROI, in terms of local job creation? Take Vantage Data Centers, which says it will invest $2bn to construct a new, state-of-the-art data centre campus in Stafford County, Virginia, USA. The company estimates that at full capacity, the project will create 50 new jobs. That works out at a staggering $40 million of investment per local job, alongside significant environmental costs. The situation is similar in the UK where, despite data centres’ large size and cost, the number of permanent on-site jobs is relatively small, typically in the dozens.
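The investment-per-job figure above follows directly from the two numbers Vantage has published. As a quick back-of-the-envelope sketch (using only the figures quoted in this article):

```python
# Back-of-the-envelope check of the investment-per-local-job figure.
# Both inputs are the publicly quoted numbers cited above.
investment_usd = 2_000_000_000    # Vantage's stated $2bn investment
jobs_at_full_capacity = 50        # company's own estimate of new jobs

investment_per_job = investment_usd / jobs_at_full_capacity
print(f"${investment_per_job:,.0f} per local job")  # $40,000,000 per local job
```

Even allowing for indirect employment during construction, the ratio of capital deployed to permanent on-site roles remains orders of magnitude higher than in traditional heavy industry.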


AI as a Force for Good

The promises of AI, and realities such as those outlined above, are among the reasons why more and more people are calling for smaller AI models. AI can be, and already is, a force for good in many of the areas in which it is deployed. Smaller, human-centred models are already proving their worth. Examples include:


  • Stockholm Subway (SL) – According to SL (Storstockholms Lokaltrafik), AI-powered video analytics (IRIS Rail) has saved lives by detecting at-risk behaviour and enabling rapid intervention.

  • Assistive technology – From AI-powered screen readers (used by blind and visually impaired people to navigate the internet), voice assistants, and smart wheelchairs to intelligent prosthetics (which can detect the user’s intentions), adaptive solutions are empowering people with disabilities to live more independently.

  • Medical breakthroughs – AI is already helping to accelerate drug discovery and cancer analysis.

  • Education – Not everyone learns or absorbs information in the same way, so personalised learning platforms are adapting to individual needs, improving outcomes across diverse student populations.


The Leadership Imperative

As we move into 2026 and beyond, stakeholders will demand more accountability when it comes to AI: not only on bias, privacy, and intellectual property, but also on AI’s environmental footprint.


Calls for smaller, more sustainable models (with consumers and employees demanding that organisations justify their use of LLMs) will grow louder, just as they did with Fair Trade and climate action, both of which began at grassroots level.

More AI regulation is coming, and leaders and organisations that outsource responsibility for ethics and sustainability to third-party suppliers and AI developers could face challenges and pushback from regulators as well as stakeholders.


The expectation will be clear: demonstrate that your AI strategy is not only innovative, but also Responsible, Inclusive, and Sustainable.

***************


If you’re exploring how to embed Responsible AI within your organisation, or simply want to understand where to begin, please feel free to contact me on LinkedIn or visit ExecutiveGlobalCoaching.com to learn more about how we work.


Subscribe below to receive an alert as soon as I publish new editions of my LinkedIn newsletter, or to read previous editions:

1.   Responsible AI (this edition) – Putting People and Culture at the Heart of AI Strategy.




© 2025 ExecutiveGlobalCoaching.com
