A new doomer narrative about artificial intelligence is emerging in the background at this year’s COP meeting. This one isn’t focused on a malignant superintelligence. Instead, it centers on sustainability and AI’s burgeoning energy demands.
A recent study projects that by 2027, NVIDIA’s new AI servers will consume more than 85.4 terawatt-hours annually, exceeding the energy usage of countries such as Sweden and Argentina.
Research from the University of Massachusetts Amherst suggests that training a single AI model can emit more than 284 tonnes of CO2, equivalent to the lifetime emissions of five average American cars. It paints a concerning picture of AI’s environmental impact: annually, AI’s carbon footprint is approaching 1% of global emissions.
AI’s energy demands have indeed increased dramatically. A Stanford study flags a 300,000-fold rise in AI systems’ power requirements since the early 2010s. And some of this energy is derived from fossil fuels: data centers now consume over 1% of global electricity, a third of which comes from coal and natural gas.
However, what the doomers miss is the ingenuity of human research and industry. Analyzing IT’s electricity consumption dating back to the 2000s, Jonathan Koomey and colleagues found that the energy intensity of the global data center industry dropped by around 20% per year between 2010 and 2018. Efficiency gains in data centers, chips, and programming have outstripped the growth in energy use.
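A 20% annual drop in energy intensity compounds quickly. A back-of-the-envelope calculation (assuming, for illustration, that the rate held steady across the whole 2010–2018 period) shows the cumulative effect:

```python
# Compound a 20% annual decline in energy intensity from 2010 to 2018.
rate = 0.20          # reported annual drop in energy intensity
years = 2018 - 2010  # eight years of compounding

remaining = (1 - rate) ** years
print(f"Energy intensity remaining: {remaining:.1%}")  # about 16.8% of the 2010 level
print(f"Cumulative reduction:       {1 - remaining:.1%}")  # about 83.2%
```

In other words, an industry that holds that pace cuts the energy needed per unit of computing by more than four-fifths in under a decade, which is how total efficiency gains can outrun rising demand.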
This human factor suggests that while AI’s energy demands are growing, so too are the efficiencies in the systems that support it.
AI software support
Innovations in AI also contribute to this trend of increasing efficiency. Techniques like “gradient compression” in AI training, a method being driven forward at my own institution, reduce the energy required for AI systems to share and process data as they learn, while simultaneously speeding up the process.
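The article does not specify the exact method, but one widely used form of gradient compression is top-k sparsification, in which each worker in a distributed training run transmits only its largest-magnitude gradient entries instead of the full gradient. A minimal sketch of the idea (the function names and the 1% keep-ratio here are illustrative, not from the source):

```python
import numpy as np

def topk_compress(grad, k_frac=0.01):
    """Keep only the largest-magnitude fraction of gradient entries.

    Returns the indices and values to transmit; every other entry is
    treated as zero, shrinking the payload each worker has to send.
    """
    k = max(1, int(grad.size * k_frac))
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx]

def topk_decompress(idx, vals, shape):
    """Rebuild a dense (mostly zero) gradient from the sparse payload."""
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = vals
    return flat.reshape(shape)

# Example: a 1,000-element gradient shrinks to a 10-entry payload.
rng = np.random.default_rng(0)
g = rng.normal(size=1000)
idx, vals = topk_compress(g, k_frac=0.01)
g_hat = topk_decompress(idx, vals, g.shape)
print(len(vals))  # 10 values sent instead of 1,000
```

Sending roughly 1% of the gradient cuts network traffic by about two orders of magnitude per step, which is where the energy and speed savings come from; production systems typically add error feedback so the dropped entries are not lost but accumulated into later updates.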
AI equipment management
The impact of AI on energy efficiency extends beyond theoretical research. Google’s AI-driven approach to data center cooling has led to a reduction of about 40% in energy use, equivalent to taking 64,000 cars off the road annually.
In the energy sector, the adoption of “grid edge” AI technologies (everything from smart thermostats to better-managed solar panels) could lead to substantial reductions in utility emissions by 2030.
The challenge lies in ensuring that the efficiency gains and emission reductions achieved through AI outpace its own resource consumption. This requires a concerted effort across technology, governance, and collaborative research.
Industries must focus on developing smarter AI systems powered by renewable energy. Policymakers need to create frameworks that encourage innovation within environmentally responsible boundaries. Academic investments should target the exploration of AI in the realms of climate and clean energy.
While AI presents significant sustainability challenges, it also offers groundbreaking solutions. With responsible leadership that balances the benefits of AI with its environmental externalities, AI can positively transform systems to accelerate global decarbonization.
Striking the right balance is essential for AI to usher in an era of sustainable prosperity, moving beyond doomerism to a future where technology and environmental stewardship go hand in hand. The journey towards sustainable AI is not just about technological innovation but also about reimagining our relationship with technology in the context of our planet’s health. As we navigate this path, the decisions we make today will shape the sustainability of our digital future–and hopefully, that is something everyone at COP can agree on.
Source: Fortune