

Is AI the next big climate-change threat? We haven’t a clue

Dire warnings are being issued about AI’s energy needs, but new chip technologies and even AI itself could help keep demands for more electrical power in check.

Jul 29, 2019
Data center
Dean Mouhtaropoulos | Getty

At a recent conference in San Francisco, Gary Dickerson took the stage and made a bold prediction. The chief executive of Applied Materials, which is a big supplier to the semiconductor industry, warned that in the absence of significant innovation in materials, chip manufacturing and design, data centers’ AI workloads could account for a tenth of the world’s electricity usage by 2025.

Today, the millions of data centers around the world soak up a little less than 2% of global electricity, and that statistic covers all the kinds of workloads handled on their vast arrays of servers. Applied Materials estimates that servers running AI currently account for just 0.1% of the world’s electricity consumption.
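
To put the worst case in perspective, here is a back-of-the-envelope calculation (ours, not Applied Materials’): going from 0.1% of global electricity today to 10% by 2025 implies that AI’s share would have to more than double every year.

    # Back-of-the-envelope: the annual growth rate implied by going from
    # 0.1% of global electricity (2019) to 10% (2025). The shares come from
    # the forecasts quoted above; the constant compound rate is our assumption.
    start_share, end_share = 0.001, 0.10
    years = 2025 - 2019
    annual_growth = (end_share / start_share) ** (1 / years)
    print(f"Implied growth: {annual_growth:.2f}x per year")
    # Prints ~2.15x per year: AI's electricity share would need to more
    # than double annually for six straight years.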

Other tech executives are sounding an alarm too. Anders Andrae of Huawei thinks data centers could end up consuming a tenth of the globe’s electricity by 2025, though his estimate covers all their uses, not just AI.

Jonathan Koomey, a special advisor to Rocky Mountain Institute’s senior scientist, is more sanguine. He expects data centers’ energy consumption to remain relatively flat over the next few years despite a spike in AI-related activity.

These widely diverging predictions highlight the uncertainty around AI’s impact on the future of large-scale computing and the ultimate implications for energy demand.

Bigger pictures

AI is certainly power hungry. Training and running deep-learning models involves crunching vast amounts of data, which taxes memory and processors. A study by the research group OpenAI found that the amount of computing power used in the largest AI training runs is already doubling every three and a half months.
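
To see how fast that compounds, a short sketch; the 3.5-month doubling time is the study’s figure, and the rest is arithmetic.

    # Compute demand doubling every 3.5 months (OpenAI's figure) compounds
    # far faster than the classic Moore's-law doubling every ~24 months.
    doubling_months = 3.5
    for months in (12, 24, 36):
        growth = 2 ** (months / doubling_months)
        print(f"After {months} months: {growth:,.0f}x the compute")
    # After 12 months: ~11x; after 24 months: ~116x; after 36 months: ~1,248x.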

Applied Materials’ forecast is, by its own admission, a worst-case scenario designed to highlight what could happen in the absence of new thinking in hardware and software. Sundeep Bajikar, the company’s head of corporate strategy and market intelligence, says the forecast assumes the mix of data used to train AI models will shift over time, with video and other images making up a rising share of the total relative to text and audio. Visual data is more computationally intensive to process and therefore requires more energy.

There will also be more information for models to crunch thanks to the rise of things like autonomous vehicles and sensors embedded in other smart devices. And the spread of super-fast 5G wireless connectivity will make it even easier to shuttle data to and from data centers.

Bajikar says these and other trends underline the urgent need for what his company calls “a new playbook” in materials and manufacturing for the AI era. Some researchers think AI’s thirst for power could even become a major environmental headache: a team from the University of Massachusetts, Amherst, recently published a study showing that training a single large AI model can generate nearly five times the entire lifetime emissions of the average American car.
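
The arithmetic behind that comparison is straightforward; the figures below are the study’s headline numbers (Strubell et al., 2019), rounded.

    # Headline figures from the UMass Amherst study: training one large NLP
    # model with neural architecture search emitted roughly 626,000 lbs of
    # CO2-equivalent, while an average American car emits about 126,000 lbs
    # over its lifetime, manufacturing included.
    model_training_lbs = 626_000
    car_lifetime_lbs = 126_000
    print(f"Training vs. car lifetime: {model_training_lbs / car_lifetime_lbs:.1f}x")
    # Prints ~5.0x, the source of the "nearly five times" figure.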

Betting on basics

But pessimistic forecasts ignore several important developments that could limit AI’s power grab. One of them is the rise of “hyperscale” data centers pioneered by companies like Facebook and Amazon.

These use vast arrays of basic servers tailored for specific tasks. The machines are more energy-efficient than servers in conventional data centers, which have to juggle a wider range of functions. An ongoing shift to hyperscale facilities, along with advances in cooling and other technologies, is a big reason the growth in data centers’ energy consumption has been largely canceled out by efficiency improvements over the past few years.

New kinds of microchips will also help. The Applied Materials forecast assumes AI workloads will continue to run on existing hardware whose efficiency gradually improves over the next few years. But a host of startups, as well as big companies like Intel and AMD, are developing semiconductors that harness technologies like photonics to power neural networks and other AI tools using far less energy.

Koomey says alarmist projections also ignore the fact that for some kinds of AI tasks, like pattern recognition, approximate outputs from models are good enough. That means energy doesn’t need to be wasted computing results to an unnecessarily high degree of numerical precision.
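
As a toy illustration of Koomey’s point (not his example, and with made-up numbers), here is a classification-style score computed at full and at reduced precision. The individual scores drift slightly, but the decision they produce is the same.

    import numpy as np

    # Hypothetical three-class scorer: the same matrix-vector product, the
    # core operation in neural networks, at 64-bit and 16-bit precision.
    rng = np.random.default_rng(42)
    weights = rng.uniform(-1, 1, size=(3, 100))   # made-up model weights
    features = rng.uniform(0, 1, size=100)        # made-up input features

    scores64 = weights @ features
    scores16 = weights.astype(np.float16) @ features.astype(np.float16)

    print("float64 scores:", np.round(scores64, 3))
    print("float16 scores:", np.round(scores16.astype(np.float64), 3))
    print("same prediction:", np.argmax(scores64) == np.argmax(scores16))
    # The 16-bit scores differ in the low digits, but the argmax -- the
    # actual pattern-recognition decision -- comes out the same, and 16-bit
    # multiply-accumulates cost far less energy than 64-bit ones.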

Ironically, the biggest check on AI’s power consumption could be AI itself. Google is already using technology developed by DeepMind, the AI company it acquired in 2014, to cool its data centers more efficiently. The software had already helped the firm cut its cooling bill by 40% by making recommendations to human operators; now it effectively runs the centers’ cooling systems by itself.
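
The pattern described above (a model recommends a cooling action, and eventually applies it directly) can be sketched in miniature. The quadratic energy model below is a made-up stand-in for DeepMind’s actual neural networks, which are not public.

    # Minimal sketch of recommend-then-actuate cooling control. The
    # predicted_energy_kw "model" is hypothetical: too-cold setpoints waste
    # chiller energy, too-warm ones make server fans work harder.
    def predicted_energy_kw(setpoint_c: float, server_load_kw: float) -> float:
        return 0.8 * (setpoint_c - 24.0) ** 2 + 0.05 * server_load_kw

    def choose_setpoint(server_load_kw: float) -> float:
        candidates = [c / 2 for c in range(36, 57)]   # 18.0C to 28.0C in 0.5C steps
        return min(candidates, key=lambda c: predicted_energy_kw(c, server_load_kw))

    print(f"Recommended setpoint: {choose_setpoint(server_load_kw=500.0):.1f} C")
    # In recommendation mode a human operator reviews this value; in
    # autonomous mode it is written straight to the building controls.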

AI will be used to optimize other aspects of data centers’ operations too. And, like Google’s cooling win, this will benefit all kinds of workloads. That doesn’t mean data centers won’t end up guzzling significantly more power because of rising demand for AI’s wizardry, but it’s yet another reason making forecasts here is so damn hard.