The Allen Institute for Artificial Intelligence (AI2) is proposing a new way to incentivize energy-efficient machine learning.

Exploding footprint: More researchers are sounding the alarm about the growing costs of deep learning. In 2018, OpenAI published a study showing that the computational resources required to train large models were doubling every three to four months. In June, another study found that developing large-scale natural-language processing models, in particular, could produce a shockingly large carbon footprint.

The trend is driven by the research community’s emphasis on advancing the state of the art, with little regard for cost. Leaderboards celebrate performance breakthroughs, for example, but rarely mention what those incremental improvements cost. Often, linear gains in performance are unlocked through exponential increases in resources. At this rate, one expert predicts, AI could account for as much as one-tenth of the world’s electricity use by 2025.

Rich get richer: These statistics aren’t just concerning from an environmental perspective. They also have implications for the field’s diversity and advancement. The sheer amount of resources needed to produce notable results privileges private AI labs over academic ones. It could, for example, steer the field toward shorter-term projects that align with corporate incentives rather than longer-term advances that would benefit the public.

Show your work: In a new paper, researchers at the Seattle-based AI2 propose a way to mitigate this trend. They recommend that AI researchers always publish the financial and computational costs of training their models alongside their performance results. The authors hope that greater transparency into what it takes to achieve performance gains will motivate more investment in the development of efficient machine-learning algorithms.
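
To make the recommendation concrete, here is a minimal sketch of what such a disclosure might look like in practice. It is a hypothetical illustration, not tooling or figures from the AI2 paper: the power-draw and price constants, and the `report_training_cost` helper, are placeholder assumptions.

```python
# Illustrative sketch of reporting training cost alongside accuracy.
# All constants below are placeholder assumptions, not figures from the AI2 paper.
import time

GPU_POWER_WATTS = 300            # assumed average draw of one accelerator
CLOUD_PRICE_PER_GPU_HOUR = 3.0   # assumed on-demand price, in USD

def report_training_cost(train_fn, num_gpus=1):
    """Run a user-supplied training function and print cost metrics next to its accuracy."""
    start = time.time()
    accuracy = train_fn()                      # training loop returns final accuracy
    elapsed_hours = (time.time() - start) / 3600

    gpu_hours = elapsed_hours * num_gpus
    energy_kwh = gpu_hours * GPU_POWER_WATTS / 1000
    dollars = gpu_hours * CLOUD_PRICE_PER_GPU_HOUR

    print(f"accuracy:        {accuracy:.3f}")
    print(f"GPU-hours:       {gpu_hours:.1f}")
    print(f"energy (kWh):    {energy_kwh:.1f}")
    print(f"est. cost (USD): {dollars:.2f}")
    return accuracy
```

The point is simply that a results table could carry a GPU-hours, energy, or dollar column next to accuracy, so that readers and reviewers can see what each gain cost.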

Oren Etzioni, the CEO of AI2 and an author on the paper, also thinks that reviewers for publications and conferences should reward submissions that improve efficiency as well as those that improve accuracy. But until the community standardizes efficiency metrics, it will be difficult to judge the importance of such contributions. “I view reporting these numbers as necessary but not sufficient,” he says.

Why now? Recent years have seen a dramatic escalation in the amount of computing power that corporate research labs are throwing at deep learning. 

But Etzioni hopes the community can become more aware of the trade-offs. Plus, investing in more efficient algorithms could get more mileage out of available resources and produce other gains. It’s not an either-or thing, he says: “We just want to have a better balance in the field.”