
The development of ever more powerful models for artificial intelligence is revolutionizing the world, but it doesn’t come cheap. In a newly released position paper, researchers at Seattle’s Allen Institute for Artificial Intelligence argue that more weight should be given to energy efficiency when evaluating research.
The AI2 researchers call on their colleagues to report the “price tag” associated with developing, training and running their models, alongside other metrics such as speed and accuracy. Research leaderboards, including AI2’s, regularly rate AI software in terms of accuracy over time, but they don’t address what it took to get those results.
Of course, cutting-edge research can be expensive in all sorts of fields, ranging from particle physics done at multibillion-dollar colliders to genetic analysis that requires hundreds of DNA sequencers. Financial cost or energy usage isn’t usually mentioned in the resulting studies. But AI2’s CEO, Oren Etzioni, says that times are changing – especially as the carbon footprint of energy-gobbling scientific experiments becomes more of a concern.
“It is an ongoing topic for many scientific communities, the issue of reporting costs,” Etzioni, one of the position paper’s authors, told GeekWire. “I think what makes a difference here is the stunning escalation that we’ve seen” in the resources devoted to AI model development.
One study from OpenAI estimates that the computational resources required for top-level deep learning research increased 300,000-fold between 2012 and 2018, driven by the rapid development of increasingly complex models. “This is much faster than Moore’s Law, doubling every three or four months,” Etzioni said.
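The figures cited above can be sanity-checked with a few lines of arithmetic. The sketch below is an illustration, not from the paper; it assumes the 2012–2018 window spans six years and treats Moore's Law as a doubling roughly every 24 months:

```python
import math

# OpenAI's estimate, as cited in the article: a 300,000x increase
# in compute for top-level deep learning research from 2012 to 2018.
growth_factor = 300_000
months = 6 * 12  # assumed window: 2012 -> 2018

# How many doublings does a 300,000x increase imply, and how often
# would compute have to double to fit in the window?
doublings = math.log2(growth_factor)
doubling_time = months / doublings

print(f"{doublings:.1f} doublings")                    # ~18.2
print(f"doubling every {doubling_time:.1f} months")    # ~4.0 months

# For comparison, Moore's Law (doubling ~every 24 months) over the
# same six years yields only about an 8x increase.
moore_factor = 2 ** (months / 24)
print(f"Moore's Law over the same period: ~{moore_factor:.0f}x")
```

The implied doubling time of roughly four months matches Etzioni's "three or four months," and the comparison makes the gap concrete: Moore's Law would deliver about 8x over the same period, versus 300,000x.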