The environmental impact of artificial intelligence in question

COP27 has just ended, so now is a fitting time to consider the environmental impact of artificial intelligence (AI). The “megamodels” of statistical learning, software whose training requires adjusting billions of parameters on billions of documents, have revolutionized natural language processing since the introduction in 2017 of “transformers” by researchers from Google and the University of Toronto. The question of their environmental footprint arose very quickly, with the publication in 2019 of a bombshell article by researchers at the University of Massachusetts estimating the CO₂ cost of training one of these models at five times that of the full life cycle of a car. A recent study by researchers at UC Berkeley and Google radically revises this figure, dividing it by 88 and reducing it to 1.8% of the full cost of a round-trip flight between San Francisco and New York.

It is difficult to see clearly here, but the good news is that the community has taken up the subject. The difficulty lies in accounting for all the factors, including the manufacture, maintenance and recycling of the computing and storage platforms, when calculating the environmental footprint. Most recent studies focus on computational cost and infrastructure efficiency. One of them, published in 2020 in Science, indicates that the ratio between the total energy consumed by large computing and storage platforms (data centers) and that consumed by their IT equipment alone, a ratio known as power usage effectiveness (PUE), is now approaching its optimal limit, with a value of 1.1.
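The efficiency ratio described above can be stated as a one-line formula. A minimal sketch, using illustrative energy figures that are assumptions for the example, not data from the study:

```python
# Power usage effectiveness (PUE): total facility energy divided by the
# energy consumed by IT equipment alone. A PUE of 1.0 would mean every
# joule goes to computing; large data centers now approach 1.1.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Return the power usage effectiveness ratio."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Illustrative (made-up) figures: 11 GWh total facility consumption
# for 10 GWh consumed by the IT equipment itself.
print(pue(11_000_000, 10_000_000))  # → 1.1
```

The closer the ratio gets to 1.0, the less energy is lost to cooling, power conversion and other overhead rather than computation.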

It also shows that the energy consumption of data centers increased by only 6% between 2010 and 2018, while their use increased by 550% over the same period. These improvements come from optimizing every element: the power consumption of computing and storage, virtualization, cooling, and so on. Margins for progress still exist, offering hope of containing the growth in consumption while the scientific community works on models that are themselves more frugal in computing time and data, ranging from generalist megamodels compressed and optimized a posteriori to “light” models dedicated to particular tasks and exploiting the corresponding physical priors.
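Taken together, the two figures above imply a steep drop in energy spent per unit of work. A back-of-the-envelope check, treating “use” as a rough proxy for workload (an assumption for the sake of the illustration):

```python
# If total consumption grew 6% while workload grew 550% (i.e. was
# multiplied by 6.5), the energy spent per unit of workload fell sharply.
energy_growth = 1.06    # consumption in 2018 relative to 2010
workload_growth = 6.50  # usage in 2018 relative to 2010 (+550%)

energy_per_unit = energy_growth / workload_growth
print(f"2018 energy per unit of workload vs 2010: {energy_per_unit:.2f}")
# → about 0.16, i.e. roughly an 84% reduction per unit of work
```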

A collective awareness

The question of the frugality of AI systems goes far beyond their environmental cost. How can academic players be guaranteed access to the gigantic amounts of data needed to train megamodels? How can a system with 200 billion parameters, trained on a corpus of billions of documents, be interpreted and audited? Energy frugality is also key to limiting the computing capacity required, the transit of data to the cloud, and battery drain in mobile applications ranging from cell phones to “smart” cars.
