Training a single AI model can emit as much carbon as five cars in their lifetimes

Source: MIT Technology Review


This article, published by the MIT Technology Review, is based on a scientific study (available here) of the carbon footprint of the training life cycle of several AI models.

The scientific study was published by researchers Emma Strubell, Ananya Ganesh and Andrew McCallum.

 
In terms of methodology, the researchers measured each model's power consumption by training it on a single GPU for one day. They then used the number of training hours reported in each model's original paper to estimate the total energy consumed over the full training process.

 

The total energy was then converted into pounds of carbon dioxide equivalent based on the average energy mix in the US, which closely matches the energy mix used by Amazon's AWS, the largest cloud services provider.
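As a rough illustration, this estimation method can be reproduced in a few lines of Python. This is a minimal sketch, not the authors' code: the PUE of 1.58 and the US-average emission factor of 0.954 lbs of CO2 per kWh are the values the study reports using, while the power draw and training hours below are made-up placeholders.

# Sketch of the study's estimation method: extrapolate one day's measured
# power draw over the full training run, then convert energy to CO2.
PUE = 1.58               # data-centre overhead factor assumed in the study
CO2_LBS_PER_KWH = 0.954  # US-average emission factor used in the study

def training_co2_lbs(avg_power_watts, training_hours):
    # Total energy in kWh over the whole training run, including overhead.
    energy_kwh = avg_power_watts / 1000 * training_hours * PUE
    return energy_kwh * CO2_LBS_PER_KWH

# Illustrative placeholder values: a 250 W average draw over 120 hours.
print(f"{training_co2_lbs(250, 120):.1f} lbs CO2e")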

 

The AI models used to calculate the carbon footprint include GPT-2, the second iteration of OpenAI's GPT model (the family that later powered ChatGPT). For those in the know, the other models analysed are ELMo, BERT and NAS (neural architecture search).

Figure 1 - Estimated cost of training a model in terms of CO2 emissions (lbs) and cloud compute cost (USD). Power and carbon footprint are omitted for TPUs due to lack of public information on power draw for this hardware.
Source: Energy and Policy Considerations for Deep Learning in NLP

The results obtained in this study are staggering. The researchers estimate that a full research project to develop and train a model (over at least six months) emitted the equivalent of 78,000 pounds of CO2, or roughly 35,380 kg (website converter)!
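The unit conversion behind that figure is straightforward, using the standard factor of 0.453592 kg per pound:

# Pounds-to-kilograms check for the figure quoted above.
LBS_TO_KG = 0.453592
print(f"{78_000 * LBS_TO_KG:,.1f} kg")  # -> 35,380.2 kg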

Figure 2 - Estimated CO2 emissions from training common NLP models, compared to familiar consumption.
Source: Energy and Policy Considerations for Deep Learning in NLP

