University of California professors Don Patterson and Bill Tomlinson and MIT Sloan School of Management scientist Andrew Torrance studied the environmental impact of AI systems. Their paper, published on arXiv, states that generative AI tools like ChatGPT and Midjourney emit significantly less carbon dioxide than humans performing the same work.
Many scientists responded with insights both supporting and refuting those findings. Whether you believe artificial intelligence emits less or more greenhouse gas, we need such discussions. After all, we are implementing this technology in nearly every facet of our lives worldwide, and we must understand what future it could bring for humanity and the planet.
This article will discuss how the MIT and University of California scientists measured ChatGPT energy consumption. Then, I will explain why some experts doubt their findings.
What is ChatGPT’s average energy consumption?
VentureBeat reported on the recent ChatGPT energy study from these scientists. It says they originally published the paper in March, and it was still undergoing peer review at the time of writing.
The paper found that an AI system like ChatGPT emits 130 to 1,500 times less carbon dioxide equivalent (CO2e) than humans performing the same task. Similarly, image generators like DALL-E 2 and Midjourney produce 310 to 2,900 times less CO2e.
The authors analyzed existing data on the environmental impact of AI programs, human activities, and text and image production, drawing on studies and databases that estimate how AI and humans affect the environment.
For example, they used an online ChatGPT estimate based on a traffic level of 10 million queries, generating around 3.82 metric tons of CO2e daily. They also noted the model's one-time training footprint of 552 metric tons of CO2e.
They compared these figures with the annual carbon footprints of an average person in the US (15 metric tons) and in India (1.9 metric tons), scaled to the estimated time a human needs to produce a page of text or an image.
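A rough back-of-the-envelope sketch of that comparison, using only the figures quoted above, looks like this. The writing time per page is an assumed value for illustration and is not taken from the paper's methodology, so the exact numbers differ from the study's published ranges:

```python
# Back-of-the-envelope check using only the figures quoted above.
# The writing-time assumption below is illustrative, not from the paper.
QUERIES_PER_DAY = 10_000_000
DAILY_CO2E_GRAMS = 3.82 * 1_000_000   # 3.82 metric tons -> grams

per_query_g = DAILY_CO2E_GRAMS / QUERIES_PER_DAY
print(f"ChatGPT: ~{per_query_g:.3f} g CO2e per query")

HOURS_PER_YEAR = 365 * 24
WRITING_HOURS_PER_PAGE = 0.8          # assumed, for illustration only

for country, annual_tons in [("US", 15), ("India", 1.9)]:
    per_hour_g = annual_tons * 1_000_000 / HOURS_PER_YEAR
    per_page_g = per_hour_g * WRITING_HOURS_PER_PAGE
    print(f"{country}: ~{per_page_g:.0f} g CO2e attributed to writing one page")
# The per-page figures exceed the per-query figure by a few orders of
# magnitude, which is the direction of the gap the paper reports.
```

Even this crude calculation shows why the authors frame the result in orders of magnitude: the per-query footprint is a fraction of a gram, while the per-capita figures scaled to writing time land in the hundreds or thousands of grams.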
They emphasized the importance of measuring carbon emissions from AI activities to inform policymaking on sustainability issues. “Without an analysis like this, we can’t make any reasonable kinds of policy decisions about how to guide or govern the future of AI,” Professor Don Patterson told VentureBeat.
“We need some sort of grounded information, some data from which we can take the next step,” he added. Meanwhile, Andrew Torrance acknowledged the study's limitations.
He noted that “we live in a world of complex systems. An unavoidable reality of complex systems is the unpredictability of the results of these systems.”
Why do some refute the data?
The paper shares interesting insights, but it only caught the AI community's attention when another expert used it to support his claims. Meta chief AI scientist Yann LeCun posted a chart from the study on X with the following caption:
“Very interesting paper: using generative AI to produce text or images emits 3 to 4 orders of magnitude *less* CO2 than doing it manually or with the help of a computer.”
The post drew responses from AI researchers such as Hugging Face's Sasha Luccioni. “You can’t just take an individual’s total carbon footprint estimate for their whole life and then attribute that to their profession,” she told VentureBeat.
Luccioni added, “That’s the first fundamental thing that doesn’t make sense. And the second thing is, comparing human footprints to life cycle assessment or energy footprints doesn’t make sense because, I mean, you can’t compare humans to objects.”
Another reason people doubt ChatGPT energy-consumption estimates is that OpenAI and other AI companies don't share exact figures. The Spanish newspaper El País shared a related statement from AI researcher Verónica Bolón:
“We don’t know exactly what it consumes, but it has to be [a] brutal [amount of energy]—both in training it and using it—because it needs a lot of data and very large neural networks.”
“And it’s not something people stop to think about because they don’t have information about it either. [The company] is called OpenAI, but in [terms of information about energy consumption] it’s not open at all.”
Some studies also raise concerns about the chatbot’s water consumption. Inquirer USA shared a Johnson Controls study suggesting that training the GPT-3 large language model consumed 700,000 liters of water.
That amount of water is enough to manufacture 370 BMW automobiles. The website also cites an arXiv study saying ChatGPT consumes a “500ml bottle of water” for every 20 to 50 questions it answers.
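Those figures reduce to simple per-unit estimates, assuming the bottle is spread evenly across the stated range of questions:

```python
# Water figures quoted above: a "500ml bottle" per 20-50 questions,
# and 700,000 liters for GPT-3 training vs. 370 BMW cars.
BOTTLE_ML = 500

per_question_low = BOTTLE_ML / 50    # 10 ml if a bottle covers 50 questions
per_question_high = BOTTLE_ML / 20   # 25 ml if a bottle covers only 20
print(f"~{per_question_low:.0f}-{per_question_high:.0f} ml of water per answer")

TRAINING_LITERS = 700_000
CARS = 370
print(f"~{TRAINING_LITERS / CARS:.0f} liters per car, by that comparison")
```

In other words, each answer costs roughly a few sips of water, but the one-time training cost is on the order of thousands of liters per car in the BMW comparison.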
Conclusion
A recent study claims that ChatGPT's energy consumption is lower than that of the equivalent human activity. However, some scientists doubt its findings for several reasons.
One is that OpenAI allegedly doesn't disclose its exact energy-usage data. Also, Andrew Torrance notes that complex, interacting systems (climate, society, and AI) add “unpredictability” to such findings.
Nevertheless, their research will help us ensure artificial intelligence systems benefit humanity without harming the environment. Learn more about the latest digital tips and trends at Inquirer Tech.