The new AI Index report points out the vast amount of energy that AI systems like ChatGPT require — though there is a positive side, too
A small but important part of the AI Index Report for 2023¹ points to the growing concern about the energy consumption required for AI training.
Spoiler alert: it’s quite a lot.
There’s no standard benchmark for tracking the carbon intensity of AI systems, so the report draws on a recent paper by Luccioni et al., 2022² which records the energy requirements of a number of large language models (LLMs) including ChatGPT.
The following table shows the energy requirements for training four different AI models and the CO2 emissions associated with it.
The data contains a number of measurements but the bottom line is represented by the power consumption and CO2 emissions which I have summarised in the charts below.
There is quite a difference between the various models and, as you can see, OpenAI’s GPT-3 comes out on top with a consumption of over 1,200 megawatt-hours. That’s about as much electricity as 120 US homes consume in a year, according to figures from the U.S. Energy Information Administration³. That certainly seems like a lot of energy.
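As a quick sanity check on that comparison, here is a minimal sketch. The inputs are assumptions: the commonly cited figure of roughly 1,287 MWh for GPT-3 training (the report says only “over 1,200”), and an EIA average of roughly 10,600 kWh per US home per year.

```python
# Back-of-envelope: GPT-3 training energy vs annual US household electricity use.
# Both inputs are rough, rounded figures, not exact measurements.
gpt3_training_mwh = 1287          # assumed; Luccioni et al. report ~1,287 MWh
home_kwh_per_year = 10_600        # assumed EIA average for a US home

home_mwh_per_year = home_kwh_per_year / 1000       # 10.6 MWh per home per year
equivalent_homes = gpt3_training_mwh / home_mwh_per_year

print(f"GPT-3 training ~ {equivalent_homes:.0f} US home-years of electricity")
# -> roughly 121, i.e. about 120 homes for a year
```

The result lands at about 121 home-years, which matches the “120 US homes” figure quoted above.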
The chart below illustrates the CO2 emissions which follow a similar pattern.
Luccioni, the paper’s principal author, is a researcher at Hugging Face Inc. and the work is mostly concerned with BLOOM, her company’s alternative to ChatGPT. The figures for other models are approximate and based on what public information is available (Bloomberg reports Luccioni saying that nothing is really known about ChatGPT and that it could just be “…three raccoons in a trench coat.” — does that mean GPT-4 will be four raccoons?).
CO2 emissions for training ChatGPT are equivalent to around 500 flights from New York to San Francisco
The AI Index Report makes some comparisons with other energy-intensive activities and their CO2 emissions (see chart, below). It finds, for example, that the CO2 emissions generated in training ChatGPT are equivalent to one passenger flying from New York to San Francisco around 500 times! Or the total energy consumption of a single American over 28 years!
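The flight equivalence can be sketched the same way. Both inputs are rounded assumptions: roughly 500 tonnes of CO2 for the training run, and just under 1 tonne of CO2 per one-way NY–SF passenger.

```python
# Rough flight-equivalence check, using rounded figures from the report.
training_emissions_t = 500        # assumed ~tonnes of CO2 for the training run
per_passenger_flight_t = 0.99     # assumed; nearly 1 tonne per NY-SF passenger

equivalent_flights = training_emissions_t / per_passenger_flight_t
print(f"Training ~ {equivalent_flights:.0f} NY-SF passenger flights")
# -> about 505, consistent with "around 500 times"
```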
Unsurprisingly, the single air passenger does not actually produce zero emissions, as it might appear from the chart above; the figure is nearly 1 tonne. You can see the actual numbers more clearly in this table:
But it’s not all bad news.
AI can also reduce energy consumption
According to Bloomberg, while AI models are getting larger (and presumably more energy-intensive), the companies creating them are working on improving efficiency. Microsoft, Google and Amazon — the cloud companies that host much of the work — are all aiming for carbon-negative or carbon-neutral operations. This is, of course, highly desirable.
Also, while training AI systems is energy-intensive, recent research shows that AI systems can also be used to optimize energy consumption. A paper from DeepMind⁴ released in 2022 details the results of a 2021 experiment in which it trained an AI called BCOOLER to optimize cooling in Google’s data centres.
The graph above shows the energy-saving results from one BCOOLER experiment. After three months, an energy saving of roughly 12.7% was achieved.
Even if carbon neutrality is achieved, the use of AI to increase the efficiency of these centres will also make them cheaper to run. Maybe we should be thinking about applying AI to other energy-intensive industries, too.
I doubt that we are currently in a position to know exactly what the eventual toll on the environment will be. LLMs like ChatGPT are not going away, so the energy needed to train them is going to be spent. Equally, people are not going to stop flying from NY to SF, heating their homes or using their cars.
But we should try and put some of this somewhat shocking data into perspective. While a ChatGPT training session might use as much energy as one American does in 28 years (which sounds an awful lot), it is also true that 330 million Americans, the population of the USA, emit around 10 million times more CO2 than a single ChatGPT session⁵.
And there appear to be around 20 flights a day from New York to San Francisco, and say that each flight serves 150 passengers; that works out to be over 1 million tonnes of CO2 emissions per year — more than 2000 ChatGPTs⁵.
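Both of those perspective-setting calculations can be sketched together. All the inputs are the rough, rounded figures used above (330 million Americans at 18 tonnes each, 20 daily flights of 150 passengers at roughly 1 tonne each, and ~500 tonnes per training run).

```python
# Scale comparison using the rough figures quoted in the text.
training_emissions_t = 500                 # assumed ~tonnes CO2 per training run

# US population vs one training run
us_population = 330_000_000
tonnes_per_american_per_year = 18
us_total_t = us_population * tonnes_per_american_per_year   # 5.94 billion tonnes
print(f"US population emits ~{us_total_t / training_emissions_t:,.0f}x "
      "one training run, per year")                          # ~11.9 million x

# NY-SF route vs one training run
flights_per_day = 20
passengers_per_flight = 150
tonnes_per_passenger_flight = 1            # rounded up from ~0.99
route_total_t = (flights_per_day * passengers_per_flight
                 * tonnes_per_passenger_flight * 365)       # ~1.1m tonnes/year
print(f"NY-SF route ~ {route_total_t / training_emissions_t:,.0f} "
      "training runs per year")                              # ~2,190
```

The exact ratios come out at about 11.9 million and about 2,190, in line with the rounded “10 million times” and “more than 2000” figures in the text.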
For single entities, ChatGPT and its like clearly use a lot of energy (and so, at the moment at least, produce a lot of CO2 emissions). But compared with the energy consumption and CO2 emissions of other human activity, are they really very significant? There are, after all, a lot more humans than LLMs.
Also, it’s got to be good news that the large cloud hosting companies are aiming for carbon neutrality, which would reduce their net CO2 emissions to zero. So while energy use might remain high, the aim is to make its environmental impact neutral.
Additionally, AI can be used to mitigate some of the energy use in data centres. Maybe similar technology could be used in airlines and other energy-intensive industries.
The bottom line, however, is that we are all producing more CO2 than we should, so any additional energy use that does not come from renewables is a move in the wrong direction.
Thanks for reading, I hope you found this useful. If you would like to see more of my work, please visit my website.
You can also get updates by subscribing to my occasional, free, newsletter on Substack.
If you are not a Medium member you can sign up using my referral link and get to read any Medium content for only $5 per month.
References
1. The AI Index 2023 Annual Report
Nestor Maslej, Loredana Fattorini, Erik Brynjolfsson, John Etchemendy, Katrina Ligett, Terah Lyons, James Manyika, Helen Ngo, Juan Carlos Niebles, Vanessa Parli, Yoav Shoham, Russell Wald, Jack Clark, and Raymond Perrault, “The AI Index 2023 Annual Report,” AI Index Steering Committee, Institute for Human-Centered AI, Stanford University, Stanford, CA, April 2023.
The AI Index 2023 Annual Report by Stanford University is licensed under Attribution-NoDerivatives 4.0 International.
You can find the complete report on the AI Index page at Stanford University.
5. CO2 emissions from other sources (these are rough calculations):
330 million Americans emitting 18 tonnes of CO2 each per year comes to 330m × 18, about 5,900 million tonnes of CO2, or around 10 million ChatGPT training runs.
Approximately 20 flights each day from NY to SF, with around 150 passengers on board at roughly 1 tonne per passenger, produce 20 × 150, or 3,000 tonnes of CO2 per day. That’s 3,000 × 365, about 1 million tonnes of CO2 per year, or around 2,000 ChatGPTs.