As Use of A.I. Soars, So Does the Energy and Water It Requires

Via Yale e360, a look at how generative artificial intelligence uses massive amounts of energy for computation and data storage and billions of gallons of water to cool the equipment at data centers. Now, legislators and regulators — in the U.S. and the EU — are starting to demand accountability:

Two months after its release in November 2022, OpenAI’s ChatGPT had 100 million active users, and suddenly tech corporations were racing to offer the public more “generative A.I.” Pundits compared the new technology’s impact to the Internet, or electrification, or the Industrial Revolution — or the discovery of fire.

Time will sort hype from reality, but one consequence of the explosion of artificial intelligence is clear: this technology’s environmental footprint is large and growing.

A.I. use is directly responsible for carbon emissions from non-renewable electricity and for the consumption of millions of gallons of fresh water, and it indirectly boosts impacts from building and maintaining the power-hungry equipment on which A.I. runs. As tech companies seek to embed high-intensity A.I. into everything from resume-writing to kidney transplant medicine and from choosing dog food to climate modeling, they cite many ways A.I. could help reduce humanity’s environmental footprint. But legislators, regulators, activists, and international organizations now want to make sure the benefits aren’t outweighed by A.I.’s mounting hazards.

“The development of the next generation of A.I. tools cannot come at the expense of the health of our planet,” Massachusetts Senator Edward Markey (D) said last week in Washington, after he and other senators and representatives introduced a bill that would require the federal government to assess A.I.’s current environmental footprint and develop a standardized system for reporting future impacts. Similarly, the European Union’s “A.I. Act,” approved by member states last week, will require “high-risk A.I. systems” (which include the powerful “foundation models” that power ChatGPT and similar A.I.s) to report their energy consumption, resource use, and other impacts throughout their systems’ lifecycle. The EU law takes effect next year.

Meanwhile, the International Organization for Standardization, a global network that develops standards for manufacturers, regulators, and others, says it will issue criteria for “sustainable A.I.” later this year. Those will include standards for measuring energy efficiency, raw material use, transportation, and water consumption, as well as practices for reducing A.I. impacts throughout its life cycle, from the process of mining materials and making computer components to the electricity consumed by its calculations. The ISO wants to enable A.I. users to make informed decisions about their A.I. consumption.

Right now, it’s not possible to tell how your A.I. request for homework help or a picture of an astronaut riding a horse will affect carbon emissions or freshwater stocks. This is why 2024’s crop of “sustainable A.I.” proposals describes ways to get more information about A.I. impacts.

In the absence of standards and regulations, tech companies have been reporting whatever they choose, however they choose, about their A.I. impact, says Shaolei Ren, an associate professor of electrical and computer engineering at UC Riverside, who has been studying the water costs of computation for the past decade. Working from Microsoft’s reported annual water use for cooling, Ren estimates that a person who engages in a session of questions and answers with GPT-3 (roughly 10 to 50 responses) drives the consumption of a half-liter of fresh water. “It will vary by region, and with a bigger A.I., it could be more.” But a great deal remains unrevealed about the millions of gallons of water used to cool computers running A.I., he says.
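Ren’s figure implies a rough per-response range, which a back-of-envelope sketch makes concrete. Only the half-liter-per-session estimate and the 10-to-50-response session size come from Ren; the per-response split is simple arithmetic, and real values vary by region and model:

```python
# Back-of-envelope water footprint per GPT-3 response, based on Ren's
# estimate: ~0.5 L of fresh water per Q&A session of roughly 10-50
# responses. Actual figures vary by region and model size.

SESSION_WATER_L = 0.5                    # liters per session (Ren's estimate)
RESPONSES_LOW, RESPONSES_HIGH = 10, 50   # responses per session (article's range)

def water_per_response_ml(responses_per_session: int) -> float:
    """Fresh water attributed to one response, in milliliters."""
    return SESSION_WATER_L * 1000 / responses_per_session

# A longer session spreads the same cooling water over more responses.
print(f"{water_per_response_ml(RESPONSES_LOW):.0f} mL per response (short session)")
print(f"{water_per_response_ml(RESPONSES_HIGH):.0f} mL per response (long session)")
```

So each response accounts for somewhere between a sip and a shot glass of fresh water — small per query, but large at the scale of hundreds of millions of users.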

The same is true of carbon.

“Data scientists today do not have easy or reliable access to measurements of [greenhouse gas impacts from A.I.], which precludes development of actionable tactics,” a group of 10 prominent researchers on A.I. impacts wrote in a 2022 conference paper. Since they presented their article, A.I. applications and users have proliferated, but the public is still in the dark about those data, says Jesse Dodge, a research scientist at the Allen Institute for Artificial Intelligence in Seattle, who was one of the paper’s coauthors.

A.I. can run on many devices — the simple A.I. that autocorrects text messages will run on a smartphone. But the kind of A.I. people most want to use is too big for most personal devices, Dodge says. “The models that are able to write a poem for you, or draft an email, those are very large,” he says. “Size is vital for them to have those capabilities.”

Big A.I.s need to run immense numbers of calculations very quickly, usually on specialized graphics processing units (GPUs) — processors originally designed for the intense computation needed to render graphics on computer screens. Compared to other chips, GPUs are more energy-efficient for A.I., and they’re most efficient when they’re run in large “cloud data centers” — specialized buildings full of computers equipped with those chips. The larger the data center, the more energy efficient it can be. Improvements in A.I.’s energy efficiency in recent years are partly due to the construction of more “hyperscale data centers,” which contain many more computers and can quickly scale up. Where a typical cloud data center occupies about 100,000 square feet, a hyperscale center can be 1 or even 2 million square feet.

Estimates of the number of cloud data centers worldwide range from around 9,000 to nearly 11,000. More are under construction. The International Energy Agency (IEA) projects that data centers’ electricity consumption in 2026 will be double that of 2022 — 1,000 terawatt-hours, roughly equivalent to Japan’s current total consumption.
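That projection can be sanity-checked with trivial arithmetic. The doubling and the 1,000 terawatt-hour figure are from the IEA as cited above; the 2022 baseline is just what those two numbers imply:

```python
# Implied 2022 baseline from the IEA projection cited in the article:
# 2026 consumption is double 2022, at roughly 1,000 TWh.

PROJECTED_2026_TWH = 1_000   # IEA projection for 2026
GROWTH_FACTOR = 2            # "double that of 2022"

implied_2022_twh = PROJECTED_2026_TWH / GROWTH_FACTOR
print(f"Implied 2022 data-center consumption: ~{implied_2022_twh:.0f} TWh")
```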

However, as an illustration of one problem with the way A.I. impacts are measured, that IEA estimate includes all data center activity, which extends beyond A.I. to many aspects of modern life. Running Amazon’s store interface, serving up Apple TV’s videos, storing millions of people’s emails on Gmail, and “mining” Bitcoin are also performed by data centers. (Other IEA reports exclude crypto operations, but still lump all other data-center activity together.)

Most tech firms that run data centers don’t reveal what percentage of their energy use processes A.I. The exception is Google, which says “machine learning” — the basis for humanlike A.I. — accounts for somewhat less than 15 percent of its data centers’ energy use.

Another complication is the fact that A.I., unlike Bitcoin mining or online shopping, can be used to reduce humanity’s impacts. A.I. can improve climate models, find more efficient ways to make digital tech, reduce waste in transport, and otherwise cut carbon and water use. One estimate, for example, found that A.I.-run smart homes could reduce households’ CO₂ emissions by up to 40 percent. And a recent Google project found that an A.I. rapidly crunching atmospheric data can guide airline pilots to flight paths that will leave the fewest contrails.

Because contrails create more than a third of global aviation’s contribution to global warming, “if the whole aviation industry took advantage of this single A.I. breakthrough,” says Dave Patterson, a computer-science professor emeritus at UC Berkeley and a Google researcher, “this single discovery would save more CO₂ than the CO₂ from all A.I. in 2020.”

Patterson’s analysis predicts that A.I.’s carbon footprint will soon plateau and then begin to shrink, thanks to improvements in the efficiency with which A.I. software and hardware use energy. One reflection of that efficiency improvement: as A.I. usage has increased since 2019, its percentage of Google data-center energy use has held at less than 15 percent. And while global internet traffic has increased more than twentyfold since 2010, the share of the world’s electricity used by data centers and networks increased far less, according to the IEA…

However, data about improving efficiency doesn’t convince some skeptics, who cite a social phenomenon called “Jevons paradox”: Making a resource less costly sometimes increases its consumption in the long run. “It’s a rebound effect,” Ren says. “You make the freeway wider, people use less fuel because traffic moves faster, but then you get more cars coming in. You get more fuel consumption than before.” If home heating is 40 percent more efficient due to A.I., one critic recently wrote, people could end up keeping their homes warmer for more hours of the day.

“A.I. is an accelerant for everything,” Dodge says. “It makes whatever you’re developing go faster.” At the Allen Institute, A.I. has helped develop better programs to model the climate, track endangered species, and curb overfishing, he says. But globally A.I. could also support “a lot of applications that could accelerate climate change. This is where you get into ethical questions about what kind of A.I. you want.”

If global electricity use can feel a bit abstract, data centers’ water use is a more local and tangible issue — particularly in drought-afflicted areas. To cool delicate electronics in the clean interiors of the data centers, water has to be free of bacteria and impurities that could gunk up the works. In other words, data centers often compete “for the same water people drink, cook, and wash with,” says Ren.

In 2022, Ren says, Google’s data centers consumed about 5 billion gallons (nearly 20 billion liters) of fresh water for cooling. (“Consumptive use” does not include water that’s run through a building and then returned to its source.) According to a recent study by Ren, Google’s data centers used 20 percent more water in 2022 than they did in 2021, and Microsoft’s water use rose by 34 percent in the same period. (Google data centers host its Bard chatbot and other generative A.I.s; Microsoft servers host ChatGPT as well as its bigger siblings GPT-3 and GPT-4. All three are produced by OpenAI, in which Microsoft is a large investor.)
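The figures in that paragraph hang together, as a quick unit check shows. The 5-billion-gallon total and the 20 and 34 percent increases are from Ren’s study as reported above; the conversion factor and the implied 2021 baseline are ordinary arithmetic:

```python
# Unit check on the article's water figures: Google's ~5 billion gallons
# of cooling water consumed in 2022, converted to liters, plus the 2021
# baseline implied by the reported 20 percent year-over-year increase.

LITERS_PER_US_GALLON = 3.785

google_2022_gal = 5e9
google_2022_l = google_2022_gal * LITERS_PER_US_GALLON
print(f"Google 2022: ~{google_2022_l / 1e9:.1f} billion liters")  # "nearly 20"

# "20 percent more water in 2022 than in 2021" implies a 2021 baseline of:
google_2021_gal = google_2022_gal / 1.20
print(f"Implied 2021 use: ~{google_2021_gal / 1e9:.2f} billion gallons")
```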

As more data centers are built or expanded, their neighbors have been troubled to find out how much water they take. For example, in The Dalles, Oregon, where Google runs three data centers and plans two more, the city government filed a lawsuit in 2022 to keep Google’s water use a secret from farmers, environmentalists, and Native American tribes who were concerned about its effects on agriculture and on the region’s animals and plants. The city withdrew its suit early last year. The records it then made public showed that Google’s three extant data centers use more than a quarter of the city’s water supply. And in Chile and Uruguay, protests have erupted over planned Google data centers that would tap into the same reservoirs that supply drinking water.

Most of all, researchers say, what’s needed is a change of culture within the rarefied world of A.I. development. Generative A.I.’s creators need to focus beyond the technical leaps and bounds of their newest creations and be less guarded about the details of the data, software, and hardware they use to create it.

Some day in the future, Dodge says, an A.I. might be able — or be legally obligated — to inform a user about the water and carbon impact of each distinct request she makes. “That would be a fantastic tool that would help the environment,” he says. For now, though, individual users don’t have much information or power to know their A.I. footprint, much less make decisions about it.

“There’s not much individuals can do, unfortunately,” Ren says. Right now, you can “try to use the service judiciously,” he says.



This entry was posted on Thursday, February 8th, 2024 at 4:36 am and is filed under Uncategorized.



About This Blog And Its Author
As the scarcity of water and energy continues to grow, the linkage between these two critical resources will become more defined and even more acute in the months ahead.  This blog is committed to analyzing and referencing articles, reports, and interviews that can help unlock the nascent, complex and expanding linkages between water and energy -- The Watergy Nexus -- and will endeavor to provide a central clearinghouse for insightful articles and comments for all to consider.

Educated at Yale University (Bachelor of Arts - History) and Harvard (Master in Public Policy - International Development), Monty Simus has held a lifelong interest in environmental and conservation issues, primarily as they relate to freshwater scarcity, renewable energy, and national park policy.  Working from a water-scarce base in Las Vegas with his wife and son, he is the founder of Water Politics, an organization dedicated to the identification and analysis of geopolitical water issues arising from the world’s growing and vast water deficits, and is also a co-founder of SmartMarkets, an eco-preneurial venture that applies web 2.0 technology and online social networking innovations to motivate energy & water conservation.  He previously worked for an independent power producer in Central Asia; co-authored an article appearing in the Summer 2010 issue of the Tulane Environmental Law Journal, titled: “The Water Ethic: The Inexorable Birth Of A Certain Alienable Right”; and authored an article appearing in the inaugural issue of Johns Hopkins University's Global Water Magazine in July 2010 titled: “H2Own: The Water Ethic and an Equitable Market for the Exchange of Individual Water Efficiency Credits.”