Artificial intelligence, or AI, is everywhere these days – in the news headlines, on your Instagram explore page, on your Google search page, or when you ask your friend Siri to call someone! It’s hard to imagine life without AI some days, but how do the multifaceted pros and cons of AI affect the environment, and how does this relate to Nebraska and our everyday lives?
According to the Data Center Map, there are currently 36 data centers in the state of Nebraska, though not all of them are specifically AI data centers. This count includes multiple buildings at a single site and ranges from local and state data centers to big names like Meta and Google, with facilities in Grand Island (1), Lincoln (3), Papillion (14), and Omaha (18). It does not include the Google data center being built in North Lincoln, which is expected to open in the fall of 2025.
What are Data Centers?
According to IBM, an AI data center is a facility that houses the IT (information technology) infrastructure used to train, deploy, and deliver AI services. Data centers contain advanced computers, servers, networks, storage frameworks, and energy and cooling systems to manage the heavy work of AI. For example, Amazon Web Services (AWS), Amazon's cloud computing platform, operates more than 100 data centers worldwide, each with an estimated 50,000 servers used to support cloud computing services.
What is Generative AI?
Artificial intelligence is not a new invention, but generative AI has been taking the internet by storm, powering big names like OpenAI’s ChatGPT, Google’s Gemini, Meta’s AI models, Microsoft’s Copilot, and Amazon’s AWS. According to MIT News, before generative AI, traditional AI made predictions based on existing data. Generative AI, by contrast, creates new data through machine-learning models.
Forms of traditional AI have been around since the early 20th century, but in the past decade, newer, larger, and more complex models have brought generative AI into our everyday lives. A turning point came in 2017, when Google introduced the transformer architecture, which now underpins most generative AI models, including ChatGPT.
Generative AI models have a wide range of applications, but their foundations are the same: the model converts its input into tokens, which are numerical representations of chunks of data. This shared foundation enables many different applications, whether in language processing, image generation, or beyond.
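To make the idea of tokens concrete, here is a minimal sketch in Python. It is a toy illustration under simplifying assumptions: real generative models use learned subword tokenizers (for example, byte-pair encoding) with vocabularies of tens of thousands of entries, and the sentences and vocabulary below are invented for the example.

```python
# Toy illustration: turning text into tokens (numerical IDs).
# Real tokenizers use learned subword vocabularies; this one simply
# assigns an integer to each whitespace-separated word it encounters.

def build_vocabulary(texts):
    """Assign a unique integer ID to every word seen in the texts."""
    vocab = {}
    for text in texts:
        for word in text.lower().split():
            if word not in vocab:
                vocab[word] = len(vocab)
    return vocab

def tokenize(text, vocab):
    """Convert a sentence into its list of token IDs."""
    return [vocab[word] for word in text.lower().split()]

example_texts = ["data centers power AI", "AI models learn from data"]
vocab = build_vocabulary(example_texts)

print(vocab)                                            # {'data': 0, 'centers': 1, ...}
print(tokenize("AI models power data centers", vocab))  # [3, 4, 2, 0, 1]
```

The model never sees raw words or pixels; it operates on sequences of numbers like these, which is why the same basic machinery can handle language, images, and other kinds of data.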
According to The Royal Institute, November 30, 2022, marked a monumental milestone with the public release of OpenAI’s large language model ChatGPT. By January 2023, it became the fastest-growing consumer software application in history, with over 100 million users.
What makes AI different from your regular computing and software?
The Royal Institute differentiates regular computing from AI as follows: traditional computing follows fixed, explicitly programmed instructions and predefined rules to perform specific tasks, while AI systems are designed to mimic human cognitive abilities, learning from data rather than relying on explicit programming.
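A small sketch can make that distinction concrete. The first function below follows a fixed, explicitly programmed rule; the second "learns" the same relationship from example data by fitting a straight line to it. The temperature-conversion task and the data points are invented purely for illustration.

```python
# Traditional computing: a fixed rule written by a programmer.
def fahrenheit_to_celsius_rule(f):
    return (f - 32) * 5 / 9

# A very simple "learning" approach: infer the relationship from
# example data by fitting a straight line (ordinary least squares).
def fit_line(xs, ys):
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Example data: paired Fahrenheit and Celsius readings.
fahrenheit = [32, 50, 68, 86, 104]
celsius = [0, 10, 20, 30, 40]

slope, intercept = fit_line(fahrenheit, celsius)

print(fahrenheit_to_celsius_rule(75))   # rule-based answer: ~23.9
print(slope * 75 + intercept)           # learned answer: also ~23.9
```

Generative AI systems do the same kind of thing at vastly larger scale, fitting billions of parameters to enormous datasets instead of two numbers to five data points.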
How can AI be used as a potential solution to environmental issues?
AI’s ability to analyze complex datasets is extremely useful in tackling environmental challenges. According to the United Nations, AI is currently being used to:
- Monitor methane emissions: improving emissions data helps drive action and track progress.
- Track air quality: monitoring pollution helps guide public health decisions.
- Measure environmental footprints: assessing the lifecycle impacts of products can lead to more sustainable production.
- Reduce information and communication technology (ICT) emissions: using AI to optimize the sector's own energy, water, and waste impacts.
AI has wide applications for potentially improving resource efficiency and sustainability. According to Yale's School of the Environment, AI-run smart homes could cut households’ CO₂ emissions by up to 40 percent. However, despite its potential, AI’s operation requires significant resources, creating a complicated relationship between technology and sustainability.
How can AI be a potential contributor to environmental issues?
Computer Resources
To power AI, high-performance computing systems require specialized chips and accelerators. Common accelerators include GPUs (graphics processing units), which split complex problems into smaller tasks processed simultaneously. Another is the NPU (neural processing unit), designed to mimic the human brain for deep learning applications.
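The sketch below is a rough analogy for that kind of parallelism rather than real GPU code: it splits one large job into chunks that are processed simultaneously by separate worker processes. An actual GPU runs thousands of much lighter-weight computation lanes on specialized hardware, and the task here is invented for illustration.

```python
# Rough analogy for GPU-style parallelism: split one large task
# into chunks and process the chunks at the same time.
from concurrent.futures import ProcessPoolExecutor

def sum_of_squares(chunk):
    """The small task each worker performs on its piece of the data."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    n_workers = 4
    chunk_size = len(data) // n_workers
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

    # Each chunk is handled by a separate process, then the
    # partial results are combined into the final answer.
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        total = sum(pool.map(sum_of_squares, chunks))

    print(total == sum(x * x for x in data))   # True: same result, computed in parallel
```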
Hardware like GPUs has a sizable environmental footprint, from raw material extraction to transportation. According to MIT News, mining for these materials often involves toxic chemicals and harmful environmental practices. This is not specific to AI; it is common to modern technology in general, like the chips needed for your smartphone or laptop. Such mining can harm the environment and human health through air pollution, water contamination, habitat destruction, and exposure to toxic substances.
In 2023, an estimated 3.85 million GPUs were shipped to data centers by NVIDIA, AMD, and Intel, up from 2.67 million in 2022.
According to the MIT Technology Review, these components also have short lifespans, typically 2–5 years, generating large amounts of e-waste. The hardware contains valuable metals like copper, gold, silver, aluminum, and rare earth elements, as well as hazardous materials such as lead, mercury, and chromium, which makes e-waste difficult to dispose of. According to a study published in Nature Computational Science, depending on the adoption rate of generative AI, the technology could add 1.2 million to 5 million metric tons of e-waste in total by 2030.
Extending the lifespan of AI hardware could reduce harmful extraction and e-waste. Though the issue extends beyond AI, it highlights the potential for improved sustainable technology practices.
Energy Resources
According to MIT, AI is extremely energy intensive: “Scientists have estimated that the power requirements of data centers in North America increased from 2,688 megawatts at the end of 2022 to 5,341 megawatts at the end of 2023, partly driven by the demands of generative AI. Globally, the electricity consumption of data centers rose to 460 terawatts in 2022. This would have made data centers the 11th largest electricity consumer in the world, between the nations of Saudi Arabia (371 terawatts) and France (463 terawatts), according to the Organization for Economic Co-operation and Development. By 2026, the electricity consumption of data centers is expected to approach 1,050 terawatts (which would bump data centers up to fifth place on the global list, between Japan and Russia).”
Training AI models requires enormous energy, long before consumers ever use them on their personal devices. According to a study from Cornell University, training GPT-3, a model with 175 billion parameters, is estimated to have consumed 1,287 megawatt-hours of electricity and generated 552 tons of carbon dioxide, equivalent to 123 gasoline-powered vehicles driven for one year.
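A quick back-of-the-envelope check, sketched below, shows how those numbers fit together. The per-vehicle benchmark of roughly 4.6 metric tons of CO₂ per typical gasoline car per year is a commonly cited EPA-style estimate and is our assumption, not a figure taken from the Cornell study itself.

```python
# Back-of-the-envelope check on the GPT-3 training figures cited above.
training_energy_mwh = 1_287        # reported training electricity use
training_co2_tons = 552            # reported training emissions (metric tons)
co2_per_car_tons_per_year = 4.6    # assumed emissions of a typical gasoline car

# Implied carbon intensity of the electricity used during training:
kg_co2_per_kwh = (training_co2_tons * 1000) / (training_energy_mwh * 1000)
print(f"Implied grid intensity: {kg_co2_per_kwh:.2f} kg CO2 per kWh")           # ~0.43

# Expressed as car-years of emissions:
print(f"Equivalent cars: {training_co2_tons / co2_per_car_tons_per_year:.0f}")  # ~120
```

Landing near 120 cars suggests the study's 123-vehicle comparison assumes a typical passenger-vehicle emissions figure close to this one.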
Although it's hard to measure the energy and resources required for a single generative AI query, it’s estimated to be four to five times that of a conventional search engine query, such as a Google search. And now even your Google searches are often topped by Gemini-powered AI Overviews, which require more energy because they generate new information rather than drawing from existing information.
According to the International Energy Agency, due to the increase in AI use, the U.S. economy is set to consume more electricity in 2030 for processing data than for manufacturing all energy-intensive goods combined.
The environmental impact of this electricity depends on the local energy grid and how carbon-intensive its generation is. According to the U.S. Energy Information Administration, about 60% of total electricity in the U.S. is generated from fossil fuels, and in Nebraska, coal-fired power plants supplied 45% of the state's total net generation in 2023.
According to research from Goldman Sachs, renewable or green energy sources will be needed to power AI data centers in the future and are receiving large investments from AI providers. Goldman Sachs predicts that about 40% of the new capacity built to meet increased energy demand will come from renewable sources. Ultimately, the environmental impact of this energy depends on the local energy grid.
Water Resources
AI data centers also require large amounts of fresh water to cool their servers. With thousands of servers running all day, every day to power AI, a great deal of heat is produced, and fresh water is used, often through evaporative cooling, to keep the servers operating efficiently. According to the United Nations Environment Programme, AI-related infrastructure globally is predicted to soon consume six times more water than Denmark, a country of 6 million residents.
Water consumption differs greatly depending on factors like the size of the data center, the type of cooling system, temperature, humidity, and season. According to Dgtl Infra, the average water consumption of a hyperscale data center, like Google's for example, is 550,000 gallons per day, or about 200 million gallons per year. Smaller wholesale and retail data centers consume less, around 18,000 gallons per day, or 6.57 million gallons per year. Most of this water comes from municipal or regional water utilities and is mainly potable drinking water.
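The daily and yearly figures above are consistent with each other, as the quick conversion below shows; it simply multiplies the reported daily use by 365 days.

```python
# Consistency check on the Dgtl Infra water figures cited above.
DAYS_PER_YEAR = 365

hyperscale_gallons_per_day = 550_000
smaller_gallons_per_day = 18_000

print(f"Hyperscale: {hyperscale_gallons_per_day * DAYS_PER_YEAR / 1e6:.0f} million gallons/year")  # ~201
print(f"Smaller:    {smaller_gallons_per_day * DAYS_PER_YEAR / 1e6:.2f} million gallons/year")     # 6.57
```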
According to Stanford University, “Google’s data center water consumption increased by about 20% from 2021 to 2022 and by about 17% from 2022 to 2023, and Microsoft saw about 34% and about 22% increases over the same periods…” By 2028, data center water consumption could double or even quadruple.
According to Stanford University, water consumption by data centers is a serious concern, especially in regions already under high water stress. Although AI is considerably less water-intensive than many other sectors, its water use is still substantial.
AI data centers are increasingly located in high water-stress areas. According to Bloomberg News, two-thirds of the new data centers built or in development since 2022 are in areas with high levels of water stress. AI data centers are being built all over the country, but 72% of the new centers in high water-stress areas are in just five states: Virginia, Arizona, Texas, Illinois, and California.
Some used water is discharged back into municipal systems. For example, data center company Equinix used about 5,970 megaliters in 2023, equal to the annual use of 14,400 U.S. homes, with 60% evaporated and 40% returned.
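As a rough sanity check on that household comparison, the sketch below converts 5,970 megaliters into gallons per home per day. The unit conversion is standard, and the resulting figure of roughly 300 gallons per household per day is in line with common estimates of average U.S. home water use; the framing is our own illustration, not part of the Equinix disclosure.

```python
# Rough check: does 5,970 megaliters match the annual use of ~14,400 U.S. homes?
GALLONS_PER_MEGALITER = 264_172   # 1 megaliter = 1 million liters
DAYS_PER_YEAR = 365

total_megaliters = 5_970
homes = 14_400

gallons_total = total_megaliters * GALLONS_PER_MEGALITER
gallons_per_home_per_day = gallons_total / homes / DAYS_PER_YEAR

print(f"{gallons_per_home_per_day:.0f} gallons per home per day")   # ~300
```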
In a Yale School of the Environment interview, Shaolei Ren, an associate professor of electrical and computer engineering at UC Riverside, estimated that a conversation of roughly 10 to 50 responses with GPT-3 uses about a half-liter of fresh water, though this varies by location, system, and model.
Disclaimer
Water and energy usage by AI data centers vary widely due to factors like climate, center size, and cooling systems. Much of the data above is approximate, as companies aren’t required to disclose exact usage. According to the Yale School of the Environment, “Right now, it’s not possible to tell how your AI request for homework help will affect carbon emissions or freshwater stocks.” Transparency and regulation are potential paths to accountability: just as Google Flights now shows emissions estimates, we could see similar visibility for AI.
AI in the United States:
The AI boom raises potential environmental and human health concerns across the U.S. One example can be seen in southwest Memphis, Tennessee (CNN News).
The nearby community of Boxtown is a historically marginalized Black community with a legacy of pollution. “Colossus,” as the AI super data center is known, runs on 150 megawatts from the local public utility, enough to power 100,000 homes. This concerns residents, who have experienced blackouts in previous winters.
The primary concern is the dozens of temporary gas-powered turbines being used to power the facility. Because the turbines are classified as temporary, they don’t require air permits, yet they release large amounts of toxic pollutants into the air, harming the atmosphere and the health of those in the surrounding area. Shelby County, where the center is located, already has poor air quality, receiving an “F” from the American Lung Association. Though the turbines are temporary, the timeline for their removal is unclear.
Residents have long raised awareness around environmental and human health concerns – from crude oil pipelines to clean air and water. The AI boom adds another layer to these ongoing challenges.
How are AI developers approaching sustainability?
Google (Gemini AI): (Google’s sustainability goals, not specifically AI-related)
- Carbon: Reduce Scope 1, 2, and 3 emissions by 50% by 2030
- Water: Replenish 120% of freshwater used
- Waste: Achieve zero waste to landfill at data centers
- Matched 100% of electricity use with renewable energy in 2023
- Focus on efficiency, carbon-free energy, water responsibility, and circular economy
- Net-zero emissions and water positive by 2030
Microsoft (Copilot AI): (Microsoft’s sustainability goals, not specifically AI-related)
- Carbon negative, water positive, zero waste by 2030
- Support for societal shifts toward net-zero economy
- No sustainability goals found on their website
Regulations of AI
There are currently no national U.S. regulations on AI, but there are state and local regulations in places like Colorado, New York, and Illinois.
Globally, the EU passed the AI Act, the first comprehensive legal framework for AI worldwide, which outlines risk-based rules for AI developers and users. Its stated goal is that “this Regulation should be applied in accordance with the values of the Union enshrined in the Charter, facilitating the protection of natural persons, undertakings, democracy, the rule of law and environmental protection, while boosting innovation and employment and making the Union a leader in the uptake of trustworthy AI.”
Final Thoughts
Artificial intelligence is a part of our future, but at what cost? AI can benefit the environment, but also consumes vast resources. Moving forward, we should try to pursue technological innovation alongside environmental responsibility, which can potentially be done by engaging with companies and prioritizing transparency. Before you ask ChatGPT for help, remember the resources behind it.
Learn More: Upcoming Event
Join Conservation Nebraska for an engaging webinar on energy and water use in manufacturing, industry, and AI, and how this relates to environmental impacts across our state. The webinar will be led by Dr. Bruce Dvorak, a Professor of Civil and Environmental Engineering at the University of Nebraska-Lincoln. “Manufacturing, Energy, Water and AI: A Guide to Industry’s Environmental Impact in Nebraska” will be held on Monday, August 11th, from 6:00 to 7:00 PM at Bennett Martin Public Library or via Zoom.