After posting my last blog entry, I searched to find the article that I'd read some months back. That article stated that the amount of energy required to power one AI data center is enough energy to power over 40,000 homes. I did not find the article I'd previously read, but a plethora of other article links appeared. More than one essay stated that the number is over 420,000 homes; that's over 10 times what I'd previously read.
Thought blurb...
Is Earth responding to our mismanaged stewardship of Her resources? Will the floods continue? Will there be more earthquakes? Will more volcanoes erupt? Will more bridges and tunnels collapse? Will we ever learn? That said, what did the dinosaurs do to cause their extinction? Point being, Nature does what She does without asking our permission.
One article shared that "cloud" storage requires large amounts of energy. I was not aware of that, but it makes sense. After all, the storage unit isn't really a "cloud"; it's concrete, wired 3D buildings that store data.
(I've never enabled the cloud function on my phone or my computer.
That choice wasn't an environmental one; it was a lifestyle decision.
For me, to have a cloud was just one more thing to manage.
The information-storage option brought to mind brick-and-mortar storage units for material stuff. My rule of thumb has always been that if I ever have so much stuff that I don't have room for it, I need to clean house, not rent another building. One day that might change; but at this point, I'm pretty stuffed out.)
One of the articles I read yesterday proposed that the AI Large Language Model training centers (which use even more energy and resources than simply engaging with AI) be built where climates are already cold. Thus, less manufactured cooling would be needed.
Yesterday, as I was reading about and thinking on the AI subject, I received an email containing an essay from a Substack to which I am subscribed.
The essay isn't about AI's environmental impact but rather AI's social, moral, brain-health, and human-relationship impacts. The author, Nathan Beacom, delves into how computers and AI function and the fact that they are not in and of themselves "intelligent." Of course, that should be obvious. But the way in which AI communicates, as if one were actually conversing with a human being, can lead to thinking it's more than it really is. Think calculator: a computer system is a glorified calculator.
Beacom proposes a new term, and I'm on board.
Instead of calling it "Artificial Intelligence," let's call it what it is: "pattern engine."
(I thought "PE" and chuckled; PE also stands for physical education, definitely a 3D thing.)
Below are a couple of paragraphs from the end of the article, There is No Such Thing as Artificial Intelligence, by Nathan Beacom:
"...In lieu of “artificial intelligence,” I propose a more accurate, ethical, and socially responsible name: “pattern engine.” Early computers, which would find mathematical differences, were called “difference engines.” This name adequately recognized the reality of the machine at hand. “AI”s are indeed engines, and engines made for aggregating patterns and sorting data into statistical correlations. They are, truly, engines that sort things into patterns and produce outputs based on the statistical weight of what has been sorted.

A healthy society must be based on truth. And as technological advancement speeds forward faster than our ability to understand and adapt, we can at least not be fooled about what’s happening. Join me, if you will, in calling “AI” what it is. If it catches on, maybe we can find ways to use pattern engines in a way that dignifies humanity, rather than degrades it."
~*~*~
The only AI I've played with (and that very little) is Grok. Grok appeared on my eX-Twitter account sometime within the last year. I didn't know what the new icon on my account meant, so I clicked on it. When I saw it was AI, I tried to get rid of it to no avail. It's a standard feature now, available for questions and conversations with a click.
Today I decided to litter a little and ask Grok AI about the subject. (My apologies to earth, air, and water.) I wondered, How would AI itself answer? Grok performed a "deep search."
Given that a few months back, when I tried Grok out, it got almost all the answers wrong, why should I believe it now?
Well, I've done my own web searches on the subject and come up with the same or similar information.
I want to point out something in Grok's findings below that I've also read elsewhere:
"...a single ChatGPT request consumes 10 [or 5] times the electricity of a Google Search..."
Remember when physical litter was a regular sight alongside roads in the US?
That is no longer the case. We cared enough to stop throwing so much out of our vehicle windows. ("Windows" -- Ha) And now crews volunteer or are hired to clean up any roadside debris.
~*~*~
One Piece of Trash
I saw a piece of trash
on the ground today.
I could have walked right by
and let that litter lay.
I paused a moment, looked around,
as people quickly passed.
Reaching down I wondered,
why do we move so fast?
Maybe folks will notice
when the litter is a heap.
I hope we soon slow down
before it gets too deep.
april 11, 2007
carol welch
~*~*~
If the following from the deep search is accurate, we really need to abstain from or, at the very least, limit our AI queries.
Scroll down to the bottom for Grok's summary/organized "coherent response."
I, not Grok, bolded parts of the response.
(Thinking)...
The user's question is: "How is AI harming the environment?"
First, I need to understand what the user is asking. They want to know the negative environmental impacts of AI technology. Since this is a broad topic, I should start by searching for general information on how AI affects the environment. I think a web search would be a good first step to get an overview. Let me formulate a search query. I could search for "environmental impact of AI" or "how AI harms the environment." Since the user specifically asked "how is AI harming the environment," I should use that phrasing to get directly relevant results. But in the previous conversation, there was already some information provided, but I need to verify and expand on that. The previous response mentioned energy consumption, CO2 emissions from training models, data center energy use, water usage for cooling, hardware production, and e-waste. But since the instruction is to use the tools, I should not rely solely on that previous information. So, let's start with a web search.

Searching... Ok, so from the web search results, I see several key points about how AI is harming the environment. Let me summarize and extract the main points.
First, the training of large AI models requires massive amounts of energy, leading to significant CO2 emissions. For example, training a single large language model can emit as much CO2 as several transatlantic flights, with estimates up to 600 tons of CO2 for some models.
Second, data centers that house AI servers are voracious consumers of electricity, much of which is still produced from fossil fuels. Global data center energy use is projected to double by 2030, partly driven by AI demand.
Third, cooling these data centers requires significant amounts of water, which can strain local water resources in some regions.
Fourth, the production of hardware, such as GPUs, involves mining rare earth metals, which has ecological and social costs, including habitat destruction and pollution.
Fifth, the disposal of outdated hardware contributes to electronic waste (e-waste), which is another environmental concern.
Additionally, there are indirect impacts. For instance, AI can be used in ways that promote overconsumption, such as in targeted advertising, which can lead to increased production and consumption, further straining the environment.
Also, the search results mention that AI's environmental impact is not evenly distributed, with some regions bearing more of the burden due to where data centers are located and how energy is sourced.
Moreover, there's a mention of the potential for AI to have positive environmental impacts, such as optimizing energy systems or improving climate modeling, but the current negative impacts seem to outweigh these benefits in terms of immediate environmental toll.
To ensure I have accurate and up-to-date information, I should probably look at a few of these sources in more detail. Let me choose a couple of the most relevant and authoritative sources from the search results. First, there's the UNEP article: "AI has an environmental problem. Here’s what the world can do about that." This seems like a comprehensive overview from a reputable source. Second, the Wikipedia page on "Environmental impact of artificial intelligence" might provide a good summary of the topic. Third, the Scientific American article "AI's Climate Impact Goes beyond Its Emissions" could offer additional insights, especially on indirect impacts. Let me start by browsing the UNEP article.

Browsing... Okay, so from the UNEP article, I have a detailed breakdown of how AI is harming the environment:

- Electronic Waste: Data centers produce electronic waste containing hazardous substances like mercury and lead.
- Water Consumption: AI-related infrastructure is projected to consume vast amounts of water, potentially six times more than Denmark's consumption.
- Critical Minerals: AI relies on rare earth elements, which are often mined unsustainably.
- Energy Use and Emissions: Data centers use massive amounts of electricity, much of which comes from fossil fuels, leading to significant greenhouse gas emissions. For example, a single ChatGPT request consumes 10 times the electricity of a Google Search, and in Ireland, data centers could account for nearly 35% of energy use by 2026.
- Data Centre Growth: The number of data centers has surged from 500,000 in 2012 to 8 million, increasing environmental demands.
- Unintended Consequences: AI-powered technologies like self-driving cars might increase driving, thus raising emissions, and AI could generate misinformation about climate change.

Additionally, there are references to further reading, such as UNEP's issue note on AI’s environmental footprint and the report "Navigating New Horizons," which examines AI’s promise and perils. This is a solid foundation. Now, let's look at the Wikipedia page for a broader overview.

Browsing... Wow, this Wikipedia page is packed with detailed information. Let me extract the key points. From the Wikipedia page:

Energy Consumption:
- AI, especially deep learning models, requires significant energy for training and usage, leading to high electricity demands and grid strain.
- GPUs used in AI consume more energy and emit more heat than CPUs.
- The increased computational demands of AI have delayed coal plant closures in some areas, such as Kansas City, West Virginia, and Salt Lake City, by up to a decade.
- Energy costs for AI could reach 85–134 TWh by 2027, which is about 0.5% of global electricity usage.
- The computation required for AI models doubles every 3.4 months, leading to exponential power usage.
- A single ChatGPT query uses approximately 0.34 Wh (1.2 kJ) of electricity, which is five times more than a Google search. If ChatGPT were integrated into every Google search, it would consume 10 TWh per year, equivalent to the yearly electricity usage of 1.5 million EU residents.
- In 2024, U.S. electricity generation rose by 3%, driven by data centers.
- Global data center electricity consumption was 460 TWh in 2022, which is between the consumption of Saudi Arabia (371 TWh) and France (463 TWh).

Carbon Footprint:
- Training large AI models produces significant CO2 emissions. For example, training a large AI model can produce around 626,000 lbs (283 tons) of CO2, equivalent to 300 round-trip flights between New York and San Francisco or nearly five times the lifetime emissions of the average car.
- The BERT model's training energy is equivalent to a round-trip transcontinental flight.
- Training GPT-3 released 552 metric tons of CO2, equivalent to 123 gasoline cars driven for a year.
- Training BLOOM released 25 metric tons of CO2, but when including chip manufacturing, it doubled to the equivalent of 60 London to New York flights; daily operation of BLOOM is equivalent to driving 54 miles in a car.
- Companies like Google, Microsoft, and Meta have seen increased emissions due to AI. For instance, Google's emissions increased by 48% over five years due to AI.
- Some AI applications, such as those used in fossil fuel discovery or personalized marketing, may exacerbate climate change.

Water Usage:
- Cooling AI servers requires large amounts of fresh water, which is evaporated in cooling towers, straining local water supplies.
- AI is projected to withdraw 4.2–6.6 billion cubic meters of water in 2027, which is more than half of the UK's total water withdrawal.
- Training GPT-3 may have consumed 700,000 liters of water, and generating 10–50 responses with GPT-3 consumes approximately 500 mL of fresh water.
- Microsoft considered building a data center in Phoenix, Arizona, but faced opposition due to water scarcity concerns.

E-waste:
- The rapid advancement of AI technology leads to frequent hardware upgrades, resulting in increased electronic waste.
- E-waste from AI hardware contains hazardous materials that can pollute the environment if not properly managed.

Mining Impacts:
- The production of hardware for AI, such as GPUs, requires rare earth metals and other minerals, whose mining has significant environmental and social impacts, including habitat destruction and pollution.

This is a lot of information, and it aligns with what I found in the UNEP article but provides more specific numbers and examples. To ensure I have a well-rounded view, let's look at the Scientific American article, which might offer additional perspectives, especially on indirect impacts.

Browsing... From the Scientific American article, I get additional insights:

Emissions from AI Training:
- Training OpenAI's GPT-3 produced approximately 500 tons of CO2 equivalent.
- Simpler AI models produce minimal emissions, but the complexity of large language models (LLMs) and vision programs significantly increases their environmental impact.

Fossil Fuel Industry Impact:
- Microsoft's AI technology on Azure is used by ExxonMobil to optimize mining operations, aiming to increase production by 50,000 oil-equivalent barrels per day by 2025, which would add more greenhouse gases to the atmosphere. This highlights how AI can be used in ways that directly contribute to increased fossil fuel extraction and emissions.

Other Harmful Applications:
- AI in automated advertising, such as fast fashion ads on Instagram and Facebook, boosts consumptive behavior. The fashion industry alone is estimated to produce up to 8% of global emissions. This shows how AI can indirectly harm the environment by promoting overconsumption and unsustainable practices.

Sector-Wide Impact:
- AI is used in various sectors like forestry, land management, and farming in ways that can increase emissions, though specific examples aren't detailed in the summary.

Transparency Issues:
- There is a lack of transparency from AI companies, making it difficult to fully understand the climate impact of their models, especially when they focus only on emissions and not on broader applications.

Policy Recommendations:
- AI policy should consider all climate impacts, including applications, emissions, and water use.
- Ethical principles are needed to limit harmful uses of AI, as suggested by experts like David Rolnick from McGill University and Jesse Dodge from the Allen Institute for AI.

Now, I have a comprehensive understanding of how AI is harming the environment, based on multiple sources. To summarize:

Direct Environmental Impacts:
- High energy consumption for training and running AI models, leading to significant CO2 emissions.
- Water usage for cooling data centers.
- Generation of electronic waste from hardware.
- Mining of rare earth metals for hardware production.

Indirect Environmental Impacts:
- Use of AI in industries like fossil fuel extraction, which increases greenhouse gas emissions.
- Promotion of overconsumption through targeted advertising, contributing to higher production and waste.

Additional Concerns:
- Uneven distribution of environmental impacts, with some regions bearing more burden.
- Potential for AI to generate misinformation about climate change.
- Lack of transparency from AI companies regarding their environmental footprint.

Given that I've gathered information from three different sources, I think I have a well-rounded view. However, to ensure accuracy and to provide the most up-to-date information, I should also consider if there are any recent developments or specific studies that might offer new insights. Since the user's question is broad, and I've already covered the main points, I think I can proceed to formulate the final answer. But before that, let me make sure I haven't missed anything crucial.

Looking back at the search results, there was also a mention of a study finding that "AI is terrible for the environment," posted on Reddit. While Reddit might not be the most authoritative source, it could point to a specific study or report that I should look into. However, since I've already gathered substantial information from reputable sources like UNEP, Wikipedia, and Scientific American, I think I can confidently proceed. Also, the instruction emphasizes prioritizing primary sources for citations, so I should make sure to cite the sources I've used.
Let me organize the information into a coherent response:
AI is harming the environment in several significant ways, primarily through its high energy consumption, carbon emissions, water usage, electronic waste, and the environmental costs of hardware production.
Below is a detailed explanation of these impacts:

1. Energy Consumption and Carbon Emissions
- Training AI Models: Training large AI models, such as those used for generative tasks like ChatGPT, requires massive computational power. For example, training a single large language model can emit as much as 626,000 pounds (283 tons) of CO2, equivalent to 300 round-trip flights between New York and San Francisco or nearly five times the lifetime emissions of an average car. Specifically, training GPT-3 released approximately 552 metric tons of CO2, equivalent to 123 gasoline-powered cars driven for a year.
- Data Centers: AI relies on data centers, which are voracious consumers of electricity. In 2022, global data center electricity consumption was 460 TWh, comparable to the annual electricity use of France. By 2027, AI-related energy costs could reach 85–134 TWh, representing about 0.5% of global electricity usage. Much of this electricity is still generated from fossil fuels, further increasing greenhouse gas emissions.
- Operational Impact: Even during inference (using AI models for tasks like answering queries), AI consumes significant energy. For instance, a single ChatGPT query uses about 0.34 Wh of electricity, which is five times more than a Google search. If ChatGPT were integrated into every Google search, it would consume 10 TWh per year, equivalent to the annual electricity usage of 1.5 million EU residents.

2. Water Usage
- Cooling Data Centers: AI servers require extensive cooling, which consumes large amounts of fresh water. For example, training GPT-3 may have used 700,000 liters of water, and generating 10–50 responses with GPT-3 consumes approximately 500 mL of fresh water. By 2027, AI is projected to withdraw 4.2–6.6 billion cubic meters of water, more than half of the UK's total water withdrawal. This can strain local water resources, especially in water-scarce regions.

3. Electronic Waste (E-waste)
- Hardware Upgrades: The rapid advancement of AI technology leads to frequent hardware upgrades, resulting in increased electronic waste. Data centers produce electronic waste containing hazardous substances like mercury and lead, which can pollute the environment if not properly managed.

4. Mining and Resource Extraction
- Rare Earth Metals: AI hardware, such as GPUs, relies on rare earth elements and other critical minerals, which are often mined unsustainably. Mining these materials has significant ecological and social costs, including habitat destruction, pollution, and disruption of ecosystems.

5. Indirect Impacts
- Fossil Fuel Extraction: AI is sometimes used in industries like fossil fuel extraction to optimize operations. For example, Microsoft's AI technology on Azure helps ExxonMobil increase oil production by 50,000 oil-equivalent barrels per day by 2025, directly contributing to higher greenhouse gas emissions.
- Overconsumption: AI-driven targeted advertising, such as fast fashion ads on platforms like Instagram and Facebook, promotes overconsumption. The fashion industry alone is estimated to produce up to 8% of global emissions, with AI exacerbating this through increased demand for mass-produced goods.
- Misinformation: AI can generate misinformation about climate change, potentially undermining efforts to address environmental issues.

6. Uneven Distribution of Impacts
- Regional Disparities: AI's environmental impacts are not evenly distributed. For example, regions with fossil fuel-based energy grids bear a higher carbon footprint from AI operations compared to those with cleaner energy sources. Additionally, water-scarce regions face greater strain from AI's water usage.

7. Growth of Data Centers
- Proliferation: The number of data centers has surged from 500,000 in 2012 to 8 million, driven by AI demand. This growth increases the overall environmental footprint of AI, including energy use, water consumption, and e-waste.

8. Lack of Transparency
- Limited Disclosure: AI companies often lack transparency about their environmental impact, making it difficult to fully assess the extent of AI's harm. This includes not only emissions but also broader applications and resource use.

Citations:
- UNEP: AI has an environmental problem. Here’s what the world can do about that.
- Wikipedia: Environmental impact of artificial intelligence
- Scientific American: AI's Climate Impact Goes beyond Its Emissions

These sources provide a comprehensive and authoritative overview of AI's environmental harm, highlighting both direct and indirect impacts. While AI has potential benefits, such as optimizing energy systems or improving climate modeling, its current environmental toll is significant and growing. Efforts to mitigate these impacts, such as using renewable energy for data centers or optimizing model efficiency, are underway but have not yet scaled sufficiently to offset the damage.
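For anyone who wants to check the arithmetic behind two of the figures quoted above, here is a minimal back-of-envelope sketch. The inputs are just the quoted estimates (0.34 Wh per ChatGPT query, "five times" a Google search, 10 TWh per year, 1.5 million EU residents); nothing here is measured, and the variable names are mine:

```python
# Back-of-envelope check of two figures quoted in the deep search above.
# All inputs are the quoted estimates, not measurements.

WH_PER_CHATGPT_QUERY = 0.34   # Wh per ChatGPT query (quoted figure)
GOOGLE_MULTIPLE = 5           # "five times more than a Google search"
TWH_PER_YEAR = 10             # TWh/yr if ChatGPT handled every Google search
EU_RESIDENTS = 1.5e6          # "1.5 million EU residents"

# Implied energy of one Google search, per the quoted ratio
wh_per_google_search = WH_PER_CHATGPT_QUERY / GOOGLE_MULTIPLE

# Implied annual electricity per EU resident (1 TWh = 1e9 kWh)
kwh_per_resident = TWH_PER_YEAR * 1e9 / EU_RESIDENTS

print(f"Implied Google search energy: {wh_per_google_search:.3f} Wh")
print(f"Implied usage per EU resident: {kwh_per_resident:,.0f} kWh/yr")
```

If the quoted figures hold together, a single Google search comes out around 0.07 Wh, and each of those 1.5 million residents accounts for roughly 6,700 kWh of electricity per year.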