
🔆 It's complicated: AI's Complex Relationship with Energy

Plus: AI facial recognition app slammed with heavy fines, OpenAI building out US infrastructure and more

Was this email forwarded to you? Sign up here

🗞️ Issue 34 // ⏱️ Read Time: 7 min

Hello 👋

In last week’s newsletter, we explored the often-overlooked water consumption of AI systems and data centres. Now, we turn our attention to another critical resource: Energy. ⚡️

In this week's newsletter

What we’re talking about: AI’s dual role in the climate crisis: the substantial energy consumption of AI systems and data centres, and the AI-based solutions that could substantially reduce and reshape how we use energy.

How it’s relevant: AI is increasingly integrated into our daily lives, powering critical infrastructure and services. As AI capabilities expand, so does its energy footprint, making this a pressing issue for technologists, policymakers, and consumers.

Why it matters: The increasing electricity consumption of AI contributes to carbon emissions and strains power grids. By addressing AI's energy use now, we can work towards more sustainable technological advancement, help combat climate change and its implications for global energy resources, and ultimately ensure that we move towards a bright future.

Big tech news of the week…

🌍 Anthropic launched Claude for Enterprise. Your entire organization can now collaborate securely with Claude — with no training on chats or files.

⚖️ AI company Clearview has been hit with heavy fines by the Dutch data protection watchdog. “If there is a photo of you on the Internet — and doesn’t that apply to all of us? — then you can end up in Clearview's database and be tracked. This is not a doom scenario from a scary film. Nor is it something that could only be done in China”, the DPA Chairman said in a statement.

🌍 OpenAI is preparing for big infrastructure development in the US, increasing capacity for AI computing and data storage. The project involves international investors, something that has raised national security concerns from the Committee on Foreign Investment in the United States (CFIUS) and the US National Security Council.

Electricity Usage: The Power Behind the Intelligence

The issue: Compute loves energy

“Google’s greenhouse gas emissions are soaring thanks to AI”. This is one of many similar news stories we’ve seen lately. And yes, training large AI models consumes enormous amounts of electricity. As a result of the booming demand for generative AI and GPUs, AI’s electricity demand is forecast to surge, particularly in data centres worldwide. Let’s start with some numbers.

📚 A 2019 study found that training a single large AI model can emit as much carbon as five cars (5 x 🚗) in their lifetimes.

📈 From 2012 to 2018, the amount of compute used in the largest AI training runs doubled every 3.4 months: the energy consumption of AI is growing exponentially.

📊 Research estimates that data centres, cryptocurrencies, and AI consumed about 460 TWh of electricity worldwide in 2022, almost 2% of total global electricity demand, and this is expected to increase by 28% by 2026. 
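As a back-of-the-envelope check on those figures, here is a minimal sketch. The 24,400 TWh figure for total global electricity demand in 2022 is our own assumption, not a number from this newsletter:

```python
# Rough sanity check of the "almost 2%" figure (illustrative numbers only).
data_centre_demand_twh = 460      # estimate quoted above for 2022
global_demand_twh = 24_400        # assumed total global demand in 2022

share = data_centre_demand_twh / global_demand_twh
projected_2026_twh = data_centre_demand_twh * 1.28  # +28% by 2026

print(f"Share of global demand: {share:.1%}")              # ~1.9%
print(f"Projected 2026 demand: {projected_2026_twh:.0f} TWh")  # ~589 TWh
```

Under that assumption, the quoted 460 TWh is indeed just under 2% of global demand.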

Even though these viewpoints and analyst projections differ slightly in timelines, they all point to the same conclusion: Global electricity demand will increase within the next few years. Drastically. And AI will play a big part in this increase.

Why so much power, and what does it have to do with the weather? 

Why are floods, droughts, tropical cyclones and other types of extreme weather becoming more common, and what does AI have to do with it all? 

Artificial Intelligence, especially deep learning models, requires massive computational power for both training and inference. These models, inspired by the human brain's neural networks, simultaneously perform millions or billions of mathematical operations, each consuming energy. The larger and more complex the model, the more power it needs.

The energy-intensive nature of AI directly translates to a significant carbon footprint:

  • Direct Emissions: From fossil fuel-based electricity generation used to power AI systems and data centres.

  • Indirect Emissions: From the production and transportation of hardware used in AI systems.

  • Lifecycle Emissions: Including the environmental cost of manufacturing and disposal of AI hardware.

Data centres are often run at full power 24 hours a day, seven days a week, regardless of demand. Electricity generation often relies on fossil fuels, and when fossil fuels burn they emit carbon dioxide and other gases. This drives climate change, of which extreme weather is one consequence. If you want to know more about it, have a look at this brilliant overview by NASA.

The flip side: Energy loves compute

At the same time, AI offers unsurpassed opportunities to deal with the issues we pointed out above. That’s right - the technology is really powerful and can help reduce energy consumption by tackling challenges such as optimising the energy use of buildings, revolutionising the logistics and transport sectors, and helping innovate new materials. When AI is used to solve climate challenges, there are huge benefits for the planet (and, subsequently, for us as human beings).

Task-Specific Power Consumption

In a first-of-its-kind research effort, Hugging Face and Carnegie Mellon University have provided insights into the energy consumption and carbon emissions of specific AI tasks. The study compares the energy requirements for training, fine-tuning, and inference across various AI models.

Image Generation: By far the most energy-intensive task.

  • 📱 Energy: Generating one image using the most energy-intensive AI model consumes about 50% of a full smartphone charge.

  • 🛻 CO2: Generating 1,000 images with a model like Stable Diffusion XL produces as much CO2 as driving 6.6 kilometres in an average gasoline-powered car.

Text Generation: Significantly less energy-intensive than image generation.

  • 📱 Energy: Generating text 1,000 times consumes about 16% of a full smartphone charge.

  • 🛻 CO2: The least carbon-intensive text generation model examined produced CO2 equivalent to driving 0.0009 kilometres.

Using large generative models for specific tasks (e.g., text classification or question answering) can consume up to 15 times more energy than models fine-tuned for that specific task. So, if you want to translate a text, the emissions will be higher if you use ChatGPT or Claude instead of a model focused on translation, like DeepL or Google Translate.

While training large AI models is energy-intensive, most of their carbon footprint comes from their actual use due to the frequency of operations. For very popular models like ChatGPT, usage emissions could exceed training emissions in just a couple of weeks.
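The training-versus-usage trade-off above can be sketched as a simple break-even calculation. Every number below is a hypothetical placeholder for illustration, not a measured figure for ChatGPT or any other real model:

```python
# Break-even point: after how many days do cumulative inference
# emissions exceed the one-off training emissions?
# All inputs are hypothetical placeholders.
training_emissions_tco2 = 500     # one-off cost of training (assumed)
emissions_per_query_gco2 = 2.0    # per-inference cost (assumed)
queries_per_day = 10_000_000      # popularity of the model (assumed)

# grams -> tonnes: divide by 1e6
daily_usage_tco2 = emissions_per_query_gco2 * queries_per_day / 1e6
break_even_days = training_emissions_tco2 / daily_usage_tco2

print(f"Daily usage emissions: {daily_usage_tco2:.0f} tCO2/day")      # 20
print(f"Usage overtakes training after {break_even_days:.0f} days")   # 25
```

With these placeholder inputs, usage emissions overtake training emissions in under a month, which is the same order of magnitude as the "couple of weeks" cited for very popular models.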

The Emissions of the Key Industry Players

OpenAI

Training their GPT-3 model was estimated to consume 1,287 MWh, equivalent to the annual energy consumption of 120 U.S. homes.
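A quick back-of-the-envelope check of that equivalence, assuming (on our part) that typical US household consumption is around 10,500 kWh per year:

```python
# 1,287 MWh of training energy spread across 120 homes.
training_mwh = 1_287
homes = 120

mwh_per_home_per_year = training_mwh / homes
print(f"{mwh_per_home_per_year:.1f} MWh per home per year")  # ~10.7
```

The implied ~10.7 MWh per home per year is close to typical US household figures, so the comparison holds up.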

Google

In 2023, Google’s total greenhouse gas (GHG) emissions rose 13% year-over-year, driven primarily by higher data centre energy consumption and supply chain emissions.

Microsoft

Microsoft’s total carbon emissions have risen nearly 30% since 2020, primarily due to the construction of data centres.

Addressing the Challenge

There are plenty of ways to solve the AI-related energy challenges we mentioned above. So, where to start?

  1. Create a standard of measurement: A key first step in reducing AI’s climate impact is to quantify its energy consumption and carbon emissions, and to make this information transparent.

  2. Invest in renewable energy sources for data centres.

  3. Consider the full lifecycle environmental impact when designing and deploying AI models.

  4. Develop more energy-efficient algorithms and hardware (e.g., Google's TPUs designed for AI workloads, NVIDIA's energy-efficient GPU architectures).

  5. Be cautious with carbon offsets: Offset programs for AI companies are often suggested as a solution, but these types of programs have proven to be flawed in most cases.

Some of the biggest environmental challenges, including climate change, can be solved through AI. It can shift current practices and catalyse sustainable development. The result could be a future where products and systems are radically improved. Yes, we know it seems complex, and it is. But ultimately, isn’t that what we would like AI to do? Solve real challenges so that we can continue to live our lives and focus on solving other, sometimes smaller, and often better, problems.  

Until next time.
On behalf of Team Lumiera

Emma - Business Strategist
Allegra - Data Specialist

Lumiera has gathered the brightest people from the technology and policy sectors to give you top-quality advice so you can navigate the new AI Era.

Follow the carefully curated Lumiera podcast playlist to stay informed and challenged on all things AI.
