September 12, 2024

AI and Energy Expansion: Driving Demand for Reliable and Affordable Power

In today’s rapidly evolving digital world, Artificial Intelligence (AI) has emerged as a transformative force, touching almost every facet of our lives.

From voice assistants like Siri and Alexa to AI-powered recommendations on streaming platforms and the rise of “smart homes” with AI-powered thermostats, alarms, and appliances, AI’s influence on our daily lives is growing rapidly.

Beneath the surface of these everyday conveniences lies a vast, energy-hungry network that powers AI’s expanding capabilities and reach. Data centres and “hyperscaler” facilities are the infrastructure meeting that growing energy demand.

What are data centres, and why are they expanding so rapidly?

Data centres are complex facilities built to house IT equipment and to process and manage large volumes of digital data and computing workloads. They provide the infrastructure businesses, organizations, and online services need to store and access data, run applications, and maintain reliable, high-speed network connectivity. Every email sent, video streamed, or search query performed relies on data centres to function seamlessly.

Think of a data centre as a vast digital library, storing and processing enormous volumes of information. These facilities house countless servers, networking equipment, and storage devices working tirelessly to enable our online activities.

When someone’s phone data gets backed up to the “cloud,” it gets stored in data centres: massive facilities filled with thousands of computer servers running constantly. In the era of 5G and cloud-based storage, data centres have become essential infrastructure, supporting everything from financial transactions to social media to government operations.

Data centres need a continuous and stable supply of energy to operate. According to the International Energy Agency (IEA), they now account for more than 1% of global electricity use. About a third of the world’s roughly 8,000 data centres are in the U.S., compared with about 16% in Europe and almost 10% in China.

In January, the IEA forecast that global data centre electricity demand will double from 2022 to 2026, with AI a significant driver of that growth.

What is a “hyperscaler”?

A “hyperscaler” takes the concept of data centres to the next level.

A hyperscaler is a cloud service provider that manages large data centres and offers cloud solutions to businesses and software vendors. Hyperscalers help businesses handle computing, storage, and other IT processes for millions of users and provide access to cutting-edge technologies like artificial intelligence, machine learning, and big data analytics.

Hyperscalers use hyperscale computing, a flexible approach to processing data that can scale up or down quickly based on data traffic. They apply this approach to their data centres and cloud services to accommodate fluctuating demand. Hyperscalers can adapt to market changes rapidly and cost-effectively, and their networks of linked servers add redundancy that makes services more reliable. They also offer predictable costs, which helps companies plan and meet their business goals.

These colossal facilities handle the immense computational demands of AI. With hundreds of thousands of servers and advanced infrastructure, hyperscaler data centres are the powerhouses behind large-scale AI applications. Examples of hyperscalers include Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform (GCP), IBM Cloud, and Oracle.

How are data centres linked to AI, and why is AI so energy-intensive?

Data centres are energy-intensive operations for several reasons. First, the sheer number of servers and pieces of networking equipment running continuously consumes significant electricity. Second, these facilities require robust cooling systems to prevent overheating, which adds further to energy consumption.

However, AI has vastly increased the energy needed in these centres.

AI data centres use a lot of energy because they perform computationally intensive tasks, such as training machine learning (ML) and deep learning models. These tasks require large data sets, complex algorithms, and specialized hardware, all of which consume significant processing power and electricity. The energy a data centre uses is roughly proportional to the amount of computation it performs, so larger AI models and larger data centres require more power.


For example, one study found that using an AI model to generate a single image consumes as much energy as charging a smartphone. Similarly, a ChatGPT query uses about ten times more energy than a standard Google search. Researchers have also found that the cost of the computational power required to train these models is doubling every nine months, with no slowdown in sight.
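To put that nine-month doubling time in perspective, here is a minimal back-of-the-envelope sketch of how quickly such growth compounds. The 9-month figure comes from the article; the 36-month horizon is an arbitrary example chosen for illustration, not a forecast.

```python
# Illustrative only: what a nine-month doubling time implies over three years.
# The 9-month doubling figure is quoted above; the 36-month horizon is an
# assumed example, not sourced data.

doubling_time_months = 9
horizon_months = 36  # e.g., three years

doublings = horizon_months / doubling_time_months   # 4 doublings
growth_factor = 2 ** doublings                        # 2^4 = 16x

print(f"Over {horizon_months} months: {doublings:.0f} doublings, "
      f"roughly {growth_factor:.0f}x the starting training cost")
```

In other words, at that pace the cost of training state-of-the-art models would grow by roughly an order of magnitude every few years.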

Some experts predict that data centre electricity consumption will triple by 2030, reaching 390 terawatt hours (TWh), about the same amount of electricity used by 40 million U.S. homes. Operating 24/7, a single large data centre can consume around 875 million kilowatt hours (kWh) of electricity annually.
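As a rough sanity check, the sketch below converts those figures into more familiar terms. The 875 million kWh and 390 TWh numbers come from the article; expressing the former as a continuous load and dividing the latter across homes are assumptions added here purely for illustration.

```python
# Illustrative back-of-the-envelope conversions of the figures quoted above.
# Assumes 8,760 hours in a year; results are rough estimates, not sourced data.

HOURS_PER_YEAR = 8_760

# 875 million kWh per year expressed as a continuous (24/7) load:
annual_kwh = 875_000_000
average_load_mw = annual_kwh / HOURS_PER_YEAR / 1_000   # ~100 MW around the clock

# 390 TWh spread across 40 million homes:
total_kwh = 390e9                    # 390 TWh expressed in kWh
homes = 40_000_000
kwh_per_home = total_kwh / homes     # ~9,750 kWh per home per year

print(f"875 million kWh/yr is about {average_load_mw:.0f} MW of continuous load")
print(f"390 TWh across 40 million homes is about {kwh_per_home:,.0f} kWh per home per year")
```

That per-home figure is broadly in line with typical annual U.S. household electricity use, which is consistent with the comparison above.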

The need for balanced energy solutions will become even more critical as AI reshapes our world. The demand for reliable, affordable, and lower-carbon energy to power these facilities is paramount.

That’s why Capital Power is committed to meeting this challenge head-on. Our investments in balanced energy solutions, including natural gas and renewables, and our focus on efficiency position us to provide the reliable, affordable, and clean energy needed to power AI’s future.

Learn more about our natural gas and renewable fleet powering businesses and homes across North America.

View our Operations