
Artificial intelligence (AI) holds many promises for us today, including guidance toward our ecological future. However, if we do want to save the planet with AI, we must also consider the environmental footprint that comes with deploying this technology itself. It's hard to predict exactly how much AI will scale over the next few years, but it is crucial to take steps to make it as energy-efficient as possible moving forward.

Across the globe, data servers are humming away to bring our digital world to life. The planet's approximately 8,000 data centers are the foundation of our online existence and will grow further with the advent of artificial intelligence. However, this presence comes at a cost: operating these essential facilities requires vast amounts of power.

According to research projections, by 2025, the information technology (IT) sector may account for 20% of the total global electricity consumption and contribute as much as 5.5% to the world's carbon emissions.

Hence, the environmental repercussions associated with AI cannot be dismissed, as neglecting them could lead to lasting and irreversible consequences. The world's data centers are already being adapted in preparation for AI integration; indeed, a Google executive has described the shift as a rare and pivotal turning point in the field of computing.

The training of GPT-3 on a dataset of more than 500 billion words is estimated to have required 1,287 megawatt-hours of electricity and 10,000 computer chips. That is roughly the amount of energy needed to power around 121 homes in the United States for a year.
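The household equivalence can be sanity-checked with a quick calculation. The average-consumption figure used below (roughly 10,600 kWh per US household per year, an EIA estimate) is an assumption, not a number from this article:

```python
# Back-of-envelope check: how many US households could 1,287 MWh power for a year?
# Assumption: an average US household uses ~10,600 kWh per year (EIA estimate).
TRAINING_ENERGY_MWH = 1_287
AVG_HOUSEHOLD_KWH_PER_YEAR = 10_600

training_energy_kwh = TRAINING_ENERGY_MWH * 1_000
household_years = training_energy_kwh / AVG_HOUSEHOLD_KWH_PER_YEAR
print(f"{household_years:.0f} household-years")  # prints roughly 121
```

The result lines up with the figure of around 121 homes quoted above.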

A recent study by Google and the University of California, Berkeley, reported that training GPT-3 resulted in 552 metric tons of carbon emissions, which is equivalent to driving a passenger vehicle for 2 million kilometers or flying from Australia to the UK over 30 times.
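The car-driving equivalence implies a per-kilometre emission factor, which can be derived directly from the article's own numbers as a plausibility check:

```python
# Derive the emission factor implied by the article's equivalence between
# 552 t of CO2 from training GPT-3 and 2 million km of passenger-vehicle driving.
TRAINING_EMISSIONS_T = 552          # metric tons of CO2
EQUIVALENT_DISTANCE_KM = 2_000_000  # passenger-vehicle kilometres

grams_per_km = TRAINING_EMISSIONS_T * 1_000_000 / EQUIVALENT_DISTANCE_KM
print(f"{grams_per_km:.0f} g CO2/km")  # prints 276
```

Around 276 g of CO2 per kilometre is in the range typically cited for a petrol passenger car, so the two figures are internally consistent.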

The Exhaustive Work of AI

Artificial intelligence has emerged as one of the most revolutionary technologies, bringing about unprecedented transformations in various industries and already exerting a significant influence on our daily lives.

The development of generative AI tools such as OpenAI's GPT-4 or Google's PaLM2 can be divided into two pivotal stages: the initial training phase and the subsequent execution, or inference. Modern, larger AI models demand increasingly powerful graphics processing units (GPUs) and protracted training times, driving up resource expenditure and energy usage.

In 2023, AI witnessed an explosive surge in popularity, moving from the technological periphery to center stage. AI-driven machines constantly shuttle data between memory and processors, and each of these transactions consumes energy. As tasks become more intricate and data-intensive, two key factors grow exponentially: the requirement for expanded memory storage and the demand for energy resources.

Environmental Costs

The growth of AI technology and the expansion of data centers are contributing to a significant increase in energy demand, raising immediate concerns about their adverse effects on environmental sustainability and climate change.

In greater detail, AI language models operate on extensive distributed systems that span multiple servers and data centers. Consequently, the energy consumption cannot be attributed to a single model or server but is shared across the entire infrastructure, making it challenging to pinpoint specific energy usage for any individual model. Furthermore, as AI models serve various purposes and are concurrently accessed by numerous users, both the workload and energy required tend to fluctuate.

AI facilities rely on large-scale operations and the substantial energy usage that those entail. According to an expert at the University of Pennsylvania School of Engineering, each of these facilities draws between 20 and 40 megawatts of power. Even at the lower end, that is enough to power nearly 16,000 households for a year.
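That household figure can be reproduced by assuming the facility draws power around the clock and dividing by average household consumption (the ~10,600 kWh per year value below is an EIA estimate, not from the article):

```python
# Check the household equivalence of a facility drawing 20 MW continuously.
# Assumption: an average US household uses ~10,600 kWh per year (EIA estimate).
FACILITY_DRAW_MW = 20
HOURS_PER_YEAR = 8_760
AVG_HOUSEHOLD_KWH_PER_YEAR = 10_600

annual_kwh = FACILITY_DRAW_MW * 1_000 * HOURS_PER_YEAR
households = annual_kwh / AVG_HOUSEHOLD_KWH_PER_YEAR
print(f"{households:,.0f} households")  # prints ~16,500
```

A continuous 20 MW draw works out to roughly 16,500 households, consistent with the "nearly 16,000" figure quoted above.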

As companies like OpenAI, Google and Microsoft compete to develop increasingly advanced AI models, they do not fully disclose the precise amount of electricity and water they use for training and running their AI models, the sources of energy powering their data centers or the locations of some of their data centers.

One non-peer-reviewed study by researchers at UC Riverside estimates that training GPT-3 in Microsoft's state-of-the-art US data centers may have consumed 700,000 liters of freshwater.

Additionally, manufacturers are racing to produce faster chips, and the faster these chips operate, the more heat they generate. This necessitates greater cooling, a process that accounts for 40% of the total energy expenditure in a data center. The global shift to liquid cooling presents its own set of challenges, as it in turn requires substantial water usage.

Addressing the Challenge

Imagine AI alone increasing global energy demand by 10%. The figure seems plausible even considering the limited number of models currently in existence, and it may well prove conservative, as businesses and governments feel increasingly compelled to adopt AI technology.

In response to this surge in energy demand, every conceivable energy source, including oil and gas wells, wind turbines, coal mines and nuclear plants, will be pressed into service, pending a breakthrough in energy efficiency.

While artificial intelligence holds promise for enhancing efficiency across various sectors, questions arise about whether this heightened efficiency actually contributes positively to the bottom line, especially when the costs exceed the savings in labor and waste reduction.

To mitigate AI’s carbon footprint, numerous experts recommend that AI pioneers incorporate renewable energy sources into their operations. In practical terms, the emphasis should be placed on discovering more sustainable methods to meet the substantial energy requirements of AI.

The relocation of AI processing to data centers is helping to reduce AI's carbon footprint, as data centers become more operationally efficient and increasingly adopt environmentally friendly energy sources.

Generative AI models are particularly concerning because they consume the most energy, and there is a pressing need to make them greener before they become more widespread. For instance, fine-tuning and prompt training within specific content domains is considerably more energy-efficient than training entirely new large models from scratch. Moreover, these approaches often provide more value to many businesses than generic model training.

Furthermore, while standard CPUs consume an average of 70 watts and GPUs consume 400 watts of power, tiny microcontrollers require just a few hundred microwatts, which is several orders of magnitude less power, to process data locally without relying on data servers.
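The scale of that gap can be quantified with quick arithmetic. The 300 µW value below is an assumed midpoint of "a few hundred microwatts":

```python
# Compare typical power draws quoted in the article.
# Assumption: 300 µW as a representative microcontroller figure.
CPU_W = 70
GPU_W = 400
MCU_W = 300e-6  # 300 microwatts

print(f"CPU vs MCU: {CPU_W / MCU_W:,.0f}x")  # prints ~233,333x
print(f"GPU vs MCU: {GPU_W / MCU_W:,.0f}x")  # prints ~1,333,333x
```

Under these assumptions, a microcontroller draws hundreds of thousands to over a million times less power than a CPU or GPU, which is why local processing on tiny devices is so attractive from an energy standpoint.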

Given these considerations, there is a growing movement to make AI modeling, deployment and utilization more environmentally sustainable. The objective is to replace energy-intensive practices with eco-friendly alternatives. Both vendors and users must embrace this change to ensure that AI algorithms can be deployed widely without harming the environment. The technology is unprecedented, and it’s here to stay. We must therefore be sure to navigate this journey in a sustainable manner.
