1. Introduction
This blog post deals with the impact of digital, especially data centers, on the planet. This is a topic that I have been studying, with my colleagues at the NATF, for more than a decade. In 2010, the rise of datacenter activity led to many warnings and worries in the news, leading to a first analysis in 2013, which was also made available as a research paper co-written with Erol Gelenbe. The 2010-2020 decade confirmed our analysis, and I published a blog post two years ago showing the relative stability of data center consumption, using actual numbers for 2010-2015 and estimates for 2020, drawing on the great work of Jens Malmodin.
Three years later, looking at the situation at the end of 2023, genAI has happened. In general, the use of AI has increased; in particular, the race for foundation models and the massive use of LLMs have created a surge in two things: the consumption of existing datacenters and a craze to build new massive datacenters. We have all read news articles about the complexity, hence the electricity consumption, of a ChatGPT query versus a Google search, or about the exponential growth of ChatGPT adoption (which has abruptly stopped, as we shall see later). Still, genAI is both a huge success and a “game changer”, which has in turn created lots of requests for opening new data centers. The new (high) power density of these datacenters, combined with their location (with hubs of concentration such as Virginia or Ireland), has created worries about the state of the power grids and their capacity to accommodate such surges of power consumption.
Since the GAFAM and hyperscalers are the major players and the key purchasers of NVIDIA GPUs, they have received a lot of attention as both their electricity consumption and CO2 emissions were growing. At the same time, they continued their progress towards green energy, in an effort to offset the growth of their electricity consumption. At first, it was mostly about “matching” (buying green energy certificates to match their consumption), but it became “assigning” (assigning dedicated decarbonized energy sources to their data centers).
The goal of this blog post is simple: first, to provide a revised evaluation for 2020, as we have more mature numbers, and then to propose an estimate for 2023, taking into account both the increase in computing power demanded by AI growth and the progressive decarbonization of clouds. The focus here is on datacenters, but I will expand my analysis to the complete scope of digital (following Malmodin’s distinction between ICT: Information & Communication Technologies and E&M: Entertainment and Media). Thus, I will provide a complete revised version of the analysis of the impact, energy and CO2, of “digital”, similar to what I proposed two years ago. To conclude this post with some more exciting content, I will also disclose a 2030 forecast, especially a forecast of AI datacenter power consumption, which is the “topic of the moment” as Trump announces Stargate. Remember that this 2030 forecast is just a guess; what matters in this blog post are the 2020 and 2023 sets of figures.
This post is organized as follows. Section 2 is an update of the 2020 analysis which I did two years ago. I will focus on datacenter power consumption and try to establish a credible consensus. Section 3 is about 2023 and the impact of AI. I will use two approaches (bottom-up and top-down) to estimate the growth of hyperscaler data centers and the overall growth. I will also take a look at their claims about “green clouds” fed with renewable energy. Section 4 is both the prospective 2030 section and the place where I will talk about devices, in an effort to extend the numbers proposed by Jens Malmodin.
2. 2020 Update
2.1 ICT new numbers for 2020
The electrical consumption of data centers, excluding bitcoin mining, has remained almost stable between 2010 and 2020. The reference source is the article "ICT sector electricity consumption and greenhouse gas emissions – 2020 outcome", from Jens Malmodin, Nina Lövehagen, Pernilla Bergmark, Dag Lundén, as this team has been working on the subject for over 15 years and produces detailed, well-referenced compilations. The 2020 figure for data centers is consistent with other sources. The International Energy Agency (IEA) documents show 200 TWh for 2015 and a range of 220-320 TWh for 2021, but a 10-year tracking of IEA estimates suggests considering the lower end of the range. Various articles published in 2020 confirm Malmodin's figures and the fact that data center electrical consumption (growing in terms of computing power) remains stable; see for example "Recalibrating global data center energy-use estimates" by Masanet et al (frequently quoted), which provides a detailed explanation of efficiency factors (similar to, but more in-depth than, the 2013 ADT document analysis), or Cunliff in "Beyond the Energy Techlash: The Real Climate Impacts of Information Technology". It should be noted that ITIF figures are lower (around 180 TWh), as are Statista figures (see Figure 2) reused in the article "How much energy do data centers consume?" by Tech Radar.
These figures are very close to those published on this blog two years ago. However, the stability of the global picture hides a deep transformation at work: the move to cloud computing and especially the shift of computing from private infrastructure to hyperscalers (very large cloud players such as AWS, Azure or GCP, which have made their massive scale and hyper-automation a major competitive advantage). Between 2015 and 2020, they grew from 30 TWh to 76 TWh, with growth continuing through 2023. This spectacular increase from AWS, Azure, and Google Cloud is naturally reflected in the figures for Amazon, Microsoft, and Alphabet, and explains part of the media attention they received this year (see Section 3.4). Here is a chart borrowed from Statista that shows this massive shift.
So far, we have left aside bitcoin mining from the data center figures (which is what the IEA and Malmodin do). The true novelty compared to the 2015 figures is not AI, but the electrical consumption of blockchain and bitcoin mining. This specifically refers to bitcoin, because its "proof of work" consensus algorithm is highly energy intensive. Other blockchains like Ethereum, which have transitioned to "proof of stake", are negligible compared to bitcoin mining, and private blockchains (like Hyperledger) are already included in the previous datacenter figures and are completely insignificant. The following figure gives an idea of this consumption, which has been growing continuously since 2018: approximately 70 TWh in 2020 and around 120 TWh in 2023 (nearly half the data center figure, i.e. about a third of global computing consumption once mining is included!). This figure is taken from https://digiconomist.net/bitcoin-energy-consumption. See also the previously cited Malmodin article (75 TWh in 2020 for blockchain) and the article "Bitcoin Energy Consumption Hit New Record in 2023", which quotes the Cambridge University analysis at 120 TWh for 2023.
Let's recall here that, faced with this unbelievable situation (a very high consumption for low societal value), China banned bitcoin mining on its territory in 2021. Consequently, 38% of mining is now operated in the US, which strongly contributes to the datacenter energy supply crisis that we will discuss later – see the EIA analysis for more details.
2.2 Discussing the uncertainty level of datacenter electricity consumption
Before moving on to 2023, I will discuss the level of confidence that we may have in the 220 TWh figure for data centers. We need to calibrate this 2020 value, because uncertainty will amplify as we move forward. The following figure, interestingly, comes from a 2020 IEA publication and shows the distribution by region (with a total of 200 TWh, to be compared with the interval mentioned earlier). This distribution helps to zoom in on each region to consolidate the confidence level.
The 70-80 TWh for the US is confirmed by many sources, such as “datacenter dynamics”, which gives a 17 GW total for 2022, translating roughly to 85-90 TWh; this fits with the trajectory that we shall see later on (and shows that the forecast part of the previous chart is wrong: datacenter consumption started to grow in 2020). The 40 TWh order of magnitude for Europe is also confirmed by different sources such as “Energy Consumption in Data Centres and Broadband Communication Networks in the EU” or the Science paper from Masanet (205 TWh globally in 2018), which one also finds in a longer study.
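As a sanity check on that capacity-to-energy conversion, here is a minimal Python sketch; the ~60% average load factor is my own assumption for the illustration, not a figure taken from the datacenter dynamics article:

```python
# Rough conversion from installed data center capacity (GW) to annual energy (TWh).
HOURS_PER_YEAR = 8760

def annual_twh(capacity_gw: float, load_factor: float = 0.6) -> float:
    """Annual electricity consumption (TWh) for a given installed capacity (GW)."""
    return capacity_gw * load_factor * HOURS_PER_YEAR / 1000  # GWh -> TWh

print(round(annual_twh(17), 1))  # ~89 TWh, in line with the 85-90 TWh quoted above
```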
Let us address the divergence that starts to appear as early as 2020 with three examples. The data center dynamics article “the trouble with data center energy figures” proposes the following chart:
As I have stated many times, I do not care much about forecasts; they are bound to be wrong, and I am not surprised by the wide range of guesses. It is more interesting to look at the divergence about the present (or the immediate past). For instance, the IEA states in its annual report: “Estimated global data centre electricity consumption in 2022 was 240-340 TWh, or around 1-1.3% of global final electricity demand. This excludes energy used for cryptocurrency mining, which was estimated to be around 110 TWh in 2022, accounting for 0.4% of annual global electricity demand”. In that case, if you look at the IEA sources you will find the papers that I have already mentioned. You need to have followed the IEA for ten years to understand why there is an upper bound that reflects “information disclosed by electricity operators” and that seems constantly high. A second interesting example is the Berkeley analysis found in the excellent document “2024 United States Data Center Energy Usage Report”, which I will use later in this paper. I reproduce the following chart because it is a root cause of many of the alarming papers that appeared last year (including the IEA’s). You can see that, up to 2018, it follows all the previously quoted estimates, from Malmodin to Masanet. However, it shows a sharp inflexion, with over 100 TWh in 2020 and 170 TWh in 2023. We shall discuss 2023 in the next section, but the previously quoted anchor of 85 TWh in 2022 makes me skeptical about “the solidity of the solid line” in the picture. It illustrates the point that you need up to three years to produce a scientific estimate.
This 170 TWh estimate for US data centers is important because it has been propagated in many other studies such as the DOE report, and it is in stark contrast, as far as 2020 is concerned, with another Berkeley document, “United States Data Center Energy Usage Report” (since Masanet is a co-author of that one, you will not be surprised that it matches the figures quoted earlier). To give one example among many of the discrepancies that one can find on this topic, Statista produced a chart for 2022 that shows the consumption of data centers at no less than 500 TWh.
I need to end this section with a CAVEAT: if I wanted to do justice to the amount of conflicting evidence I have accumulated over the past 10 years, I would need to write a 50-page document. So, as far as the 2020 figures go, you can either trust the (biased) evidence that I have displayed or do your own analysis. From my perspective, 2020 data center consumption is known within +/- 5%. The uncertainty will grow for the 2023 estimates presented in the next section.
3. 2023 Data Center Estimate
3.1 Trend Estimates for Datacenters and Hyperscalers
I will now turn to 2023, which is still too recent for peer-reviewed scientific analysis to be available, but old enough to gather enough numbers for an estimate. The method that I propose is to start from the previous hyperscaler / other cloud / other datacenter distribution to estimate the trend from the known figures, and then to evaluate the order of magnitude of the “genAI inflexion”.
We know that the trend shown in Section 2.1 has continued and that hyperscalers have accelerated their growth. The following figures are copied from Statista and show the electricity consumption of Google and Microsoft (which we can also get from their corporate sustainability reports, as we shall see in Section 3.4). If we take these three sources and readjust the first figure with the 220 TWh value for 2020, we get an estimate of 110 TWh for hyperscalers in 2023 (to which Microsoft and Google each contribute roughly 25 TWh), up from 75 TWh in 2020.
We will now see how much must be added to reflect the genAI acceleration.
3.2 Impact of Generative AI
The contribution of genAI to hyperscaler growth from 2020 to 2023 is moderate, probably around 20 TWh. To establish a scale, the electrical consumption for training GPT-4 was estimated at 50-60 GWh, while the global consumption for ChatGPT is reported at 180 GWh for 2023, using 10,000 NVIDIA GPUs. NVIDIA represents over 80% of high-performance GPUs used for genAI, so it is possible to make an approximate calculation from the number of chipsets delivered in 2023, using (as an upper limit) the peak consumption of an H100 at 700W. Since NVIDIA sold 500,000 H100s in 2023, out of a total of 1.5M GPUs, this yields an increase of around 10 TWh for the GPUs added in 2023. To find out who NVIDIA’s main customers are, read “The Scariest Nvidia Statistic That Virtually No One Is Talking About”: the four biggest customers are Microsoft (15%), Meta (13%), Amazon (6%), and Alphabet (6%, though a significant portion of Google’s AI runs on TPUs designed by Google).
It is interesting to explain how this approximate value of 10 TWh is produced. I have used two excellent support documents: (1) “AI Data Center Energy Dilemma” from SemiAnalysis, which explains the electricity consumption of an AI data center built around DGX H100 clusters (see the “Data Center Maths” section), and (2) an NVIDIA document that explains the functioning of the DGX rack. The annual consumption of an H100 is often evaluated at 3.7 MWh, based on a 60% load. This is just an order of magnitude for our purpose: there are other, less power-hungry chipsets, and not all H100s are used for genAI either. For the calculation proposed in this note, 700W is used for H100s and 300W for other GPUs, with an annual usage rate of 61%. Given the 2023 volumes mentioned above, I'll consider 1 million "H100 equivalents" as a first approximation. One must then consider the other elements of the DGX rack (the CPUs and network cards needed around the GPUs), as explained in the SemiAnalysis document. This increases consumption by about 80%, to which must be added the energy required for cooling. This additional factor is reflected in the classic PUE, with an average value for GAFAM estimated at 1.2. The PUE (Power Usage Effectiveness) is roughly the ratio between the energy consumed by the data center and the energy delivered to the server chipsets. The best reference I found is the article "AWS global data centers achieved PUE of 1.15 in 2023". To provide context, here is an excerpt: "Google claims its facilities average a PUE of around 1.1, with its best site offering 1.06. Meta's facilities, on average, offer a PUE of around 1.08. Microsoft's newest facilities achieve a PUE of around 1.12, with a global facility average of 1.18 across the portfolio. Oracle has said its data centers offer a PUE 'as low as' 1.15." Note that progress on PUE since 2010 is the primary reason explaining stable consumption despite increasing computing power (cf. Masanet). To better understand my load ratio hypothesis, I share here a figure from the excellent report “2024 United States Data Center Energy Usage Report” mentioned earlier. 61% is a proper compromise between what hyperscalers reach for their regular operations (including the use of LLMs for answering queries) and the higher-intensity use during training.
The overall result is that one H100 GPU generates approximately 10 MWh of consumption per year (including the supporting environment of the data center), which gives 10 TWh per year for an equivalent of 1M H100s in 2023. Given the explosion in LLM model sizes and associated training, this scale allows us to retain 20 TWh as the acceleration of AI consumption associated with generative AI between 2020 and 2023 (based on NVIDIA’s cumulative sales and the known genAI traffic – cf. Section 4).
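To make the chain of multiplications explicit, here is a minimal sketch of the back-of-the-envelope computation described above; the 61% load, the ~80% server overhead and the 1.2 PUE are the assumptions stated in this section, and the result lands in the ~10 MWh per GPU order of magnitude rather than being an exact figure:

```python
HOURS_PER_YEAR = 8760

def annual_gpu_mwh(gpu_watts: float = 700,        # H100 peak power
                   load_factor: float = 0.61,     # average utilisation assumed above
                   server_overhead: float = 1.8,  # CPUs, network, etc. around the GPU (~+80%)
                   pue: float = 1.2) -> float:    # cooling and data center losses
    """Approximate annual energy (MWh) attributable to one GPU, overheads included."""
    return gpu_watts * load_factor * server_overhead * pue * HOURS_PER_YEAR / 1e6

per_gpu_mwh = annual_gpu_mwh()              # ~8 MWh per "H100 equivalent"
fleet_twh = per_gpu_mwh * 1_000_000 / 1e6   # 1M H100 equivalents, MWh -> TWh
print(round(per_gpu_mwh, 1), round(fleet_twh, 1))  # ~8 MWh and ~8 TWh, i.e. the ~10 TWh order of magnitude
```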
3.3 Data Center Electricity Estimate
We can now plug this “genAI estimate” into the previous estimate to get a global figure of 260 TWh for data centers in 2023. This total comes from: (110 + 20) TWh for hyperscalers, 95 TWh for other clouds, and 35 TWh for private data centers. This, by the way, provides a hypothetical breakdown of 70-75 TWh for AWS, 25 TWh each for Google and Microsoft, and 5-10 TWh for others (this order of magnitude for AWS is consistent with another value mentioned on the web, but remains a simple hypothesis since AWS does not share its electricity consumption numbers).
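For readability, here is the same 260 TWh composition written as a one-line sum (it simply restates the breakdown above):

```python
# 2023 data center electricity estimate (TWh), restating the breakdown above
estimate_2023_twh = {
    "hyperscalers (pre-genAI trend)": 110,
    "hyperscalers (genAI addition)": 20,
    "other clouds": 95,
    "private data centers": 35,
}
print(sum(estimate_2023_twh.values()))  # 260 TWh
```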
Since this is far from the alarming numbers that we have seen in the press, one could be surprised. But bear in mind that:
- 2023 and 2024 have seen a huge increase in plans for new datacenters. It is important to see the difference between what runs in 2023 and what is in the plans for the future.
- As shown by the following figure borrowed from the VisualCapitalist website, data centers are heterogeneously distributed, and some parts of the world or of the US have a large concentration. So a few tens of additional TWh may indeed cause a problem.
- Many data centers are being built or rebuilt to accommodate the massive power density of NVIDIA chipsets and the associated cooling infrastructure. The new generations of GPUs require water cooling, which is more efficient than air cooling while allowing for better performance. However, it requires redesigning the architecture of data centers. Combined with the power density demanded by racks like the DGX, this explains the need to build new data centers, even if it means demolishing those that were just recently built (a famous example being Meta).
Also note that the level of uncertainty for a 2023 estimate is higher. I quote 260 TWh, but the estimation range is 240-280 TWh, and my confidence level is definitely lower than for the 2020 update.
3.4 Data Centers CO2 - Green Energy
When reading the documents by Malmodin et al., it becomes evident that the carbon impact decreases between 2015 and 2020 while electricity consumption remains stable. This is because the assumed carbon density (gCO2/kWh) decreases from 614 in 2015 to 426 in 2020. This drop reflects the gradual integration of green (decarbonized) energy into the power supply of data centers, starting with hyperscalers.
This subject is complex: the initial arguments from hyperscalers were based on the use of certificates, which are debatable from a systemic perspective. Certificates allow hyperscalers, who are wealthy, to purchase the green energy production capacity of their region, while carbon-intensive electricity is left to others. Since then, hyperscalers (and other digital giants such as Meta) have committed to developing a "green electricity roadmap," either by building their own green energy capacity or by entering into agreements with energy operators to develop new dedicated capacities.
When reading the "Corporate Sustainability Reports" of the three major hyperscalers, it is very challenging to determine exactly what is already operational and what is soon to be operational. For example:
Amazon plans to have nearly 77 TWh of green electricity, compared to 52 TWh of decarbonized electricity used in 2023. Amazon explains: "[We] Match 100% of the electricity consumed by our global operations with renewable energy by 2025—five years ahead of our original target of 2030."
Google reports that 64% of its electricity is green, with a goal of 100% by 2030.
Microsoft appears close to achieving its "green data center" goal, with 19 GW "contracted," producing 23 TWh of green electricity for a similar need.
It is important to distinguish between "green matching" (e.g., when Amazon matches their consumption with renewable energy purchases) and "green assigned production" (where green energy is either self-produced or sourced through partnerships with dedicated infrastructure).
For future calculations, decarbonization assumptions must be made for hyperscalers’ electricity, as represented in the following figure: 80 g/kWh in 2023 and 40 g/kWh in 2030. These figures are a compromise, leaning toward conservative estimates based on what is stated in the reports. Although hyperscalers are particularly proactive, the rest of the industry is also striving for decarbonized electricity, which is reflected in the (slight) decline of the “gCO2/kWh” curve for other data centers that we shall use later on.
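To show how these carbon intensities translate into emissions, here is a small illustrative computation; the 130 TWh hyperscaler volume comes from Section 3.3, and the comparison with the 426 g/kWh average of 2020 is my own addition:

```python
def mt_co2(twh: float, g_per_kwh: float) -> float:
    """CO2 emissions (Mt) for an electricity volume (TWh) at a given carbon intensity (gCO2/kWh)."""
    return twh * 1e9 * g_per_kwh / 1e12  # 1 TWh = 1e9 kWh, 1 Mt = 1e12 g

print(mt_co2(130, 80))   # ~10.4 Mt CO2 for hyperscalers in 2023 at 80 g/kWh
print(mt_co2(130, 426))  # ~55 Mt if the 2020 average intensity were applied instead
```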
4. 2030 Prospective
4.1 Talking about Devices
Before I discuss 2030, I will propose a short update on the topic of devices, to get a global picture similar to what I did two years ago. This blog post is long enough, and I did not have enough time to do a proper analysis – that is, to question the complete LCA (Life Cycle Analysis) of each kind of device. It turns out that change is on its way: on the one hand, both manufacturing and operations, for a given kind of device, are making progress; on the other hand, devices are getting more powerful (chipsets, features, sizes), so the complete CO2 cost of a laptop, for instance, is not moving much. What I propose in this section is to keep the unit carbon costs from Malmodin and simply update the volumes.
To discuss the impact of digital technology, we need to focus on the devices used by ICT (Information and Communication Technology) and E&M (Entertainment & Media). The first step is to simplify the analysis by Malmodin et al. for 2015 by defining the following categories:
For ICT:
- PCs (grouping desktops and laptops together)
- Monitors
- CPEs (Customer Premises Equipment), which include internet access boxes provided by operators (their numbers increase with the penetration of broadband access) and Wi-Fi routers.
- Smartphones, which have a relatively low energy consumption impact but a significant manufacturing impact.
- All other categories identified in Malmodin et al.'s document, grouped here as "other ICT."
For the E&M sector (focusing only on the digital segment):
- Televisions,
- STBs (Set-Top Boxes),
- Other devices (such as home cinema or hi-fi equipment), grouped under "other E&M."
Understanding the impact of a device can be simplified into two questions: (1) what is the CO2 impact of manufacturing (and delivery)? (2) what is the CO2 impact of usage? The volume driver for the first question is the number of units manufactured, while the driver for the second is the installed base. The following figure reproduces Malmodin’s 2018 document, which covers 2015 (a very comprehensive study providing all the figures on the left side), while the right side reflects the figures from the Malmodin et al. article of 2023, which covers 2020. Since the new article provides installed base figures but not shipment figures, I conducted a bibliographic search (via Statista and other sources) to find the 2020 values. As I said earlier, I deduced the CO2 manufacturing costs by keeping the unit costs constant.
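The two-question decomposition above boils down to a simple formula per device category; here is a minimal sketch where the structure follows this approach, but the numerical values in the example are illustrative placeholders, not Malmodin's unit costs:

```python
def device_category_mt_co2(shipments_m: float, mfg_kg_per_unit: float,
                           installed_base_m: float, use_kwh_per_year: float,
                           grid_g_per_kwh: float) -> float:
    """Annual CO2 (Mt): manufacturing is driven by shipments, usage by the installed base."""
    manufacturing = shipments_m * 1e6 * mfg_kg_per_unit / 1e9                  # kg -> Mt
    usage = installed_base_m * 1e6 * use_kwh_per_year * grid_g_per_kwh / 1e12  # g -> Mt
    return manufacturing + usage

# Illustrative placeholder values (the volumes echo the 2023 smartphone figures below,
# the unit manufacturing and usage costs are NOT Malmodin's numbers):
print(device_category_mt_co2(shipments_m=1340, mfg_kg_per_unit=50,
                             installed_base_m=7000, use_kwh_per_year=5,
                             grid_g_per_kwh=450))
```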
To attempt an evaluation of these figures for 2023 and a prospective vision for 2030, each category must be examined separately. For ICT devices, the following information is available:
ICT Devices:
Smartphones: Sales peaked around 2018 (approximately 1.55 billion devices per year) and have been slowly declining since, reaching 1.34 billion in 2023. The installed base continues to grow, but at a slower pace (projected to reach 7.2 billion by 2025, accounting for multi-device ownership).
PCs: Sales experienced a spike during COVID-19 but returned to 2015 levels in 2023, with a slightly downward trend over the past 10 years (source: Gartner).
Monitors: Information on monitors is scattered, but the general trend points to stability (once the COVID bump is ignored).
CPEs (Customer Premises Equipment): The number of devices follows broadband penetration trends, with the growth rate slowing, leading to a gradual decline in shipments.
E&M Devices:
Televisions: The number of installed TVs continues to increase due to multi-device ownership, while the number of households with TVs grows very slowly. This results in a slight decrease in shipments (source: Statista).
STBs (Set-Top Boxes): Shipments are now around 200 million, down from 2015 levels. This reflects the slowing growth of the installed base, which is tied to broadband deployment.
This data enables the construction of the following table, which provides an estimate for 2023 and a forecast for 2030. This table is based on volumes and does not challenge the unit values provided by Malmodin. Therefore, it is only an approximation.
4.2 Forecasting Datacenters Growth
Trying to guess how datacenter consumption will evolve until 2030 is mostly the difficult exercise of estimating what kind of growth we can expect from AI in general and generative AI (foundation models) in particular. Before venturing an educated guess, I will point out a few conflicting trends.
The race to increase model size will continue, although it seems that it is no longer just about the number of billions of parameters of large language models (LLMs), but rather about composite architectures using multiple LLMs. However, progress in learning methods is constant, and it is not possible to simply draw a linear extrapolation of the power required. Also, software progress, as demonstrated by open-source LLMs and recently by DeepSeek, shows that the race to ever-more-powerful GPUs is not the only way. This is a point that Yann Le Cun has made for a while: open-source research on new foundation models will provide both more diversity and solutions that are more energy efficient.
GPUs are making constant progress in energy efficiency. NVIDIA's Blackwell generation, which will replace Hopper (the H in H100), offers both a performance leap and a reduction in energy consumption for the same calculations by a factor of up to 25. However, the new chipsets will be even more power-hungry (power requirements will increase from 700W to 1000W and potentially 1500W), raising the issue of the rebound effect. On the software side, an excellent summary of the ongoing progress in learning method efficiency can be found in the online document "Situational Awareness" by Leopold Aschenbrenner (efficiency in both cost and power consumption).
The question posed in early 2023 about the race for size and power to build monopoly positions around a few generative AI leaders seems to have found a different answer. There is a consensus around a more distributed approach: no single solution is perfect, and the best response is a variety of LLMs used for different tasks. Apple, for example, has taken its time to integrate generative AI into its upcoming mobile operating systems (evident from Siri's lag compared to ChatGPT), but this (new) revolution is coming. It appears it will leverage the global network of a billion neuromorphic chips already present on our smartphones.
While an increase in AI usage, particularly generative AI, is expected, the assumption of a tenfold increase that we have seen in some papers is not very credible, primarily because of the investments this would require. Major AI players (OpenAI, Microsoft, Google, Meta) have made significant investments in 2022-2023 and will likely do so in 2024, amounting to tens of billions of dollars. It is unlikely that these investments will scale by an order of magnitude given the relatively low additional revenue generated. The following chart is a reminder that, after the short exponential and spectacular growth of ChatGPT, we are now seeing very small growth.
Still, we have all heard about the Stargate project and its hypothetical $500B investment in genAI. There is a debate about how much money will actually be spent, so it is interesting to evaluate two scenarios: a $100B investment (which is also the same order of magnitude as what Satya Nadella announced) and a $500B investment. We can use the rule of thumb from the 2023 analysis: $50B buys 1M H100s and yields 10 additional TWh. Of course, this is just an approximation, but if you agree with the previous analysis, $1B will give you more or less the same number of GPUs during the upcoming years, much more computing power, and approximately the same energy expense during this decade. Hence applying the 2023 ratio is not silly and yields: a big Stargate is 100 additional TWh, while a small one is 20 TWh. We may also apply this kind of thinking to Mark Zuckerberg’s quote, shared on Facebook last week: “In 2025, I expect Meta AI will be the leading assistant serving more than 1 billion people, Llama 4 will become the leading state of the art model, and we'll build an AI engineer that will start contributing increasing amounts of code to our R&D efforts. To power this, Meta is building a 2GW+ datacenter that is so large it would cover a significant part of Manhattan. We'll bring online ~1GW of compute in '25 and we'll end the year with more than 1.3 million GPUs. We're planning to invest $60-65B in capex this year while also growing our AI teams significantly, and we have the capital to continue investing in the years ahead.” 1 GW of power yields approximately 5 TWh (for a year at 60% load), and $65B indeed buys 1.3M GPUs using our ratio, for 13 TWh (2 GW at 70%). So our crude ratio is not so crude after all, to get a rough estimate.
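The same dollars-to-GPUs-to-TWh rule of thumb can be written down explicitly; this is a sketch under the assumptions of this post ($50B buys roughly 1M H100 equivalents, each adding about 10 MWh per year with its data center environment), not a forecast model:

```python
GPUS_PER_BILLION_USD = 1_000_000 / 50   # ~20,000 H100 equivalents per $1B (rule of thumb above)
TWH_PER_MILLION_GPUS = 10               # ~10 TWh/year per 1M H100 equivalents (Section 3.2)

def added_twh(capex_billion_usd: float) -> float:
    """Additional annual data center consumption (TWh) implied by a genAI capex program."""
    gpus = capex_billion_usd * GPUS_PER_BILLION_USD
    return gpus / 1e6 * TWH_PER_MILLION_GPUS

print(added_twh(500))  # full Stargate: ~100 TWh
print(added_twh(100))  # small Stargate: ~20 TWh
print(added_twh(65))   # Meta's announced 2025 capex: ~13 TWh, matching the 1.3M GPU figure
```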
Given all of this, guessing what data center consumption will be in 2030 is not easy. If nothing much were happening (the big players that have invested tens of billions of dollars with no matching incremental revenues so far might slow down at some point), 300 TWh would be a good bet. If the high level of spending corresponding to a full Stargate program materializes, 400 TWh is the proper estimate. I can envision a few key players putting in a $100B ticket, but not many of them, so I selected 350 TWh as my estimate, although my confidence level here is no better than 20%. Still, this is very different from the 800 or 1000 TWh estimates that one may find in the newspapers, most of which originate from the same IEA reference.
4.3 Networks and Blockchains
Let us complete the picture with a forecast of network and blockchain electricity consumption. As far as networks are concerned, I have no technique other than extending the trends observed over the past years, with a small upper buffer that reflects the IEA’s more aggressive forecast. For blockchain, it is much harder considering the variability of the past four years. I was quite scared two years ago and had projected up to 300 TWh; now I see that there are some stabilizing counterforces (such as China’s decision, which could be replicated in one form or another). Thus, my new forecast is 200 TWh in 2030.
To evaluate the CO2 impact, we need to assess the level of decarbonization that we can expect, as explained in Section 3. Clearly, there is no coordinated effort from the network operators similar to what the hyperscalers are doing (which we just described), but there are some efforts, so I figured that applying the previous Malmodin ratios was too conservative. Although the move to green electricity that is necessary for net-zero scenarios is nowhere in sight, there is constant progress, so I picked an improved value of 400 g/kWh for networks and for bitcoin mining (here I may still be too conservative). The following is the resulting forecast for datacenters, networks and blockchain in 2030, both in TWh and CO2 density.
5. Conclusion
We can now put all these numbers into one big table and refresh what I had published two years ago. The following table shows the impact of the ICT sector, from 2015 to 2030. Remember that 2023 is an estimate, and that 2030 is just a guess.
To complete the “digital scope”, I added the “E&M” devices, as explained earlier, to get a full picture with a scope that is consistent with both Malmodin and my previous analysis.
For those of you who will compare with my figures from two years ago, you will see that the estimates for 2020 have turned out to be pretty accurate, but that my forecast for 2030 has evolved:
- The impact of AI is indeed accelerating the electricity consumption of data centers, but the “green cloud” decarbonization will probably reduce the CO2 impact.
- I am less pessimistic about blockchain: some stabilizing forces are at play, although the overall impact of “proof-of-work” is appalling.
- The overall impact of digital, expressed as a percentage of worldwide electricity consumption or CO2 emissions, has not changed significantly since my last estimates/forecast, nor is it expected to move considerably.
Since these ratios are everywhere in the literature and wrong most of the time, here they are in a separate table (the denominators for the ratios are themselves nothing but educated guesses as far as 2030 is concerned):
Let me propose a few comments on top of these “raw” numbers:
- Digital growth is a problem since we would like to see its impact decline … even if net-zero in 2050 is a fantasy. Thus, although the general theme of this paper is that the “fears about AI-generated consumption” have been greatly exaggerated, we still have a digital sustainability problem in front of us.
- Indeed, the scenarios that I have read quoting datacenter consumption being multiplied by 4 by 2030 seem highly improbable. As said earlier, we should expect very large growth in computing, a large but contained growth in electricity consumption, and a moderate but accelerating decline in CO2 emissions.
- The key ratios are interesting to remember – as said, they differ from what is often quoted, and they are rather stable over time as far as CO2/GHG is concerned. As electrification is the major battle of the upcoming climate change revolution, the decreasing electricity ratio (7% to 5.5%) is not good news; it is simply expected.
I share these numbers with an antifragile data modeling mindset: I welcome contradiction and dissenting numbers (with data sources) as a way to learn more. This is a really complex topic, since we do not have enough data from primary sources (versus a huge number of comments on the same sources), so I make no claim that these estimates are correct.