The projection is that the rapid explosion in artificial intelligence (AI) computing chips means AI will need five times more electricity than transportation. Is this really true? Electricity demand for AI is driven by the huge compute needed to train AI models. New customised chips from Nvidia, Tesla and others, designed from first principles, are much faster and consume less power per operation. But when Microsoft, OpenAI, Google, Meta, Tesla or others run datacenters with over 100,000 AI computers, each drawing hundreds of watts, demand reaches hundreds of megawatts per site, and in aggregate terawatt-hours per year: an order of magnitude more than transportation.

It is ironic that the electricity for AI will be more than the electricity for electrifying the vehicles themselves.

“My not-that-funny joke is that you need transformers to run transformers. You know, the AI is like… There’s this thing called a transformer in AI… I don’t know, it’s a combination of sort of neural nets… Anyway, they’re running out of transformers to run transformers.

“Then, the next shortage will be electricity. They won’t be able to find enough electricity to run all the chips. I think next year, you’ll see they just can’t find enough electricity to run all the chips.”

Elon Musk, quoted in New Atlas [4]

AI Demands Extremely Large Compute

AI is compute intensive. Ark Invest, in their 2024 Big Ideas white paper [5], predict that the cost of training will decrease exponentially: a model that costs about $4.5 million to train today could cost under $30 to train by 2030. There are two kinds of AI computers:
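A quick sanity check on that projection: the annual cost decline it implies can be computed directly. This is a rough sketch; the dollar figures are Ark's, while the six-year 2024-to-2030 window is my assumption.

```python
# Implied annual decline for training cost to fall from ~$4.5M to
# under $30 by 2030 (figures from the Ark Invest projection above).
start_cost = 4_500_000  # USD, cost to train a model today (per the article)
end_cost = 30           # USD, projected 2030 cost (per Ark Invest)
years = 6               # assumed window, 2024 -> 2030

annual_factor = (end_cost / start_cost) ** (1 / years)
annual_decline_pct = (1 - annual_factor) * 100
print(f"Training cost must fall ~{annual_decline_pct:.0f}% per year")
```

That is, the projection implies costs falling by roughly 86% every year, which is steeper than historical hardware cost curves alone and relies on algorithmic efficiency gains as well.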

  • Training computers, which train neural nets on ever larger data sets.
  • Inference computers, which run the trained neural nets. These can be very powerful, or small enough to run even in modern mobile phones.

Tesla took the strategy of a very low-power inference computer (Hardware 3 or 4) in their vehicles that draws less than 100 W. To compensate, they need much more compute in their training computers (measured in exaflops). Musk stated in March 2024 that his companies are no longer compute-constrained for AI: Grok (xAI), Tesla's autonomous vehicles, and the Optimus robot.

Herbert Ong discusses this in his Brighter with Herbert video, “Tesla’s Hidden Compute Power: Bigger Than Expected! w/ Brian Wang”. Brian Wang reports this in detail on his website [2].

Tesla published this graph to explain why they built their own Dojo supercomputer, while at the same time spending hundreds of millions of dollars buying Nvidia H100 compute. They had about 2 exaflops in January 2023 and planned to reach 100 exaflops by October 2024.

In mid-2023, xAI and Tesla announced a rapid build-out of AI compute (SemiAnalysis [3]).

What was Tesla's status in April 2024?

[Figure: Tesla compute projections, from Tesla, 2023]

Brian Wang states Tesla has not been alone in purchasing A100 or H100 chips: Meta has purchased over 350,000 H100s, while Tesla has purchased tens of thousands. At roughly 4 PFLOPS per H100, Tesla in all probability had over 120 exaflops by March 2024, which would explain Musk's comments.

By December 2024, Tesla will have bought and installed over 800 exaflops:

  • ~20,000 B100s at 20 PFLOPS each, i.e. 400 exaflops (Tesla is listed as a primary purchaser of the Blackwell chips).
  • ~100,000 H100s at 4 PFLOPS each, i.e. 400 exaflops.
  • An unknown number of in-house Dojo systems, maybe ~200 exaflops.
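The tally above can be summed directly. This sketch just adds up the article's rough estimates; the Dojo contribution in particular is speculative.

```python
# Estimated Tesla AI compute by Dec 2024, using the figures above.
PFLOPS_PER_B100 = 20
PFLOPS_PER_H100 = 4

b100_exaflops = 20_000 * PFLOPS_PER_B100 / 1000   # 400 exaflops
h100_exaflops = 100_000 * PFLOPS_PER_H100 / 1000  # 400 exaflops
dojo_exaflops = 200  # speculative in-house Dojo contribution

total_exaflops = b100_exaflops + h100_exaflops + dojo_exaflops
print(f"~{total_exaflops:.0f} exaflops")  # consistent with "over 800"
```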

AI compute is increasing roughly 10x annually, which allows AI capability to expand rapidly.

AI Datacenters Need MW of Electricity

  • An Nvidia H100 chip's power demand is ~700 W.
  • The new DGX B200 (Blackwell) system, with 8 GPUs and 2 Intel Xeon CPUs, draws up to 14 kW and delivers 72 petaFLOPS of training compute and 144 petaFLOPS of inference compute.
  • A datacenter equipped with 20,000 such units, along with petabytes of data storage, would need around 300 MW of power and achieve roughly 7 exaflops.
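The per-site power figures above reduce to a simple function. The 1.4 PUE (power usage effectiveness) overhead factor is my assumption; the per-chip and per-system watts come from the list above.

```python
def datacenter_mw(num_units: int, watts_per_unit: float, pue: float = 1.4) -> float:
    """Total facility power in MW, including a PUE overhead factor
    for cooling, networking and storage (1.4 is an assumed value)."""
    return num_units * watts_per_unit * pue / 1e6

# 20,000 H100 chips at ~700 W each: roughly 20 MW with overhead.
print(datacenter_mw(20_000, 700))
# 20,000 DGX-class systems at 14 kW each, before overhead: 280 MW,
# in line with the ~300 MW figure above.
print(datacenter_mw(20_000, 14_000, pue=1.0))
```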

A Microsoft data center employee says they cannot place more than 100,000 H100s in one US state without breaking that state's power supply.

Meta, Microsoft, and xAI are all looking to co-locate with under-utilized power plants, and some have even proposed siting a data center with its own dedicated nuclear power plant.

Because AI training requires fast access to shared data and caches, splitting a training cluster across geographically separated sites is impractical: the transmission costs in bandwidth and latency are too high.

Brian Wang (Nextbigfuture) has presented this table of rough projections for Grok (the xAI LLM); the trajectory is similar for OpenAI's GPT.

| Grok / GPT Version | Compute | Data Source | Tokens | Energy (MW) | Year |
|---|---|---|---|---|---|
| Grok-2 | 20,000 H100 | Regular data | 2 | 20 | 2024 |
| Grok-3 | 100,000 H100 | All regular data | 40 | 100 | 2024 |
| Grok-4 | 2X | 50% synth & RWV data | 100 | 150 | 2025 |
| Grok-5 | 5X | 75% synth & RWV data | 200 | 300 | 2025 |
| Grok-6 | 10X | 90% synth & RWV data | 400 | 350 | 2025 |
| Grok-7 | 25X | 96% synth & RWV data | 1,000 | 400 | 2026 |
| Grok-8 | 125X | 98% synth & RWV data | 2,000 | 800 | 2027 |
| Grok-9 | 625X | 99% synth & RWV data | 4,000 | 1,600 | 2028 |
| Grok-10 | 3000X | 99.5% synth & RWV data | 8,000 | 3,000 | 2029 |
Table from Next Big Future [2]
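As a rough consistency check on the table's Energy column, the facility power implied by the H100 counts lands close to the tabulated MW. The ~700 W per chip and 1.4 PUE overhead are my assumptions, not figures from the table.

```python
# Facility MW implied by H100 count: watts per chip x count x PUE.
implied_mw = {}
for name, h100_count in [("Grok-2", 20_000), ("Grok-3", 100_000)]:
    implied_mw[name] = h100_count * 700 * 1.4 / 1e6
    print(f"{name}: ~{implied_mw[name]:.0f} MW")  # ~20 MW and ~98 MW
```

These match the table's 20 MW and 100 MW entries to within rounding.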

Electrification of Transportation and Demand from AI

Transportation

Transport electrification increases electricity demand by about 20%. For example, in Australia, the National Electricity Market (NEM) currently supplies about 210 TWh of electricity annually.

An additional ~50 TWh of electricity would electrify transportation. With an average vehicle age of 10 years, demand would increase by about 5 TWh per year over that decade.
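The Australian example works out as follows. The NEM and transport figures are the article's; the linear ten-year ramp is an assumption.

```python
nem_supply_twh = 210       # current annual NEM electricity supply
transport_twh = 50         # extra annual demand for full transport electrification
fleet_turnover_years = 10  # average vehicle age, i.e. assumed replacement period

increase_pct = transport_twh / nem_supply_twh * 100       # ~24%, i.e. "about 20%"
ramp_twh_per_year = transport_twh / fleet_turnover_years  # ~5 TWh per year

print(f"Demand increase: ~{increase_pct:.0f}%")
print(f"Ramp: ~{ramp_twh_per_year:.0f} TWh per year")
```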

Data Centers and AI Demand

  • Currently, data centers use about 5% of US electricity. Many expect that to increase 4x over the next decade.
  • Some suggest many electricity grids are unsuited to the green revolution (e.g. Altimetry [1]); stronger, better-interconnected grids enable wider resilience to intermittent electricity supply.
  • Current US electricity consumption is about 4,178 terawatt-hours (TWh) per year.
  • AI will add ~26 TWh of demand per year for every large AI datacenter.
  • For 100 AI datacenters, demand would be 2,600 TWh: over 50% growth in total US electricity demand and about 4 times the expected demand from electrifying transportation.
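Spelling out the aggregate arithmetic above: the datacenter count and per-site energy are the article's projections, while the ~3 GW-per-site figure is my derivation from the 26 TWh number.

```python
HOURS_PER_YEAR = 8760

per_datacenter_twh = 26    # annual energy per large AI datacenter (article's figure)
num_datacenters = 100      # projected count (article's figure)
us_consumption_twh = 4178  # current annual US electricity consumption

ai_twh = per_datacenter_twh * num_datacenters            # 2,600 TWh
growth_pct = ai_twh / us_consumption_twh * 100           # ~62%: well over 50% growth
implied_gw = per_datacenter_twh * 1000 / HOURS_PER_YEAR  # ~3 GW running continuously

print(ai_twh, round(growth_pct), round(implied_gw, 1))
```

Note that 26 TWh per year implies each site is a roughly 3 GW facility, an order of magnitude above the ~300 MW datacenter described earlier, so the 100-datacenter scenario is an aggressive upper bound.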

Demand for Electricity for AI will exceed the demand for electricity for electrifying transportation. 

More Reading 

  1. “The U.S. power grid wasn’t built for the green movement”, Altimetry, 2023. https://altimetry.com/articles/this-is-the-next-big-power-grid-investment
  2. “Tesla’s Hidden Compute Power”, Brian Wang, Nextbigfuture, 2024. https://www.nextbigfuture.com/2024/04/teslas-hidden-compute-power.html#more-194874
  3. “Tesla AI Capacity Expansion – H100, Dojo D1, D2, HW 4.0, X.AI, Cloud Service Provider”, SemiAnalysis. https://www.semianalysis.com/p/tesla-ai-capacity-expansion-h100
  4. “Elon Musk: AI will run out of electricity and transformers in 2025”, New Atlas, March 2024. https://newatlas.com/technology/elon-musk-ai/
  5. “Big Ideas 2024”, Ark Invest, 2024. https://ark-invest.com/big-ideas-2024/