AI AI, Oh Oh: Artificial intelligence power consumption about to skyrocket – and no one is prepared

Today’s musings go far into left field, but please do tag along. If you care about energy, you’ll want to hear all this. 

Though it has been around for a while, Artificial Intelligence (AI) has exploded in popularity in 2023. AI went from being a fringe thing to the thing. It has been a perplexing year in that regard; many first heard of AI in February and then by May were being told their entire profession was about to be wiped out. Yay, progress!

In a well-written and mercifully understandable article on LinkedIn, Susan Lindeque explains how this all unfolded so quickly. It is driven by, in her words, “a convergence of factors, including the availability of massive amounts of data, the rise of machine learning algorithms, the emergence of edge computing, and the convergence of AI with other technologies such as blockchain, 5G, and quantum computing.” A few of those aspects are worth examining with respect to energy.

In layman’s terms – all I have – this convergence is helping machines rapidly learn from massive amounts of continuously updated data, with the effect that these machines can perform many tasks at least as well as, or better than, humans. ChatGPT, for example, can, with a simple query, write entire essays for students, to any length they want, because ChatGPT’s AI engine has scraped the web for all pertinent information. Teachers, though, need not despair: AI also provides tools that spot AI-generated output. You may notice a weird circularity here, whereby AI is necessary to police itself; more on that topic in a second, because it has some big ramifications.

AI has shown dramatic potential in many fields, and there is no doubt that it will knock some occupations on the head. This article chronicles how AI may revolutionize fields such as medicine, automotive, agriculture, and entertainment. 

So anyway, all great stuff, right? Except…what does this mean for energy? 

AI is a power hog the likes of which it is, frankly, hard to imagine. And it is about to explode in size. Consider these few examples. 

ChatGPT, the famous friend of students, is a good starting point, if for no other reason than we’ve heard of it. One engineering prof estimates it can take up to 10 gigawatt-hours (GWh) to train the latest iteration of ChatGPT. According to the US Energy Information Administration, a typical home consumes about 11 MWh per year. So that one training run consumes about as much energy as a thousand homes do in a year. ChatGPT also requires continuous updating, because it will only be of value if it routinely and often scrapes the web for all the latest updates.

And then there are the actual users, the millions of oddball requests for enlightenment that the machine must endure every day. Running ChatGPT for a typical day requires about 1 GWh, roughly the daily consumption of 33,000 homes. There is no reason to think that load will shrink; quite the opposite, in fact.
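
For the arithmetic-minded, here is a quick back-of-the-envelope sketch of those household equivalences, using only the figures quoted above (the 10 GWh training estimate, the 1 GWh per day of queries, and the 11 MWh per year for a typical home). The numbers are the article’s; the code just does the unit conversions.

```python
# Back-of-the-envelope check on the household-equivalent figures above.
# All inputs come straight from the article: ~10 GWh to train a model,
# ~1 GWh per day to serve queries, ~11 MWh per year for a typical US home.

TRAINING_ENERGY_GWH = 10.0   # estimated energy to train one large model
DAILY_SERVING_GWH = 1.0      # estimated energy to run ChatGPT for a day
HOME_ANNUAL_MWH = 11.0       # typical US household consumption per year (EIA)

home_annual_gwh = HOME_ANNUAL_MWH / 1000.0    # 0.011 GWh per home per year
home_daily_gwh = home_annual_gwh / 365.0      # ~0.00003 GWh per home per day

homes_per_training_run = TRAINING_ENERGY_GWH / home_annual_gwh
homes_per_serving_day = DAILY_SERVING_GWH / home_daily_gwh

print(f"One training run ~= {homes_per_training_run:,.0f} homes for a year")
print(f"One day of queries ~= {homes_per_serving_day:,.0f} homes for a day")
# Roughly 900 homes for a year, and ~33,000 homes for a day, matching the text.
```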

ChatGPT is just one of a whole wave of these things coming, and some of them are just off the charts. Consider what Tesla is up to: the pioneering auto firm is building its own AI supercomputer called Dojo. Sounds like a pet, but wow, what an appetite. Dojo went into action a few weeks ago, using 10,000 Nvidia H100 GPUs. Each of those can consume 700 watts, so over a day that pile o’ chips in Dojo’s tummy will consume, fully cranked, 168 MWh, or enough energy to power about 5,500 homes for a day.

That might not sound too bad, but Tesla’s Dojo is, as of today, just a figurative tadpole. Fully grown, by the end of 2024, Dojo is expected to house 300,000 GPUs. That translates to the same power consumption as 165,000 homes. And that’s US homes; in Africa that would be about a small country’s worth. (Sharp-eyed readers will note that the chart linked shows 300,000 of the older A100 GPUs and not newer H100s, but don’t get too hung up on it – it’s a sh*tload of energy either way.)
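
The Dojo math works the same way. Below is a rough sketch assuming the 700-watt-per-H100 figure and the GPU counts mentioned above, and assuming every chip runs flat out around the clock (real utilization would be lower):

```python
# Rough scaling of the Dojo figures above: GPU count x per-GPU draw, converted
# to daily energy and household equivalents. The 700 W per H100 and the GPU
# counts are the article's numbers; everything else is arithmetic.

GPU_POWER_W = 700            # approximate maximum draw of one Nvidia H100
HOME_ANNUAL_MWH = 11.0       # typical US household consumption per year

def daily_energy_mwh(gpu_count: int) -> float:
    """Energy consumed in one day if every GPU runs flat out."""
    watts = gpu_count * GPU_POWER_W
    return watts * 24 / 1_000_000   # watt-hours -> megawatt-hours

def home_days(mwh: float) -> float:
    """How many homes that much energy would supply for one day."""
    return mwh / (HOME_ANNUAL_MWH / 365)

for gpus in (10_000, 300_000):
    e = daily_energy_mwh(gpus)
    print(f"{gpus:>7,} GPUs: {e:,.0f} MWh/day ~= {home_days(e):,.0f} homes for a day")
# 10,000 GPUs -> 168 MWh/day (~5,600 homes); 300,000 -> ~5,000 MWh/day (~167,000 homes)
```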

Now consider that there are thousands of large AI applications coming or underway. It will be an AI world – join in and compete, or get left behind.

Now, let’s take one more detour where it gets a bit weird, but highly plausible, and the potential load of this strange possibility is enough to make one throw up their hands.

A very smart person named Lyn Alden wrote an excellent piece on AI recently, well worth the read for the mind-bending ways she envisions AI possibly unfolding. The article is too massive and sweeping to integrate here, but a few nuggets are worth extracting. The first is the intersection of AI and Bitcoin.

Don’t laugh. Bitcoin may be so 2020, but BTC’s shards in the ceiling still hold interesting ideas and concepts. 

Recall that Bitcoin is based on a system called ‘proof of work’, where bitcoins are issued to computers like dog treats for grinding away, hour after hour, solving artificial problems, a process that updates all the blockchain records that give blockchain and Bitcoin their value. If computer processing power gets too large and bitcoins are ‘earned’ too fast, the Bitcoin system simply makes it harder to earn a coin. Very cool, except that it simply means more power. Bitcoin is already a legendary power hog, currently consuming almost as much electricity as Argentina.
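
For readers who have not seen it spelled out, the proof-of-work mechanism is brutally simple, which is exactly why it burns so much energy: computers grind through hash guesses until one happens to fall below a target, and when the network gets faster the target is tightened. A toy sketch follows; it is illustrative only and nothing like Bitcoin’s production code.

```python
# Toy proof-of-work: find a nonce whose SHA-256 hash falls below a target.
# Each extra bit of difficulty roughly doubles the expected work (and energy).

import hashlib

def mine(data: str, difficulty_bits: int) -> int:
    """Grind nonces until the hash has `difficulty_bits` leading zero bits."""
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}:{nonce}".encode()).hexdigest()
        if int(digest, 16) < target:
            return nonce
        nonce += 1

for bits in (8, 12, 16, 20):
    nonce = mine("block-of-records", bits)
    print(f"difficulty {bits:>2} bits -> valid nonce found after {nonce:,} attempts")
```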

Alden in her article speculates on some ways that AI could spin into something incomprehensibly huge, and provides a fascinating and highly realistic scenario.

Imagine hackers or governments make an AI tool that is designed for hacking. Hacking is an iterative activity conducted by relative humanoid boneheads (relative to AI, hear me out), whereby each “advance” in hacking is often soon met with a development by the good guys to stop the bug. Note also that hackers seek to exploit unknown weaknesses, a far different task than defending against hacks that show you exactly where the weakness is.

AI could sift through the web at warp speed, looking for openings, integrating the latest defense tools into its strategy, and just keep escalating.

The only way to “defend” against hacks generated by AI would be to have AI as the defender. If the hacking AI keeps functioning and expanding – which it needs to do to keep up with current defenses – then the defensive AI would have to as well.

There are two relevant points here. One is that the generals in this digital war would soon leave their human masters behind and, within a very short time frame, start doing things humans aren’t even aware of.

The second relevant point, maybe just a question, is: What would the power consumption of this madness be? How fast would it spiral? Can anyone even guess?

That is one singular aspect of AI. There could be hundreds or even thousands just like it.

Remember, from a few paragraphs ago, Bitcoin’s proof-of-work structure and what a power hog it is. What if that ‘proof of work’ model, or something similar, becomes a necessity for AI management?

Alden poses the scenario whereby this could happen: “Now that artificial intelligence is making the creation of pictures, videos, texts, programs, and other things almost costless, it’s becoming harder to know which content is genuine compared to which content is a high-quality fake video or personality.”

Such developments could render much of the web useless, if AI was able to generate whatever it wanted for whatever purpose, and build legions of followers and likers and anything else to move the junk up the social media algorithms. Imagine a social media platform overrun by millions of bots or anonymous accounts, generating for-hire popularity of whatever the cheque-writer is paying for, or fake news, or whatever.

Alden speculates that one way to stop such a spiraling process would be some sort of ‘proof of work’ structure, like the Bitcoin model described earlier. To “earn” Bitcoin, actual units of energy must be consumed as processing power, and the power required to run one of the mining machines is the “proof of work”. Consuming actual energy, as Bitcoin mining does, would be one way to put a check on an endless stream of AI fake accounts.
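
In code terms, the appeal is the asymmetry: producing a valid ‘stamp’ costs real compute, and therefore real energy, while checking it costs almost nothing, so a platform could demand a stamp per account or per post. A hypothetical hashcash-style sketch of that gate follows; the function names and difficulty are invented for illustration, not any real platform’s API.

```python
# Hypothetical hashcash-style gate: a poster must burn compute to mint a stamp,
# while the platform verifies it with a single hash. Names and difficulty are
# illustrative only.

import hashlib

DIFFICULTY_BITS = 20                     # each extra bit doubles the minting cost
TARGET = 2 ** (256 - DIFFICULTY_BITS)

def mint_stamp(message: str) -> int:
    """Expensive step: grind nonces until the hash falls below the target."""
    nonce = 0
    while int(hashlib.sha256(f"{message}:{nonce}".encode()).hexdigest(), 16) >= TARGET:
        nonce += 1
    return nonce

def verify_stamp(message: str, nonce: int) -> bool:
    """Cheap step: one hash tells the platform whether real work was done."""
    return int(hashlib.sha256(f"{message}:{nonce}".encode()).hexdigest(), 16) < TARGET

stamp = mint_stamp("hello, totally-real human here")          # costly for a bot army
print(verify_stamp("hello, totally-real human here", stamp))  # near-free to check
```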

What if that turned out to be the most feasible way to put a check on AI generated spam bot armies? 

How much power would that consume?

One last “and then add this” rant… AI has the potential to revolutionize many fields, perhaps none more so than medicine. Consider what the web knows about your eating habits, for example. It’s simple to skim credit card data to know how often you order pizza. Google quite likely also knows what you order on your pizza, if you order online. Google has access to your grocery-buying habits, if you belong to a points program and the grocer is willing to sell access to the data. As a final indignity, many of us have installed an Alexa or similar in our houses. They listen. They know not just what we buy but what we talk about buying. Alexa knows that you told your spouse to remember to get some sauerkraut, and Alexa knows that you berated your child for not eating it. Marketing gold. And we paid for the device.

Now imagine an AI engine that can capture all that data and cross-reference it with, say, cancer. There’s an unbelievably large data project right there, consuming I can’t even imagine how much power. AI could take a certain type of cancer, go back into the history of all its victims, and look for commonalities or patterns. It could be the most amazing medical development imaginable. All it takes is a few square miles of data centres and a few billion kilowatt-hours, in a process that needs to be repeated continuously.

Now, take that same culinary data set and test it across hundreds or thousands of diseases or conditions. Well, look at that: if you eat spicy chicken wings twice a month AND you have freckles AND you hate green onions AND your cuticles’ weird white part is oversized, you might have a 90 percent chance of developing, say, elbow cancer, or maybe Foreign Accent Syndrome (yep, it’s a thing). All this could mean astonishing breakthroughs in medicine, albeit via banks of computers that would cover the landscape like an endless digital shantytown.
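
For a flavour of what that cross-referencing looks like in practice, here is a toy sketch: join invented purchase records against an invented diagnosis flag and compare purchase rates between the two groups. Every column name and row is made up for illustration; the real version would involve billions of rows and a correspondingly staggering amount of compute.

```python
# Toy illustration of the cross-referencing idea: purchase history joined to a
# hypothetical disease registry, then per-item purchase rates compared between
# patients and non-patients. All data and column names are invented.

import pandas as pd

purchases = pd.DataFrame({
    "person_id": [1, 1, 2, 3, 3, 4],
    "item": ["pizza", "sauerkraut", "pizza", "chicken_wings", "pizza", "sauerkraut"],
})

registry = pd.DataFrame({
    "person_id": [1, 2, 3, 4],
    "has_condition": [True, False, True, False],   # hypothetical diagnosis flag
})

# Flag who ever bought each item, then join against the registry.
bought = (
    purchases.assign(flag=1)
    .pivot_table(index="person_id", columns="item", values="flag",
                 aggfunc="max", fill_value=0)
    .reset_index()
)
joined = registry.merge(bought, on="person_id", how="left").fillna(0)

# Purchase rate for each item among patients vs. non-patients.
rates = joined.drop(columns="person_id").groupby("has_condition").mean()
print(rates)
```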

To put the growth potential in a bit more context, listen to Salesforce’s CEO on a recent conference call. Salesforce is one of the largest technology companies in the world, with a market cap of over $150 billion. He commented: “So every CEO I’ve met with this year across every industry believes that AI is essential to improving both their top and bottom line, but especially their productivity… AI is just augmenting what we can do every single day…”

It was hard to find AI information that even speculated as to what the power load might amount to. One article from the University of Washington noted that ChatGPT queries currently consume the equivalent of the daily energy consumption of about 33,000 U.S. households, but that “While these numbers might seem OK for now, this is only the beginning of a wide development and adoption of these models…Also, as models become more sophisticated, they get larger and larger, which means the data center energy for training and using these models can become unsustainable.”

Another article quoted a physicist from the US National Institute of Standards and Technology: “If you just keep throwing more and more compute power and resources into these networks and scale their parameters, they just keep doing better…But we’re burning up more and more energy. And if you ask the question of how big can we make these networks? How many resources can we invest in them? You realize we really start to run into the physical limits.”

What’s astonishing is that the article this is quoted from was titled “A Thirst for Energy: Solving AI’s Power Problem”, yet it offered no solutions at all beyond mapping out new technologies and techniques to make AI more efficient, and it concluded with a comment indicating our present helplessness in the face of this power-swilling juggernaut: “A computer can play Go and even beat humans…But it will take a computer something like 100 kilowatts to do so while our brains do it for just 20 watts…The brain is an obvious place to look for a better way to compute AI.”

Well, yeah, that’s factually correct but…is that all we have? “Let’s emulate the brain” as a solution to power consumption? Yikes.

Note that all this new demand for energy – renewable, coal, natural gas, oil, nuclear, all of it – will not displace existing demand; the growth will be incremental to it – new demand creation. Imagine adding 10% to global energy demand from AI alone, a number that seems eminently plausible based on the few models in existence already. And that could be hugely understated – businesses and governments will join the AI party because they have to, and they will source power where they have to. 
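
As a crude sanity check on that 10 percent thought experiment: global electricity demand is commonly put somewhere in the 25,000 to 30,000 TWh-per-year range, so a hypothetical 10 percent AI increment would look roughly like this (the global figure below is an assumed round number for illustration, not a sourced forecast):

```python
# Crude sanity check on a hypothetical "AI adds 10% to demand" scenario.
# GLOBAL_ELECTRICITY_TWH is an assumed round number, not a sourced figure.

GLOBAL_ELECTRICITY_TWH = 27_000   # assumed annual global electricity demand
AI_SHARE = 0.10                   # the article's hypothetical 10% increment
HOME_ANNUAL_MWH = 11.0            # typical US household (EIA figure above)

ai_demand_twh = GLOBAL_ELECTRICITY_TWH * AI_SHARE
homes_equivalent = ai_demand_twh * 1_000_000 / HOME_ANNUAL_MWH   # TWh -> MWh

print(f"Hypothetical AI load: {ai_demand_twh:,.0f} TWh/year "
      f"(~{homes_equivalent / 1e6:,.0f} million US homes' annual use)")
# Roughly 2,700 TWh/year, on the order of 245 million US households.
```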

AI is coming at us in a massive, power-sucking wave, whether we like it or not. The very nature of its functionality and execution power points to escalating energy requirements that we can’t even imagine.

Every single oil well, gas well, wind turbine, coal mine and nuclear plant is going to be called into service, unless some energy breakthrough occurs or the world adopts nuclear power at mass scale within a few years.

Source: Boereport.com
