The Massive Need For Energy Due To AI
The winner in the AI race is going to be the country that can produce enough energy to power its AI centers. If Jensen Huang is correct, AI "factories" will become the norm in the future. Unfortunately, when it comes to energy efficiency, AI systems trail humans by a wide margin: the amount of brain output we get per joule is massive compared to what AI systems manage.
In the United States, a recent move was made away from solar and wind: all subsidies tied to those technologies were abolished. Regardless of one's view on them, the reality is that all forms of energy need to increase. This applies to both renewables and fossil fuels.
Could the US be joining Europe in shooting itself in the foot? By making renewables more costly to implement, scaling could be a problem. Drilling can only take one so far. Europe went the opposite direction, killing off much fossil fuel production under its Net Zero policies.
One country that is not making this mistake is China. It is acquiring as much energy as possible, regardless of the source, with a commitment to both fossil fuels and renewables.
AI is energy hungry. Efficiency gains in chips help, but they arrive at a slower pace than demand grows.
Elon Musk took a novel approach to the energy situation.
To power the expansion of xAI's Memphis cluster, Musk went and bought a natural gas power plant.
A post on X detailed some of the happenings with xAI in Memphis. Musk's response is the key.
“They just bought a power plant from overseas and are shipping it to the US because they couldn’t get a new one [built] in time,” Patel said. “They’re doing all this crazy shit to get the compute.”
Musk responded to the post, describing it as “accurate.”
Although the company declined to comment on the situation, xAI is believed to be pushing toward 1 million GPUs in Memphis by the end of the year, with an estimated 340K H100 equivalents already running.
Some believe the power plant is 2 GW. There is a reason why so much power is required.
The company has announced plans for a second data center in the city, following the purchase of a one million sq ft (92,903 sqm) site on Tulane Road, which is located in the Whitehaven area of Memphis. The site is located next to the Southaven Combined Cycle natural gas power plant, which generates 780MW of power, but even if it were to gain access to the entire output of the facility, it would not be enough to cover the needs of one million GPUs and other IT equipment.
With the release of Grok4, many feel the race for AI dominance is being taken to another level. We are nearing the time when human-level "intelligence" is surpassed.
The Need For Inference
The world is in a major crunch for inference. This is one of the bottlenecks the AI world is facing.
When looking at what AI companies are doing with these models, there are two types of compute.
We first have training, where data is fed into the model to adjust its parameters. On top of this, we have inference compute, which handles the answers to prompts.
As these models advance, we see more "features" added. For example, a generation ago we saw models gain the ability to "remember" the conversation. This is a bit misleading, since mechanically the previous exchanges are simply fed back in with each new prompt. Here is where context window size is important.
Over the last few generations, context windows have exploded. This is the amount of data that can be fed into the model at once, such as an entire book. "Remembering" simply means the previous discussion is included in the context window input.
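The mechanics described above can be sketched in a few lines. This is a minimal illustration, not any vendor's actual API: the model is stateless, so each request re-sends prior turns inside the context window, trimming the oldest turns when the window would overflow. The function names and token limit here are my own assumptions.

```python
# Sketch of chat "memory": prior turns are re-sent with every prompt.
# build_prompt and MAX_CONTEXT_TOKENS are illustrative, not a real API.

MAX_CONTEXT_TOKENS = 128_000  # assumed window size for a modern long-context model

def count_tokens(text: str) -> int:
    # Rough proxy; real systems use a tokenizer (~4 chars/token is a common estimate).
    return max(1, len(text) // 4)

def build_prompt(history: list[tuple[str, str]], new_question: str) -> str:
    """Concatenate past turns plus the new question, dropping the oldest
    turns first if the context window would overflow."""
    turns = [f"{role}: {text}" for role, text in history]
    turns.append(f"user: {new_question}")
    while sum(count_tokens(t) for t in turns) > MAX_CONTEXT_TOKENS and len(turns) > 1:
        turns.pop(0)  # forget the oldest turn to stay inside the window
    return "\n".join(turns)

history = [
    ("user", "What is inference?"),
    ("assistant", "Answering prompts with an already-trained model."),
]
prompt = build_prompt(history, "And what is training?")
print(prompt)
```

The point is that "memory" costs inference compute: every new prompt re-processes the whole accumulated conversation, which is one reason longer context windows drive demand up.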
In the Grok4 demo, we saw some prompts that took several minutes to answer. These were highly sophisticated processes, meaning a large amount of compute was utilized. This is inference.
We are likely already at the point where inference compute outweighs training. That is why these clusters are getting so large.
Massive Energy Requirements
It does not take a genius to figure out that energy demand is going to explode. There are only a few ways to get more bang out of these systems:
- use more power efficient chips
- use batteries and other technologies to smooth electrical generation
- generate more electricity
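A back-of-envelope calculation shows why the third option dominates. The numbers below are my own assumptions, not figures from xAI: roughly 700 W per H100-class GPU, and a PUE (power usage effectiveness) of about 1.4 to cover cooling, networking, and other facility overhead.

```python
# Rough power estimate for a 1-million-GPU cluster.
# Assumptions: ~700 W per H100-class GPU, PUE of ~1.4 for facility overhead.

NUM_GPUS = 1_000_000
WATTS_PER_GPU = 700   # approximate H100-class board power
PUE = 1.4             # facility overhead multiplier (cooling, networking, etc.)

gpu_power_mw = NUM_GPUS * WATTS_PER_GPU / 1e6  # convert watts to megawatts
facility_power_mw = gpu_power_mw * PUE

print(f"GPU draw alone: {gpu_power_mw:.0f} MW")   # 700 MW
print(f"With overhead:  {facility_power_mw:.0f} MW")  # 980 MW
```

Under these assumptions the cluster needs close to a gigawatt, which is why a 780MW plant next door would not suffice and why a 2 GW plant is a plausible target.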
The first two only go so far. Companies like Nvidia improve chip efficiency with each generation, but a new generation only arrives every couple of years. Ultimately, it boils down to the third one: where is the energy going to come from? More importantly, how quickly can it be put online?
This is why Musk is moving a plant from an unnamed country. xAI simply cannot wait for a new plant to be built from scratch.
The race is on and it all boils down to energy.
Posted Using INLEO