Economic Singularity: GDP Per Chip

What does the future economy look like?

This is something many are speculating upon. The shift into the digital world is progressing at stunning speed. Each year we see not only more compute rolled out, but performance per dollar keeps improving.

Moore's Law drove the greatest economic leap in human history, yet even that is a snail's pace compared to the pace being set by AI. Whether we look at chips, algorithms, or data, each is moving forward. Taken together, the picture becomes mind-blowing.

It is hard to believe that our future economy will resemble what we have today. For this reason, I keep writing about the economic singularity, to help people wrap their minds around what might be possible.

For example, the idea of a 3% global GDP growth rate could come to look absurd. With chips, algorithms, and robots, there is no reason a limit like that should hold.

Hence, we are not only looking at new growth levels but also at new metrics needed to capture what is taking place.


Economic Singularity: GDP Per Chip

The present economic model relies on the GDP per capita metric. This is simply a country's GDP divided by its population.

Overall, this is a useful metric for charting a country's standard of living. While China is the second largest economy, its GDP per capita places it around 70th. We can compare that to countries like Monaco, which have much smaller populations but off-the-charts wealth.
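As a quick sketch of the metric (the figures below are rough, round-number approximations for illustration, not official statistics):

```python
# GDP per capita = a country's GDP divided by its population.
def gdp_per_capita(gdp_usd: float, population: int) -> float:
    return gdp_usd / population

# Rough, illustrative figures:
china = gdp_per_capita(18e12, 1_400_000_000)   # roughly $13,000 per person
monaco = gdp_per_capita(8.6e9, 36_000)         # roughly $240,000 per person

print(f"China:  ${china:,.0f}")
print(f"Monaco: ${monaco:,.0f}")
```

Despite a vastly smaller total economy, Monaco's per-person figure dwarfs China's, which is exactly the contrast the metric is built to show.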

Economists break GDP into components in a number of ways. Some simplify it down to population and technology, which captures the leverage that technology provides to a population.

A case in point is the tractor. Agricultural output in the United States has skyrocketed over the last 100 years while the population in that industry declined by 98%. Technology is the difference maker.

I wrote a number of articles about the expected decline in labor. We are seeing the foundation of the economy changing. If we narrow it down to labor and capital, we see the labor component dwindling due to AI and robots whereas the capital side expands.

In other words, jobs could be going away, at least from the perspective of being done by humans.

While we can still use GDP per capita, as it does capture economic output driven by technology, what if we used something different?

GDP Per Chip

What is the output of a chip?

If we are indeed moving to a world of, as Jensen Huang says, "AI factories", where data centers are the basis for incredible sums of economic output, then perhaps looking at this closer is warranted.

The digital world is the driver of all that will come forth. We can presume the bridge will be robots and other physical devices that are directed by the AI systems.

Hence, we can conclude that most products in the physical world will come with chips.

My assertion here is that chips are going to be at the core of the majority of economic productivity. What this means is that the chip becomes the basic economic unit.

If we adopt this approach, we can see how GDP per chip becomes vital, not only for a nation but also companies.
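The post does not pin down an exact formula, but the simplest version just swaps the denominator: output divided by chips deployed. The firms and numbers below are hypothetical.

```python
def gdp_per_chip(output_usd: float, chips_deployed: int) -> float:
    """Economic output attributed to each unit of compute (one chip)."""
    return output_usd / chips_deployed

# Two hypothetical firms with identical compute but very different output:
firm_a = gdp_per_chip(50e9, 1_000_000)  # $50,000 of output per chip
firm_b = gdp_per_chip(5e9, 1_000_000)   # $5,000 of output per chip

print(firm_a, firm_b)
```

The same compute base yields a 10x difference, which is why the numerator (monetized output) matters as much as the chip count in the denominator.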

How much compute does a company have? This is the most basic question under this scenario. The output matters little if no compute is available. Companies that stock up on compute are going to find themselves as the leaders in the future.

Naturally, there is still the output required. Buying a bunch of GPUs means nothing unless the data center is producing something people utilize. This is the gap companies have to span. The challenge of LLM monetization is a prime example.

Economic Singularity

Could a company be worth $10 trillion? $25 trillion? $50 trillion?

The answer is most definitely. We are dealing with an economy based upon the digital realm. This is not constrained by the limitations faced in the past.

When we look at cost per unit of compute, it is dropping like a rock. The latest iteration of GPUs from Nvidia (Blackwell) shows this: the line draws less energy than its predecessor while offering greater processing power for the same money. Thus, we see the cost per unit of compute falling.

Here we have a trend that is not going to stop anytime soon. Jensen Huang said Nvidia delivered a billion-fold improvement in AI compute for the money over the last decade.
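Taking that billion-fold-per-decade figure at face value (it is a cited soundbite, not a verified statistic), the extrapolation looks like this:

```python
# Extrapolate cost per unit of AI compute if it falls N-fold each decade.
def cost_after(cost_now: float, fold_per_decade: float, decades: int) -> float:
    return cost_now / (fold_per_decade ** decades)

# A workload costing $1,000 in compute today, at a billion-fold drop per decade,
# would cost about a ten-thousandth of a cent by the mid-2030s:
future_cost = cost_after(1_000.0, 1e9, 1)
print(future_cost)
```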

Think about that for a second and what it means for the economy over the next 10 years. Where will we be in 2035 if this is repeated?

We could see the global economy growing at 30%, 50% or even 100% as we move towards 2040. The ramifications of this, as compared to the historic average of 3%, should be clear.
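The gap between these growth rates compounds dramatically. A minimal sketch, assuming a round ~$100 trillion global GDP as the starting point and a 15-year horizon:

```python
# Compound growth: gdp_future = gdp_now * (1 + rate) ** years
def project(gdp_now: float, rate: float, years: int) -> float:
    return gdp_now * (1 + rate) ** years

base = 100e12  # ~$100T global GDP, a rough round number
for rate in (0.03, 0.30, 0.50, 1.00):
    print(f"{rate:.0%} growth for 15 years -> ${project(base, rate, 15) / 1e12:,.0f}T")
```

At the historic 3%, the economy grows about 56% over 15 years; at 30%, it grows roughly fifty-fold. That is the scale of the difference being discussed.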

Basically, it is only a matter of how much more output per chip we are getting.

Posted Using INLEO


