The Future: Energy To Intelligence
It is going to be a world of tokens.
The tokenization of everything is something I have written about for a number of years. Of course, this conjures up thoughts of crypto, an association that is not completely invalid.
However, to fully understand this situation, we need to head towards a more basic level. For this reason, we will look at the idea of tokens as a unit of measure. I think we are going to see this become even more important in the future.
Most people are aware of tokens via the cryptocurrency path; for the masses, that was the first introduction to the idea. That said, tokens are something that those involved with different aspects of computer science are very familiar with. The concept predates crypto by a wide margin, providing quantitative units (among other things) for developers.
In this article, we will ignore many of the applications of tokens. The primary focus will be on the token as a quantifier.
What is a token? According to Grok, in an NLP (natural language processing) system, it is as follows:
In NLP, a token is a single unit of text, typically a word or punctuation mark, derived from splitting a sentence or document. For example, the sentence "I love coding!" might be tokenized into ["I", "love", "coding", "!"]. Tokenization is a key step in processing text for machine learning models.
There are other representations of this, yet it sums up what we need.
A token basically quantifies a unit of data. It can mean different things, as the definition above shows. It is the way data, synthetic or human-generated, is quantified so it can be processed alongside human language.
In other words, it is a unit of either input or output.
We see the emergence of this concept with large language models (LLMs). When describing the capabilities of a model, one of the things that is often pointed to is the "token window" (more formally, the context window). This defines the amount of data that can be fed into a chatbot at one time.
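To make this concrete, here is a minimal sketch of tokenization and token counting in Python. It uses a toy word-and-punctuation splitter rather than the subword tokenizers real models use, and the context-window size is an assumed illustrative figure, not any particular model's limit.

```python
import re

def tokenize(text: str) -> list[str]:
    # Toy tokenizer: split text into words and punctuation marks.
    # Real LLMs use subword schemes (e.g. byte-pair encoding), so counts differ.
    return re.findall(r"\w+|[^\w\s]", text)

print(tokenize("I love coding!"))  # ['I', 'love', 'coding', '!']

# Assumed illustrative context-window size (in tokens).
CONTEXT_WINDOW = 8_192

prompt = "I love coding!"
used = len(tokenize(prompt))
print(f"{used} tokens used, {CONTEXT_WINDOW - used} remaining in the window")
```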
The output side is especially important for our future endeavors.
Our future is one of transforming energy into intelligence. In other words, the goal going forward is to produce tokens more quickly and to keep increasing the number that are output.
Energy Turning Into Intelligence
We can debate all day long what defines intelligence. There are many theorizing about when we will see AGI or ASI (superintelligence). That is nothing more than speculation, although some propose that AGI is already here.
Wherever we stand at the moment, we know things will keep progressing. When we look at the components (data, algorithms, and compute), we know all are increasing and/or improving.
This means that "intelligence" will be output. What, however, is the input?
Here is where energy enters the picture. Certainly, we need data. Yet without energy (electricity), nothing works. GPUs depend upon it. It is why Big Tech is scrambling to gain access to enough electricity.
Many have heard of the Kardashev Scale. Again, when asking Grok:
The Kardashev Scale is a theoretical framework proposed by Soviet astronomer Nikolai Kardashev in 1964 to classify the technological advancement of civilizations based on their ability to harness and utilize energy.
The scale is basically broken down into three levels. We won't go deep into it here. The key point is that Earth is roughly at 0.7 on the way to becoming a Type I civilization, one that harnesses all the power available on its planet.
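As a rough illustration of where that 0.7 figure comes from, here is a small sketch using Carl Sagan's interpolation formula for the Kardashev rating. The power figure is an assumed ballpark for humanity's current consumption, not a precise measurement.

```python
import math

def kardashev_rating(power_watts: float) -> float:
    # Sagan's interpolation: K = (log10(P) - 6) / 10, with P in watts.
    # Type I corresponds to roughly 10^16 W.
    return (math.log10(power_watts) - 6) / 10

# Assumed ballpark for humanity's current power use: ~2e13 watts.
earth_power = 2e13
print(f"Earth is roughly a Type {kardashev_rating(earth_power):.2f} civilization")  # ~0.73
```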
The idea of Energy-to-Intelligence can be altered a bit if we swap "intelligence" for "tokens". This might simplify the concept into a framework we can work with.
Following this line of thinking, the goal becomes clear. It is simply more tokens. We need them to increase in quantity and emerge at a faster pace. Obviously, more powerful compute helps a great deal. The same is true for the amount of synthetic data being generated.
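A back-of-the-envelope sketch shows how energy translates into token output. Both numbers below are assumptions picked for illustration: the power budget of a hypothetical data center and a made-up energy cost per generated token.

```python
def tokens_per_second(power_watts: float, joules_per_token: float) -> float:
    # One watt is one joule per second, so power divided by the per-token
    # energy cost gives a steady-state token generation rate.
    return power_watts / joules_per_token

# Assumed illustrative figures, not measured values.
power = 10e6              # a 10 MW facility
energy_per_token = 0.5    # joules consumed per generated token

rate = tokens_per_second(power, energy_per_token)
print(f"~{rate:,.0f} tokens per second")        # ~20,000,000
print(f"~{rate * 86_400:,.0f} tokens per day")
```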
Speed Is Key
We are looking at a realm where the time from energy-to-tokens will be crucial. More advanced "intelligence" is going to operate at a much faster pace than we see today. It is evident in all the potential bottlenecks experts cite.
For example, many surmise that data is going to be a problem. The tech companies have already scraped most of what is available online. Even though we are producing more as time passes, it is not enough to keep pace.
In this situation, the data we are talking about is the input into the system. We are basically talking about requiring more tokens to feed the AI models and applications that are being created.
Of course, this requires energy to power the GPUs which are used for both training and inference. The result is the output of more tokens.
Acceleration is the norm. Each individual component of the system has to be faster. This is the focus of teams in those realms. Whether we look at communications, processing, storage, or energy production, everyone is looking to speed things up.
It is a fact of life when dealing with exponentials.
Hopefully we now have a clear grasp of how the future framework looks. This is a simplistic view, but I think it narrows things down to the essentials.
From here, we can monitor the progress that is being made.
Posted Using INLEO
Data is the lifeblood of every technological endeavour and, quite simply, valuable for the economy we see around us.
Tokenization has been successful up to this point, and the response worldwide has been huge. It's not just about the freedom one has to grab the right tokens for personal use; it is also about companies wanting to retain their hold by pushing native coins into the market as a supremacy move.
There is definitely going to be more progress no matter where we might be.
A future where energy fuels not just technology but the very fabric of intelligence itself is worth savouring. The race for faster tokens and smarter systems is truly expanding.
I think the questions are:
Do we have the moral and social capacity to regulate this rapid pace?
And as societies, will we be ready for this overflow of "symbolic intelligence"?
Human civilization has already stepped into a hyper-quantified world where it seeks to break the measurable down into discrete, processable fragments; that is where we need the assistance of AI, blockchain, LLMs, etc.
In a highly technological world, intelligence is deemed the result of computational throughput. Everything is poised for tokenization, and that is coming for sure, but the question is whether we should overshoot this tokenization function.
Let's preserve the subjective experience as a private good and keep some aspects of life as un-measurable and un-quantifiable.
Technological acceleration, by its nature, pushes for endless optimization. While that is good, it should be subject to the collective needs of society and not just commerce.