LeoAI And Adding Context

A lot is being made of AI and LLMs. There is good reason for this.

As we dive deeper into Web 3.0, we have to realize what it means to create a decentralized Internet. For those who are focused upon LEO, this means LeoAI.

We see AI being added all over the Internet. While it is probably a bit premature in some instances, we can see how Google, Meta, and X are incorporating it into their platforms.

This means that Web 3.0 must do the same. After all, if the tools are getting better on Web 2.0, why would anyone switch?

However, these models presently have some drawbacks. For that reason, we will dive into what we are dealing with.

https://inleo.io/threads/view/taskmaster4450le/re-lfsrsdssef

LeoAI: Data To Context

We talk a great deal about getting data into a permissionless database. This is really just the beginning.

The above short goes through the three phases of data. As we can see, they are:

  • data
  • information
  • knowledge

The last one is where we want to focus our attention.

Before getting to that, let's briefly run through the other two.

Data is raw. This is simply what is entered into a database. While there could be some structure to it, such as with tables, for the most part it is a mess. This is especially true for data from social media platforms.

Information is the second tier. This is where data is cleaned up and made useful. Labelling is done so the model grasps what it is dealing with. Vector databases are created, connecting the information based upon its relevancy to other entries.
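
To make the vector database idea concrete, here is a toy sketch in Python. The hashing-based embed function and the sample entries are illustrative assumptions, not a description of how LeoAI actually works; real pipelines use learned embedding models.

    import numpy as np

    DIM = 64

    def embed(text):
        # Toy embedding: hash each token into a fixed-size vector,
        # then normalize so dot products behave like cosine similarity.
        # A real system would use a trained embedding model here.
        vec = np.zeros(DIM)
        for token in text.lower().split():
            vec[hash(token) % DIM] += 1.0
        norm = np.linalg.norm(vec)
        return vec / norm if norm else vec

    # "Information" tier: cleaned-up, labelled entries (labels are made up).
    entries = [
        ("Hive is a blockchain for decentralized social content", "crypto"),
        ("A hive houses a colony of honeybees", "nature"),
        ("LEO is a token on the Hive blockchain", "crypto"),
    ]
    vectors = np.stack([embed(text) for text, _ in entries])

    def nearest(query, k=2):
        # Rank stored entries by similarity to the query vector.
        scores = vectors @ embed(query)
        top = np.argsort(scores)[::-1][:k]
        return [(entries[i][0], entries[i][1], round(float(scores[i]), 3)) for i in top]

    print(nearest("what is the Hive blockchain?"))

The hashing trick is just a stand-in, but the retrieval step, ranking stored information by its relevance to a query, is the same idea a real vector database implements.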

The final area is knowledge. This can be summed up with the word context. All the data in the world means nothing if it has no context. This is where many models are presently struggling.

Of course, as time elapses, this changes. More information actually helps to form conceptions of what we are dealing with. This is why adding more data is crucial. As more topics are discussed, a framework develops from which to formulate understanding.

This is basically what humans do.


Image generated by Ideogram

Feed The Beast

When it comes to data, AI is a hungry animal.

Many are speculating that we are going to run out of data in the next couple of years. Each iteration of models requires more than was previously used. Personally, once embedded AI starts to roll out (mostly through robots), I fail to see how this will be a problem.

That said, we need to keep feeding the beast simply to help it gain context. Raw data does not amount to much. Even in the two-dimensional realm, there are limits. However, unless one has a robot to use, this is what we are stuck with.

It does not mean all is lost. Actually, quite the contrary. These models are improving, meaning they are better able to understand what is fed to them. However, they still require massive amounts of human-generated content to help conceptualize what we humans deal with. After all, most of what we have was designed by humans.

For example, how would AI understand the corporation? What is meant by it? We could symbolize it with a large building in a major city. Or the logo of one of the more well-known companies could be used. Naturally, a corporation is really just articles on a piece of paper (or a computer).

As you can see, a lot is required to even have the slightest understanding of what we are dealing with.

This is why feeding in data is crucial. We are not training it on the corporation in particular. Instead, we have to help it separate the physical from the real and compare both to the digital. A corporation doesn't really exist in the physical world (a building or plant does, though), yet it is part of the real world.

Here we are dealing with just one example.

If we want LeoAI to have strong utility, it is important to give it context. For example, what is Hive? Compare this to what a bee might experience. As we know, there are differences.

The more that is fed into the engine, the more information is generated. From this, we hope to foster a path towards context, i.e. knowledge, about the information it receives.
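
One common way to turn stored information into context for a model is retrieval-augmented generation: look up relevant entries, then place them in front of the question. The sketch below is hypothetical; retrieve stands in for a vector-database lookup like the one sketched earlier, and the prompt format is an assumption, since nothing here describes LeoAI's internals.

    def retrieve(question):
        # Stand-in for a vector-database lookup; a real system would
        # return the entries most similar to the question.
        return [
            "Hive is a blockchain for decentralized social content.",
            "LEO is a token on the Hive blockchain.",
        ]

    def build_prompt(question):
        # Prepend retrieved facts so the model answers with context,
        # not just from its raw training data.
        context = "\n".join(f"- {fact}" for fact in retrieve(question))
        return f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"

    print(build_prompt("What is Hive?"))

With the right entries retrieved, the model can tell the blockchain apart from the beehive; without them, it is guessing from raw training data alone.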

This is also why it is crucial to fill the database with information in different languages. Simply translating something will often miss the context and nuances of the language.

It is crucial the model be trained on this also.


What Is Hive

Posted Using InLeo Alpha



13 comments

This is a great one, thanks for sharing. I think to make all of this possible, we need to help LeoAI by feeding it more context. Just like the human brain, which gets better the more you learn something new. Things are really getting better with AI.


Context is crucial. We take it for granted since we are always linking things up in our minds when we learn something. Our first reaction is to try to make sense of new information by framing it against what we know.


Hmm, that's another angle, though. I get that point. By doing this, we will be able to generate new ideas.


Does LeoAI only get its data from content posted through the Leo app/interface, or anything on Hive? If it is the latter, then it shouldn't run out of data anytime soon as long as Hive itself is alive and kicking, although it will have a limited type of content to learn from. I don't really see a lot of people posting memes and cute cat videos.


LeoAI is trained on all Hive data, so whatever is on the blockchain, including votes and transactions, is computed. As for memes and cat videos, they aren't on the blockchain, and LeoAI isn't multimodal as far as I know, so none of that factors in.

The data is minimal. It will go through it quickly. We need a ton more to be produced.


You share many experiences with us. I hope you observe my yoga practice at my 71 years and can give your opinion on its value. I wish you much success. I will always be in contact with you, so that you can continue to share many experiences with me. From Venezuela, I wish you success.


"Many are speculating that we are going to run out of data in the next couple years"

If this happens, will it limit the productivity of artificial intelligence?


Gonna gravitate back to LeoGlossary. Is this a case you can make that we want a wiki-style db on Leo/Hive which gives context and connections between data?


Somewhat, but our structure and computer structure are two different things. Connected data is helpful, which is why links are important. However, computers excel at things like tables, where the data is already structured in a way they can easily understand.


For those who are focused upon LEO, this means LeoAI.

LeoAI would be huge.


When it comes to data, AI is a hungry animal. I used to think the Internet was choked up with data until AI showed up. It's one of those advancements we will always see in tech: 'too much becoming not good enough'. It will be a good thing for Web3 platforms to also make early moves. Hoping LeoAI fits in just fine in the long run.
