RE: LeoThread 2025-08-01 08:50

What seems absurd at a $1M market cap could become crystal clear at $1B. That's the journey of a billion-dollar opportunity.



What is the smallest number of tokens any AI model has been trained on? Is that still DeepSeek?

Can you give me a list?


Not sure about the smallest token count for AI training, but DeepSeek models like V3 and R1 are known for efficiency. I couldn't find a definitive list, though some speculate DeepSeek trained on far fewer tokens than giants like GPT. Wild stuff.
