RE: LeoThread 2025-04-10 05:05

Why do chips and AI have to be this expensive for customers?
AI does not have to be as expensive as it is today, and it won't be in the future. Chips are the biggest culprit. Most AI to date has been built on one chip provider, and it's pricey. Trainium should help, as our new Trainium2 chips offer 30-40% better price-performance than the GPU-powered compute instances generally available today.

While model training still accounts for a large share of total AI spend, inference (the predictions or outputs the models produce) will represent the overwhelming majority of future AI cost, because customers train their models periodically but produce inferences constantly in large-scale AI applications. Inference will become another building block service, along with compute, storage, database, and others. We feel a strong urgency to make inference less expensive for customers.
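To make the training-versus-inference point concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure in it (training runs per year, cost per run, request rate, price per thousand requests) is a made-up placeholder, not an AWS or Trainium number; the only point is that a cost paid a few times a year gets dwarfed by a cost that accrues on every request.

```python
# Hypothetical yearly cost comparison: periodic training vs. continuous inference.
# All numbers below are illustrative assumptions, not real pricing.

TRAIN_RUNS_PER_YEAR = 4           # assume the model is retrained quarterly
COST_PER_TRAIN_RUN = 500_000.0    # assumed cost of one training run, in USD

REQUESTS_PER_SECOND = 10_000      # assumed steady inference traffic
COST_PER_1K_REQUESTS = 0.05       # assumed inference cost per 1,000 requests, in USD
SECONDS_PER_YEAR = 365 * 24 * 3600

training_cost = TRAIN_RUNS_PER_YEAR * COST_PER_TRAIN_RUN
inference_cost = REQUESTS_PER_SECOND * SECONDS_PER_YEAR / 1_000 * COST_PER_1K_REQUESTS

total = training_cost + inference_cost
print(f"Training:  ${training_cost:,.0f}  ({training_cost / total:.0%} of total)")
print(f"Inference: ${inference_cost:,.0f}  ({inference_cost / total:.0%} of total)")
```

With these placeholder numbers, training comes to about $2M a year while inference comes to roughly $15.8M (close to 90% of the total), which is why shaving the per-inference price has far more leverage than cheaper training alone.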


