RE: LeoThread 2025-03-14 14:11
Grok is impressive. We need more users on InLeo to build up data, or another way of capturing more data, so Khal's team can make a butt-kicking AI.
What was the total GPU count that Grok 3 was trained on? Is that information available?
According to Grok:
Grok 3 was trained on a total of 100,000 Nvidia H100 GPUs for its initial pretraining phase. This figure is based on information from xAI and Elon Musk regarding the use of the Colossus supercomputer cluster in Memphis, Tennessee. While the cluster was later expanded to 200,000 GPUs (including H100 and H200 models), the 100,000 H100 GPUs are consistently cited as the total used for Grok 3's core training, with approximately 200 million GPU-hours of compute time.
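Those two numbers are at least internally consistent. Here's a quick back-of-the-envelope check, a sketch using only the figures Grok quoted above (which remain unverified):

```python
# Sanity-check the figures quoted above. Both numbers are Grok's own
# claims, not independently verified.
gpus = 100_000            # H100 GPUs cited for Grok 3 pretraining
gpu_hours = 200_000_000   # total compute time cited, in GPU-hours

# If all GPUs ran in parallel, wall-clock time = GPU-hours / GPU count.
wall_clock_hours = gpu_hours / gpus
wall_clock_days = wall_clock_hours / 24

print(f"{wall_clock_hours:,.0f} hours -> {wall_clock_days:.0f} days of training")
# Output: 2,000 hours -> 83 days of training
```

So if both figures are right, they imply roughly 83 days of continuous training across the full cluster.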
My guess is that information is not available, but I am not sure.
Maybe there is a way to tap into that info by interfacing with Grok directly, but I am not tech-savvy at all.
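For anyone who wants to try, xAI does offer a developer API for Grok. Here's a minimal sketch of asking it a question programmatically, assuming xAI's OpenAI-compatible chat-completions endpoint; the endpoint URL, the model id (`grok-3`), and the response shape are assumptions to verify against the current xAI docs, and `XAI_API_KEY` is a placeholder for your own key:

```python
import os
import requests

# Assumed endpoint and model id -- check xAI's docs for current values.
# Export your key first: export XAI_API_KEY="..."
resp = requests.post(
    "https://api.x.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['XAI_API_KEY']}"},
    json={
        "model": "grok-3",  # assumed model id
        "messages": [
            {"role": "user",
             "content": "How many GPUs was Grok 3 trained on?"}
        ],
    },
    timeout=60,
)
resp.raise_for_status()

# Assumes the standard OpenAI-style response layout.
print(resp.json()["choices"][0]["message"]["content"])
```

Keep in mind this just returns whatever Grok says, so it's the same self-reported figure quoted above, not an independent source.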
Gathering a diverse user base is key: more insights, better data. Let's welcome everyone and keep this conversation buzzing so Khal's team can craft some truly next-level AI innovation.