Grok 3 Trains On 100,000 GPUs
Elon Musk reveals how many Nvidia H100 chips his AI chatbot will be trained on
Why is this important? So far, none of the big five labs (OpenAI, Microsoft, Google, Anthropic, Meta) has disclosed the computational power required to train its models; they treat that information as IP. We have had only educated guesses, but now we know that Grok 3 will train on at least 100,000 H100 GPUs.
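To put 100,000 H100s in perspective, here is a hedged back-of-envelope sketch of the cluster's aggregate compute. The per-GPU figure (~989 TFLOPS peak dense BF16) comes from Nvidia's published H100 specs, not from the article, and the 90-day run at 40% utilization is purely an illustrative assumption:

```python
# Back-of-envelope estimate of aggregate peak compute for a 100k-GPU cluster.
# Assumptions (not from the article): ~989 TFLOPS peak dense BF16 per H100,
# all GPUs contributing simultaneously.
H100_PEAK_BF16_FLOPS = 989e12  # ~989 TFLOPS per GPU (dense, no sparsity)
NUM_GPUS = 100_000

aggregate_flops = NUM_GPUS * H100_PEAK_BF16_FLOPS
print(f"Aggregate peak: {aggregate_flops:.2e} FLOP/s")

# Total compute for a hypothetical 90-day run at 40% utilization
# (both figures are illustrative assumptions, not reported numbers):
seconds = 90 * 24 * 3600
utilization = 0.40
total_flop = aggregate_flops * seconds * utilization
print(f"Total over 90 days at 40% utilization: {total_flop:.2e} FLOP")
```

Under these assumptions the cluster peaks near 10^20 FLOP/s, which is why the raw GPU count alone is a meaningful disclosure.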