
©2025 Poal.co


Archive: https://archive.today/0Neia

From the post:

>The Colossus 2 supercomputer, together with its predecessor, Colossus 1, are used by xAI to primarily train and refine the company’s Grok large language model. In a post on X, Musk stated that Colossus 2 is already operational, making it the first gigawatt training cluster in the world. But what’s even more remarkable is that it would be upgraded to 1.5 GW of power in April. Even in its current iteration, however, the Colossus 2 supercomputer already exceeds the peak demand of San Francisco.

[–] 2 pts

Anyone find it kind of weird they need that much power to try and fail to mimic a human brain?
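The scale gap the comment is pointing at can be made concrete with a quick back-of-the-envelope calculation. The 1 GW and 1.5 GW figures come from the post; the ~20 W figure for the human brain is a commonly cited estimate, not something the article states:

```python
# Rough power comparison: Colossus 2 vs. one human brain.
# Cluster figures are from the post; the ~20 W brain figure is an
# assumed, commonly cited estimate.
cluster_power_w = 1e9      # Colossus 2: ~1 GW, per the post
planned_power_w = 1.5e9    # planned upgrade: 1.5 GW, per the post
brain_power_w = 20.0       # typical estimate for a human brain

ratio = cluster_power_w / brain_power_w
print(f"Colossus 2 draws roughly {ratio:,.0f}x the power of one brain")
# → Colossus 2 draws roughly 50,000,000x the power of one brain
```

On those assumptions, the cluster burns about fifty million brains' worth of power, and the planned upgrade pushes that to seventy-five million.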

[–] 1 pt

I guess that also shows how fucking amazing the human brain can be.

[–] 1 pt

Not weird, pathetic.

You make a good point though. When they dump that many resources into it and still fail, they are clearly doing something wrong.