
©2025 Poal.co

(post is archived)

[–] 3 pts

That is already happening. They're still scraping new text for training, and a lot of that text is LLM-generated.

Some believe these things will start to degrade from training on their own output. They call that “model collapse”.
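The degradation mechanism can be illustrated with a toy sketch (not the actual training dynamics of any real LLM): treat each "generation" as training purely on samples of the previous generation's output, modeled here as bootstrap resampling. Since resampling with replacement can never introduce new values, diversity can only shrink, and the population drifts toward a handful of repeated outputs.

```python
import random

random.seed(0)

# Start with 100 distinct "texts" (stand-ins, here just integers 0..99).
population = list(range(100))

distinct_counts = [len(set(population))]
for generation in range(300):
    # Each generation "trains" only on the previous generation's output:
    # sampling with replacement means no new values can ever appear,
    # so the number of distinct values is non-increasing.
    population = [random.choice(population) for _ in population]
    distinct_counts.append(len(set(population)))

print(distinct_counts[0], "->", distinct_counts[-1])
```

With the fixed seed this collapses from 100 distinct values down to a few; it is only an analogy for the diversity loss people mean by "model collapse", not a claim about how often it occurs in practice.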

[–] 0 pt

> Some believe these things will start to degrade from training on their own output. They call that "model collapse".

I call it sniffing your own farts.

[–] 2 pts

jews and cheap pajeet labor are behind the trainings, so that metaphor is accurate.