>Some believe these things will start to degrade from training on their own output. They call that “model collapse”.

That is already happening. They’re still scraping new text for training, and a lot of that is LLM-generated.
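The "model collapse" idea above can be shown with a toy sketch: fit a distribution to data, then keep re-fitting to samples drawn from the previous fit. This is a hypothetical illustration with made-up numbers, not any real training pipeline, but the shrinking-variance effect is the same failure mode people mean.

```python
import random
import statistics

# Toy sketch of "model collapse" (hypothetical, not a real pipeline):
# fit a Gaussian to data, then repeatedly re-fit to small samples drawn
# from the previous generation's fit. Each re-fit loses a little spread,
# so the estimated variance drifts toward zero over generations.
random.seed(0)
mu, sigma = 0.0, 1.0  # the "real" distribution the first model sees
for generation in range(300):
    samples = [random.gauss(mu, sigma) for _ in range(4)]  # tiny "dataset"
    mu = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
print(sigma)  # for most seeds, far smaller than the original 1.0
```

Each generation only ever sees its parent's output, so the tails of the original distribution are sampled less and less until they vanish.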
>Some believe these things will start to degrade from training on their own output. They call that “model collapse”.
I call it sniffing your own farts.
jews and cheap pajeet labor are behind the trainings, so that metaphor is accurate.
[The Fecal Fixation of the Chosen Ones](https://archive.ph/BAW1u)
[PDF Mirror](https://vid8.poal.co/uploads/AOU/files/The_fecal_fixation_of_the_chosen_ones.pdf)