It's the incredible shrinking Internet. Pretty soon these models will be trained on their own responses to other people's questions.
That is already happening. They're still scraping new text for training, and a lot of that text is LLM-generated.
Some believe these things will start to degrade from training on their own output. They call that “model collapse”.
I call it sniffing your own farts.
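For anyone curious what that degradation actually looks like, here's a toy sketch (my own illustration, not anyone's real training pipeline): fit a simple Gaussian "model" to data, then repeatedly refit it to samples drawn from its own previous fit. Estimation noise compounds each generation, and the distribution tends to narrow toward collapse.

```python
import random
import statistics

# Toy "model collapse" demo: each generation is trained (fit) only on
# samples from the previous generation's model. This is a hypothetical
# sketch with a Gaussian standing in for an LLM.
random.seed(42)

SAMPLES_PER_GEN = 10   # small samples make the drift visible quickly
GENERATIONS = 200

# Generation 0: fit to "real" data drawn from a standard normal.
data = [random.gauss(0.0, 1.0) for _ in range(SAMPLES_PER_GEN)]
mu, sigma = statistics.fmean(data), statistics.stdev(data)
variances = [sigma ** 2]

for gen in range(GENERATIONS):
    # Train only on the previous model's own output.
    synthetic = [random.gauss(mu, sigma) for _ in range(SAMPLES_PER_GEN)]
    mu, sigma = statistics.fmean(synthetic), statistics.stdev(synthetic)
    variances.append(sigma ** 2)

print(f"variance at gen 0:   {variances[0]:.4f}")
print(f"variance at gen {GENERATIONS}: {variances[-1]:.6f}")
```

Each refit is a noisy estimate, and the noise is multiplicative, so the estimated variance does a downward-biased random walk: run it and the final variance is typically orders of magnitude smaller than the starting one, i.e. the model forgets the tails of the original distribution.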