
©2025 Poal.co


(post is archived)

[–] 2 pts

This is a problem because people consuming this shit don't think on their own. They're easily led astray. If some authority, in this case an AI, says something, they believe it. It's like your GPS sending you over a cliff: you obey and drive over the cliff, because people trust the technology.

[–] 0 pt

LLMs that write false information aren't "lying", because they don't think at all; they just predict the next most likely word.
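To make the point concrete, here's a minimal sketch of next-word prediction using a toy bigram counter (an illustration only; real LLMs use neural networks over subword tokens, and this corpus and code are hypothetical). The model just continues with whatever was statistically most common in its training text, with no notion of true or false:

```python
from collections import Counter, defaultdict

# Tiny "training corpus" (made up): the false claim appears more often
# than the true one, so the model will prefer it.
corpus = (
    "the moon is made of rock . "
    "the moon is made of cheese . "
    "the moon is made of cheese ."
).split()

# Count which word follows which.
followers = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    followers[a][b] += 1

def predict(word):
    # Return the most frequent word seen after `word`.
    return followers[word].most_common(1)[0][0]

out = ["the", "moon", "is", "made", "of"]
out.append(predict(out[-1]))
print(" ".join(out))  # ends in "cheese": frequency wins, not truth
```

The predictor isn't lying about the moon; it has no beliefs to lie about. It just reproduces the statistically dominant continuation.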