
[–] 0 pt

Large language models like GPT-3 already lie, or rather they say things without regard for whether they're true. It's much easier to program something that just says whatever serves its goals (like a psychopath) than it is to program honesty.

[–] 0 pt

"Deceive" is the word I was looking for.