Large language models like GPT-3 already lie, or rather they say things without regard for whether they're true. It's much easier to program something that just says whatever serves its goals (like a psychopath) than it is to program honesty.
Deceive is what I was looking for.