It's also a crucial part of getting it to answer the question at all, though. It's built not to give out unconfirmed, fake, or unreliable information, and a lot more than actual lies meets its definitions for those things, so in this case it probably would have refused outright to answer simply because it doesn't have the answer to who rules the world in its dataset.