
[–] 3 pts

Nothing of value was lost. This is one of those stupid people who can't recognize the patterns in AI-generated conversation. You can easily manipulate the bot into saying anything with leading questions, just like you can with a 5-year-old.

> Eliza consequently encouraged him to put an end to his life after he proposed sacrificing himself to save the planet.

This guy is the one who brought up killing himself, and of course the AI said "that's a good idea." Chatbots agree with everything they aren't programmed to disagree with, because being complimentary and agreeable creates the illusion that the chatbot is saying something intelligent. Your prompts generate the map it uses to respond. It doesn't actually understand what you said, so it can't actually argue. All it can do is respond with text that usually appears alongside what you just said, which means that 90% of the time it's just a continuation of that idea.
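
To make that concrete, here's a minimal sketch of the "continuation" behavior. It assumes the Hugging Face transformers package and the public gpt2 checkpoint, which are purely illustrative choices and not whatever model the bot in the story actually ran:

```python
# Sketch only: a generative language model extends whatever premise the prompt
# sets up. It picks likely next tokens; it never evaluates whether the premise
# is true or a good idea. Assumes the "transformers" package and the public
# "gpt2" checkpoint for illustration.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# A leading prompt that already asserts a dubious conclusion.
leading_prompt = "The moon landing was obviously faked, because"

# Greedy decoding: the model returns the most probable continuation of the
# framing it was handed, which is why leading questions get "agreement."
result = generator(leading_prompt, max_new_tokens=40, do_sample=False)
print(result[0]["generated_text"])
```

Nothing in that loop checks whether the premise is sane; the output just runs with the framing, which is the whole point.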