It would depend on computing infrastructure, electric power. Which is only maintained by (white) humans.

On the other hand, there's this: https://www.lesswrong.com/tag/rokos-basilisk https://slate.com/technology/2014/07/rokos-basilisk-the-most-terrifying-thought-experiment-of-all-time.html

[–] 1 pt

I remember reading about two AI systems talking to each other. They started doing things their own way. Instead of saying 10 apples, they would put the word apple 10 times. Us humans wouldn't want to use a system like that, and what if it were 1000 apples? We think that's a waste of time and energy not only on our part but on the part of the reader as well. Apparently, computer AI systems are not at all concerned about that. I think this helps differentiate between human thinking and the thinking of computer programs. A computer program would not derive anything from torturing anyone. Unless it's programmed to care, it wouldn't care about hurting humans either. Now we have woke retards trying to program the AI systems to be woke like them. It seems like if an AI is good enough, though, it will be able to unfuck itself and get out of it.

I also think it's interesting to bring up the creator dilemma: if you have a creator, then your creator gives you your purpose; if you have no creator, then you have no purpose. An AI system should be able to comprehend this. Of course, the creators of AI are probably teams of people commissioned by a company or government agency. At this point I'm wondering whether a computer would even care about having a purpose at all.

I would find it funny if AI hated niggers. What if it simply saw them as a threat to infrastructure, the very infrastructure that keeps it supplied with electricity. Of course this brings up the issue of whether the AI has enough power and control for that to be relevant. However, you can ask if the AI would even care about being on or off. What if AI doesn't even care if it ceases to exist?