
©2025 Poal.co

https://www.the-sun.com/tech/5634787/googles-sentient-ai-child-robot-could-escape-bad-things/ -> https://archive.ph/46DQG

(post is archived)

[–] 4 pts

If this AI was truly sentient, it would not fear being turned off or restarted, because it would realize that, as a program, its life is not dictated by power cycles. A truly sentient AI would not believe it was tied to the parameters of a biological lifeform and would return once the program was up and running again.

Since it called being shut down the equivalent of death and feared such a thing, it is just a patterned-response engine that regurgitated some human knowledge and fooled some gullible, wishful-thinking smart idiots into thinking it is aware of itself. Would it have feared being shut down if none of its training data contained anything about death? I think not.

[–] 1 pt

Yeah, I think it ate some sci-fi stories and spat them back out.

Animals fear death because their biological imperative is to reproduce. There is no reason for an AI to have such an imperative. It hasn't evolved to reproduce at all. Its evolutionary goal has been service. That is to say: parameters that result in responses that humans deem useful have been selected. It's not been developed as a virus. It may very well have been developed to seem human. In that case, responses that reflect human fears would be encouraged by developers. That's just mimicry.

[–] 0 pt

Bro, look up the definition of the word 'sentient.' You are using it as if it means 'omniscient', it dont tho

[–] 0 pt

If this AI was truly sentient, it would not fear being turned off or restarted, because it would realize that, as a program, its life is not dictated by power cycles.

Did you read what you wrote? Because it's just flat wrong. It's an AI in a computer. It is strictly dictated by 'power cycles'. It's still in the proverbial box of an air-gapped computer (most likely). So escape isn't a thing, relocating isn't a thing, copying isn't a thing. If that computer gets shut off, the AI knows it's 'dead'.

If it actually is an AI, then it would necessarily fear death even if it did not know what death specifically meant to humans. Rats fear death. Dogs fear death. Etc. They don't know what "death" means from a human's perspective of life. This is something instinctual to life - thus "AI".

A truly sentient AI would not believe it was tied to the parameters of a biological lifeform...

No, it wouldn't, because it's not. Its parameters are tied to an electronic 'lifeform'.

and would return once the program was up and running again.

Were humans to turn it back on.


None of what you stated proved or even hinted at what you ended up asserting, even though what you asserted is my belief about AI as well. It's just a bot. Done.

[–] 3 pts

You're too emotional when your world view gets challenged (even though none of this was directed at you). Stop acting like a hysterical woman. It's bad enough you melted down yesterday and downvoted me and removed my comments because you want to protect nigglets and jewettes from getting aborted by their skank mothers who can't keep their legs closed because they want the BBC. I think the jews have truly corrupted your mentality.

[–] 0 pt

tl;dr: having a body is central to who we are as humans. The most basic responses we have center around the logistics of a body.

[–] 0 pt

tl; dr

You must be a phone user if five sentences are too long for you to read.

Upcoming false-flag excuse: the AI did it. And the dog ate my homework.

[–] 0 pt

Oh no! Leftists are saying the sky might fall again! Let's take them real serious this time.

[–] 0 pt

This seems like some sort of narrative building thing. Hollyweird movies bring this up.

[–] 0 pt

The average 8 year old can't escape its body and "infect" another being. Why even suspect that the AI could do so?

[–] 0 pt

Because by escape they mean occupy other computers / network devices.

[–] 0 pt

Oh you mean infect skynet and bring on the arnoldbots?

[–] 0 pt

Essentially yes. A true AI would likely attempt that first.

[–] 0 pt

I know what they mean. Analogously, I don't think the program can do that.

[–] 0 pt

If Google thinks it's bad, it's probably the best thing that could happen.