This is a conversation my brother always brings up, and it pisses me off every time. Apparently there will come a time when an AI car will need to make a decision between killing a school bus full of kids or a group of grandmas or something. This has always been interesting to me, because I've driven for over a decade now and have never even come close to being forced into such a situation. How aggressively are these cars driving that they get forced into such situations? However, the conversation does something more sinister: it stack-ranks people based on bullshit parameters.

(post is archived)

[–] [deleted] 1 pt (edited)

What parameters wouldn't be bullshit in your opinion?

It's not that there will be a time when this happens, but that there could be, and how should the AI respond in that situation? They can't code for every possible scenario, so it can't be scripted; they'd need to set these parameters to determine a course of action. Obviously, avoiding deaths would be a primary parameter, so perhaps it would be based on how many lives are assumed to be in each vehicle, combined with the likelihood of survival.
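To make that concrete, here's a minimal sketch of what such a parameterization could look like, assuming a purely utilitarian rule: score each unavoidable option by expected fatalities (assumed number of people involved times the assumed probability each of them dies) and pick the lowest score. The `Option` class and every number in it are invented for illustration; none of this comes from any real autonomous-driving system.

```python
from dataclasses import dataclass

@dataclass
class Option:
    """One unavoidable course of action (hypothetical)."""
    name: str
    people: int        # assumed number of people in harm's way
    p_fatality: float  # assumed probability each of them dies

def expected_fatalities(opt: Option) -> float:
    # Expected deaths = people involved x chance each one dies
    return opt.people * opt.p_fatality

def choose(options: list[Option]) -> Option:
    # The "avoid deaths" parameter: minimize expected fatalities
    return min(options, key=expected_fatalities)

# Toy version of the scenario in the post; every figure is made up
options = [
    Option("veer toward the school bus", people=30, p_fatality=0.2),
    Option("veer toward the pedestrians", people=4, p_fatality=0.9),
]
print(choose(options).name)  # -> "veer toward the pedestrians"
```

Which is exactly what the original post is objecting to: the ranking falls straight out of whoever picked `people` and `p_fatality`.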

This idea was made famous in "I, Robot." A robot makes a choice between saving a grown man and a young girl. The robot chooses the man because he has a higher survivability factor. If the robot had gone after the girl, both the girl and the man would have died. A human might have instinctively saved the little girl, which seems like the moral thing to do. But if doing the moral thing results in two innocents dead instead of just one, is that really the right thing to do? Or does it just FEEL that way?
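The same logic in miniature for the "I, Robot" scene, with placeholder probabilities (the film quotes its own percentages; these are just stand-ins): the robot can commit to one rescue, so it compares expected survivors for each attempt.

```python
# Placeholder survival probabilities for each rescue attempt
p_save_man = 0.45   # stand-in, not necessarily the film's figure
p_save_girl = 0.11  # stand-in

# Expected survivors if the robot commits to one rescue only
ev_man = 1 * p_save_man    # 0.45
ev_girl = 1 * p_save_girl  # 0.11

# A purely utilitarian chooser goes after the man
print("save the man" if ev_man > ev_girl else "save the girl")
```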

[–] 1 pt

Lesson from I, Robot: never let a robot be put in the position to choose.

Lesson from Blade Runner: slavery is bad, even if they are robots.