
©2026 Poal.co

This is a conversation my brother always brings up, and it pisses me off every time. Apparently there will come a time when an AI car will need to decide between killing a school bus full of kids or a group of grandmas or something. This always struck me as odd, because I've driven for over a decade now and have never even come close to being forced into such a situation. And how aggressively are these cars driving that they end up in such situations? But the conversation does something more sinister: it stack-ranks people based on bullshit parameters.

(post is archived)

"Sinister" yeah, well you confront reality or you just ignore it and wait for it to bite you in the ass. He's asking whether you'd rather save people with their whole lives ahead of them or people who are at the very end of their lives.