MIT Moral Machine
Posted: Sat Aug 13, 2016 8:58 pm
by aero
Check this shit out:
http://moralmachine.mit.edu/
Basically you decide what a self-driving car would do in a scenario where it has to crash due to a brake failure. Post your scenarios and discuss self-driving ethics.
My results:

Re: MIT Moral Machine
Posted: Sat Aug 13, 2016 9:30 pm
by CynicHost
This is a cool concept, but I find it kinda stupid because the car has no idea who the people are, and it's always better to sacrifice yourself for others, no matter who they are.
Re: MIT Moral Machine
Posted: Sat Aug 13, 2016 9:37 pm
by aero
So you're saying that if you're in a car full of babies and would plow into a bunch of criminals by default, it's better to sacrifice yourself (along with all those babies)?
Re: MIT Moral Machine
Posted: Sat Aug 13, 2016 9:48 pm
by Yoshi021
This is interesting.
The only weird thing is that it tells me I favored fit people, but I remember choosing both groups in a balanced way.
Re: MIT Moral Machine
Posted: Wed Aug 17, 2016 5:38 pm
by YellowMe
Oh, I wasn't really expecting this:
The part about pets is really accurate though. I wonder why it spells humans like that?