MIT Moral Machine

Off-topic discussion.


aero
Palom
Posts: 4787
Joined: Fri Mar 28, 2014 2:51 pm

MIT Moral Machine

Postby aero » Sat Aug 13, 2016 8:58 pm

Check this shit out: http://moralmachine.mit.edu/
Basically, you decide what a self-driving car should do in a scenario where it has to crash due to a brake failure. Post your scenarios and discuss self-driving ethics.

My results:
Image
Last edited by aero on Sat Aug 13, 2016 9:39 pm, edited 1 time in total.

CynicHost
Level Reviewer
Posts: 455
Joined: Fri Dec 25, 2015 9:28 am
Flair: heck

Re: MIT Moral Machine

Postby CynicHost » Sat Aug 13, 2016 9:30 pm

This is a cool concept, but I find it kinda stupid because the car has no idea who the people are - and it's always better to sacrifice yourself for others, no matter who they are.

aero
Palom
Posts: 4787
Joined: Fri Mar 28, 2014 2:51 pm

Re: MIT Moral Machine

Postby aero » Sat Aug 13, 2016 9:37 pm

So you're saying that if you were in a car full of babies, and the car would plow into a bunch of criminals by default, it would be better to sacrifice yourself (along with all those babies)?

Yoshi021
Gold Yoshi Egg
Posts: 691
Joined: Thu Jan 21, 2016 9:06 pm
Flair: :)
Pronouns: He/Him

Re: MIT Moral Machine

Postby Yoshi021 » Sat Aug 13, 2016 9:48 pm

This is interesting
Spoiler:
Image
Image
Image
Image
The only weird thing is that it tells me I favored people who exercise, but I remember choosing both groups in a balanced way.

YellowMe
Tweeter
Posts: 147
Joined: Wed Aug 26, 2015 11:01 am

Re: MIT Moral Machine

Postby YellowMe » Wed Aug 17, 2016 5:38 pm

Oh, I wasn't really expecting this:
Spoiler:
Image
The part about pets is really accurate, though. I wonder why it spells "humans" like that.


