Moral Machine Lets You Decide Who Gets Killed by a Faulty Autonomous Car

With the onset of self-driving vehicles, a major concern is how they would respond to trolley problems. These are essentially moral questions: when a self-driving car full of elderly passengers has to decide whether to crash to avoid puppies in the crosswalk, for example, or whether it is acceptable to run over two criminals to save one doctor. Whose lives are worth more, those of senior citizens or seven-year-olds? MIT researchers have developed a game dubbed the Moral Machine that lets players make similar ethical calls. The primary goal of the game is to build a crowd-sourced picture of human opinion on how machines should decide when faced with moral dilemmas.

Participants in the game are asked 13 questions, each with just two options. In every scenario, a self-driving car with unexpected brake failure has to make a decision: continue moving ahead and run into whatever is in front of it, or swerve out of the way, smashing into whatever lies in its new path.

The Moral Machine puts all types of people into these scenarios, including children, the elderly, men, and women. There are executives, criminals, homeless people, and nondescript figures. In one question, participants are asked to choose between saving a pregnant woman in a car or a boy, a female doctor, a female executive, two female athletes, and a female physician. The scenarios raise more nuanced questions too: should a passenger who never complained about the speeding car be saved? Or should the car rely on airbags and other safety features in a crash instead of swerving into unprotected civilians?

The game also poses more basic questions, like whether an AI should be involved at all if intervening would save more lives, or whether it should stay passive rather than actively changing events in a way that makes it responsible for someone's death. Most people who play the game find the decisions tough even in these clear-cut situations; imagine how much tougher they would be for self-driving cars amid chaotic road conditions.

The trolley problem was first formulated in the late 1960s. The question: is it more just to pull a lever that sends a trolley down a different track, where it will kill one person, or to leave the trolley on its course, where it will kill five? It is an inherently moral problem, and slight variations can significantly change how people choose to answer.

At the end of the Moral Machine, test-takers are informed that their answers were part of a data-collection effort by scientists at MIT for research into autonomous machine ethics and society. However, people can opt out of submitting their data.

Featured Image Credit: TechCrunch
