MIT reveals who autonomous cars should kill: the cat, the elderly, or the baby?




This is a classic problem and a test of our morality: a runaway trolley is hurtling uncontrollably down a track, and you can pull a lever to divert it, choosing whether it kills five people or one. What would you do?

Extend that to cars: would we prefer an out-of-control vehicle to hit a retiree or a child?

Researchers at the Massachusetts Institute of Technology's Media Lab have used a spin-off of the classic trolley problem in an experiment called the Moral Machine, designed to test our views on these moral questions in light of the emergence of autonomous cars.

The Moral Machine has gathered more than 40 million moral decisions from millions of people across 233 countries and territories. These decisions were collected by gamifying autonomous-car crash scenarios, such as:

  • Should an autonomous vehicle "choose" to hit a human being or a pet?
  • More lives or fewer?
  • Women or men?
  • Young or old?
  • Law-abiding citizens or criminals?

In addition, the game asked whether the autonomous car should change course in the face of an imminent collision or stay on its current path.

See also: Ford's autonomous cars will become the new pizza delivery vehicles in Miami

In reality, human drivers can think about how they would react in such scenarios, but in practice they only have a split second to make those decisions. Autonomous cars, however, could in theory be programmed with a sort of moral spectrum governing how to make these choices.
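Neither MIT nor any carmaker publishes such code; the sketch below is a minimal, hypothetical Python illustration of one way a "moral spectrum" could be expressed as weighted preferences that score possible crash outcomes. The Outcome class, the WEIGHTS table, and the harm_score and choose functions are all invented for illustration, not taken from the study.

    # Hypothetical sketch only: illustrates how moral preferences could,
    # in principle, be encoded as weights that score crash outcomes.
    from dataclasses import dataclass

    @dataclass
    class Outcome:
        """One possible result of an unavoidable collision."""
        humans_harmed: int
        animals_harmed: int
        swerves: bool  # True if the car leaves its current course

    # Illustrative weights (higher = worse); the values are assumptions.
    WEIGHTS = {
        "human": 10.0,   # harming a human weighted far above an animal
        "animal": 1.0,
        "swerve": 0.1,   # small penalty for intervening vs. staying on course
    }

    def harm_score(outcome: Outcome) -> float:
        """Lower scores are preferred by this toy policy."""
        return (WEIGHTS["human"] * outcome.humans_harmed
                + WEIGHTS["animal"] * outcome.animals_harmed
                + WEIGHTS["swerve"] * (1.0 if outcome.swerves else 0.0))

    def choose(outcomes: list[Outcome]) -> Outcome:
        """Pick the outcome with the lowest weighted harm."""
        return min(outcomes, key=harm_score)

    if __name__ == "__main__":
        stay = Outcome(humans_harmed=2, animals_harmed=0, swerves=False)
        swerve = Outcome(humans_harmed=0, animals_harmed=1, swerves=True)
        print(choose([stay, swerve]))  # this toy policy spares the humans

In a sketch like this, the moral judgment lives entirely in the chosen weights rather than in the code structure, which is precisely the kind of choice the Moral Machine data is meant to inform.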

The Moral Machine did not present scenarios in isolation. Instead, the experiment mimicked what could be a real-world situation, such as a group of passers-by or a parent and child in the road.

TechRepublic: Our autonomous future: how driverless cars will be the first robots we learn to trust

In general, respondents worldwide agreed that priority should be given to preserving human lives over animals, that many people should be saved rather than few, and that the young should be saved over the old.

However, regional differences also emerged in these overall moral decisions. For example, people in "southern" cluster countries, including much of the African continent, showed a stronger preference for saving the young and women first, especially compared with those in "eastern" cluster countries, such as many Asian nations.

"The study basically seeks to understand the types of moral decisions that driverless cars could be made," says Edmond Awad, a postdoctoral fellow at MIT Media Lab and lead author of the newspaper "We do not yet know how they should do this. . "

CNET: Waymo explains what its autonomous cars should do when they are pulled over

The crowdsourcing project raises interesting questions about how moral decisions should be encoded in software. If autonomous cars become a common feature of our roads, we will need to tackle this problem one way or another.

Obstacles will appear and accidents will occur. It may be necessary to discuss the moral preferences of autonomous vehicles in public.

However, morality is flexible, and human decision-making will always differ from what a vehicle is capable of. Perhaps such decisions will ultimately come down to simple factors: human or animal, and the number of individuals a vehicle could potentially hit.

The research was published in the journal Nature.

