If it is up to us, the self-driving car will at least save children's lives




In an unavoidable accident, most drivers would choose to save as many lives as possible. But confront a driver with a harder choice – for example, whether to run over a woman or a man – and cultural differences start to play a part in the decision. Take French motorists: if they have to hit someone, they would rather hit an overweight man than a thin woman. That is what scientists at the American university MIT discovered.

The autonomous car raises many questions in law and ethics. After all, its task goes beyond simply watching the road. How should an autonomous car make a moral choice in an unavoidable accident? Should it hit the man in rags, the man in the expensive suit, or the wandering elderly person? An algorithm that weighs the value of someone's life – it sounds like dystopian science fiction becoming reality.

Ethical Cars

The ethical rules that artificially intelligent cars must follow in such situations still have to be worked out. To find out how people would act in an unavoidable accident, the researchers created the "Moral Machine": an online platform for studying moral preferences. Through it, people around the world made some 40 million choices across different scenarios. The research started in 2016 and the results were recently published in Nature.

In the experiment, participants were shown scenarios involving an autonomous car on a two-lane road. In each scenario, which endangered different combinations of pedestrians and passengers, the car could either stay on its original course or swerve into the other lane. Participants chose the course the car should take, and thereby whose lives it would save.

The trolley problem

The experiment is inspired by the "trolley problem", a well-known thought experiment from ethics, first described in 1967. It raises the question of whether there are circumstances in which one person may be killed to save several others.

The researchers discovered a number of widely shared moral preferences, such as saving the greatest number of lives, a preference for saving younger rather than older people, and a preference for saving people rather than animals. Other ethical decisions varied from culture to culture. Participants from Central and South America, but also from France, for example, had a strong preference for saving women rather than men, and athletic people rather than overweight people. Participants from countries with high income inequality more often chose to save individuals of high status.
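The most widely shared preference – save the greatest number of lives – amounts to a trivially simple decision rule. The sketch below is purely illustrative: the scenario representation and function name are invented here, and do not come from the Moral Machine or any real vehicle's software.

```python
# Illustrative toy model of the "save the most lives" preference.
# NOT the actual Moral Machine or any real vehicle's decision logic.

def choose_lane(stay_victims, swerve_victims):
    """Return 'stay' or 'swerve', minimizing the number of lives lost.

    Each argument is the list of people endangered by that course.
    Ties favor staying on the original course.
    """
    return "stay" if len(stay_victims) <= len(swerve_victims) else "swerve"

# Example scenario: two pedestrians ahead, one passenger at risk on a swerve.
print(choose_lane(["pedestrian", "pedestrian"], ["passenger"]))
```

A rule this simple ignores exactly what the study measured – age, status, species, culture – which is why the cross-cultural differences above matter for anyone who would have to program such a choice.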

Simplified image

According to infrastructure researcher Anne van der Veen, research like MIT's paints a distorted image of reality. He calls such scenarios misleading: "In almost every ethical dilemma involving autonomous cars, you have to choose between hitting one group of people or another. Either you hit the child, or you drive into the ravine. There are only two options. Real traffic situations, on the other hand, are often much more complicated. How you act in such a situation depends heavily on the circumstances," Van der Veen writes in an opinion piece for De Correspondent.

Technology philosopher Katleen Gabriels agrees: "The research outlines a situation that simplifies reality. In practice, the situation is never so clear-cut. You may, for example, have to deal with a burst tire or a slippery road surface. A situation on the road is never as simple as the research presupposes."

Self-learning algorithms

"Philosophers like to engage with such a problem, but I think programmers have little use for this research," says Gabriels. "Algorithms are not yet so advanced that we can simply program them to run over the elderly in an emergency; the research overestimates what algorithms can do and oversimplifies real situations. Research into the safety of autonomous cars is much more interesting – for example, into how self-driving cars can best avoid hitting crossing pedestrians."

Self-driving algorithms learn from accidents. "Autonomous cars use machine learning. It sounds grim, but the more accidents the self-learning algorithm encounters, the better it becomes."

Reading the newspaper undisturbed is not yet an option in the autonomous car: "By law, the person driving must always monitor traffic and intervene when necessary."

The researchers discovered a number of widely shared moral values, but there were also ethical decisions that differed from one culture to another.
