Scientists have identified a number of intractable moral dilemmas that could slow the market for self-driving cars. In particular: should an autonomous car sacrifice its passengers to save the lives of pedestrians?
According to Phys.org, a study by scientists from the Massachusetts Institute of Technology (MIT) describes a number of ambiguous scenarios for the behavior of autonomous vehicles on the road. The work was published in the journal Science. Self-driving systems are programmed to follow certain safety rules, and it is not difficult to foresee situations in which these rules conflict with one another. "Suppose the car must either hit a pedestrian or swerve off the road into an obstacle, thereby harming its passengers. What should it be instructed to do in that case?" the authors of the study ask.
According to a preliminary survey, people were willing for a car to swerve off the road in order to avoid hitting a crowd of pedestrians. However, respondents reacted extremely negatively to the prospect of a car carrying them behaving this way. "Most people want to live in a world in which cars minimize casualties. But at the same time, everyone wants his own car to protect its owner at any cost," said study co-author Iyad Rahwan, an associate professor at MIT. For example, 76% of those surveyed said it is better for a car to sacrifice the life of one passenger to save 10 pedestrians. But when a follow-up question stipulated that this passenger would be the respondent himself, the share of such "altruistic" answers immediately dropped by a third.
The scientists believe it is too early to talk about widespread adoption of self-driving cars, arguing that "currently there is no simple way to design algorithms that could reconcile moral values and personal self-interest." At the same time, the researchers acknowledge that opinion polling on the driving principles of autonomous vehicles is at an early stage, and its current results "will not necessarily hold in the future."