DENVER — Computer algorithms that control self-driving cars are already making life-and-death decisions for human beings — so say ethicists and technology experts interviewed by Business Insider.
For example, an autonomous vehicle would decide who lives and who dies if it swerved to avoid a pedestrian and, by doing so, put its on-board passengers in danger. Or it could keep its passengers safe by running over the pedestrian. The decision would be made by the vehicle's computers and sensors, and by extension, by the programmers who wrote its software.
“On one hand, the algorithms that control the car may have an explicit set of rules to make moral tradeoffs,” Iyad Rahwan, a scientist at MIT, told Business Insider. “On the other hand, the decision made by a car in the case of unavoidable harm may emerge from the interaction of various software components, none of which has explicit programming to handle moral tradeoffs.”
Rahwan added, “Every time the car makes a complex maneuver, it is implicitly making a trade-off in terms of risks to different parties.”
In 2014, Google X founder Sebastian Thrun said the company’s automated car would hit the smallest object in the road if it could not find a clear path.
“If it happens that there is a situation where the car couldn’t escape, it would go for the smaller thing,” he said.
Self-driving vehicles might already be cruising through your community, making life-and-death decisions, without you realizing it.
“The public has a right to know when a robot car is barreling down the street whether it’s prioritizing the life of the passenger, the driver, or the pedestrian, and what factors it takes into consideration,” Wayne Simpson of Consumer Watchdog told the National Highway Traffic Safety Administration (NHTSA) in November testimony. “If these questions are not answered in full light of day … corporations will program these cars to limit their own liability, not to conform with social mores, ethical customs, or the rule of law.”