The results are interesting, if predictable. In general, people are comfortable with the idea that self-driving vehicles should be programmed to minimize the death toll.
This utilitarian approach is certainly laudable but the participants were willing to go only so far. “[Participants] were not as confident that autonomous vehicles would be programmed that way in reality—and for a good reason: they actually wished others to cruise in utilitarian autonomous vehicles, more than they wanted to buy utilitarian autonomous vehicles themselves,” conclude Bonnefon and co.
And therein lies the paradox. People are in favor of cars that sacrifice the occupant to save other lives—as long as they don’t have to ride in one themselves.
via Why Self-Driving Cars Must Be Programmed to Kill | MIT Technology Review.
Online ride drivers make the very common mistake of acting as though the depreciation on their car costs them nothing because they have already bought it. In talking to drivers, I have found that most do not account for the true cost of operating their car (around 50 cents/mile, or about $20/hour) and instead look mostly at the cost of gasoline. They figure their income by taking what they are paid and deducting only immediate costs like fuel. This is an error, and it allows the companies to get drivers for less than the real cost. Robocars, of course, won’t have this issue.
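The accounting gap described above can be sketched with a quick calculation. The ~50 cents/mile all-in figure comes from the passage; the fare, trip length, gas price, and fuel economy below are illustrative assumptions.

```python
# Illustrative comparison: profit as a driver typically perceives it
# (fare minus fuel only) versus profit after the full operating cost.
FULL_COST_PER_MILE = 0.50    # all-in cost/mile cited above (depreciation, maintenance, insurance, fuel)
GAS_PRICE_PER_GALLON = 3.00  # assumed gas price
MPG = 30                     # assumed fuel economy

def perceived_profit(fare, miles):
    """Profit as many drivers figure it: fare minus fuel cost only."""
    fuel_cost = miles / MPG * GAS_PRICE_PER_GALLON
    return fare - fuel_cost

def true_profit(fare, miles):
    """Profit after the full cost of operating the car."""
    return fare - miles * FULL_COST_PER_MILE

fare, miles = 15.00, 20
print(perceived_profit(fare, miles))  # fuel-only view: looks like $13.00
print(true_profit(fare, miles))       # all-in view: only $5.00
```

On these assumed numbers, a $15 fare for a 20-mile trip looks like a $13 gain to a driver counting only gas, but nets $5 once the full cost of the car is charged against it.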
Attacks on the online ride industry will continue. The official reason for slowing it down will be a reasonable-sounding one, such as safety.
As a postscript, I should note that Uber has also gotten into deserved trouble because of very bad privacy practices and abuses, and disturbing attitudes by management toward the press, their opponents, and even women. These issues are unrelated to the real question, and Uber deserves the trouble for them, though its enemies will seize on its real mistakes in other fields to fight their battle.
via Uber's legal battles and robocars | Brad Ideas.
The battle against inertia.
I think one of the best ethical framings for the driverless car is the Trolley Problem. In this version, in place of the trolley is a school bus loaded with kids, and you are behind the wheel of a car on a narrow road, about to crash. It is a no-win situation: either you pull off the road and sacrifice yourself, or you hit the bus, killing the kids and yourself. If you are in an autonomous car, will the vehicle be able to make the decision to save dozens of young lives at the cost of yours? Probably not.
So, because of the decision made by the robot car, the kids are dead but you are saved. Flip that around: if the car kills you to save the kids, without your consent, is the automaker at fault? Who gets blamed in this unlikely scenario? If the cars are autonomous, are the people merely riding in them liable at all?
via Gas 2.