I think one of the best ethical problems in which to place the driverless car is the Trolley Problem. In this version, in place of the trolley there is a school bus loaded with kids; you are behind the wheel of a car on a narrow road, and you are going to crash. It is a no-win situation: either you pull off the road and sacrifice yourself, or you hit the bus, killing the kids and yourself. If you are in an autonomous car, will the vehicle be able to make the decision to save dozens of young lives at the cost of your own? Probably not.
So, because of the decision made by the robot car, the kids are dead but you are saved. Flip that around: if the car kills you to save the kids, without your consent, is the automaker at fault? Who gets blamed in this unlikely scenario? By choosing to ride in autonomous cars, are the passengers themselves liable?