Given human nature, I still think society at large will reject self-driving cars if they fail in ways a human never (or rarely) would, even if they are safer overall. That is, if a self-driving car has, on average, fewer accidents than a human driver, but every 100 million miles or so it decides to randomly drive into a wall, I don't think people will accept it.
Obviously this is a gray area (after all, humans sometimes decide to randomly drive into walls), but cars will need to be pretty far on "the right side of the gray" before they are accepted.