Autonomous vehicles would be an amazing thing, but like you said, we don't have autonomous pedestrians, cyclists or roads to suit them. Until everyone by mutual consent agrees to trash the concept of liability - a concept that probably predates fire, and a change that will likely never happen - autonomous vehicles will never come into mass usage in an uncontrolled environment. There will be accidents involving autonomous vehicles leading to death or injury - no doubt orders of magnitude fewer than with regular vehicles, but at some point a car's computer will face a 'trolley problem' decision. If a Mercedes-built car causes a death, lawyers acting for the victim's family could show logically that the behaviour causing his or her death was predetermined by the manufacturer. For this reason, it wouldn't surprise me if, without huge legislative changes, the directors of Mercedes could be found criminally liable regardless of any contract between the vehicle's operator and the manufacturer. Blame is in some ways a compensation afforded to victims by society, one that helps salve their grief and society's guilt.
The only alternative is legislation (agreed by all countries) that forces operators of autonomous vehicles to accept legal responsibility for the actions of their cars' AI. If the question of criminal responsibility were navigated successfully, insurance companies would bear the burden of compensation for injury or death and, as is the case at the moment, would be forced to contribute to a pool covering payouts to victims of uninsured operators (though that system would have to be vastly improved).
However, will we ever really be morally comfortable with abandoning the idea of blame, and accepting - emotionally, not just logically - that the roads are a lottery; that at any time, you or someone you care about could have their life snuffed out by a cold, unthinking machine? Could awards of money ever fully substitute for the feeling of vindication that comes from apportioning blame to another human being?
People are killed by faults in mechanical devices every day, but those devices don't make decisions. The public might see deaths from AI as akin to a death squad of robots randomly executing people on their doorsteps every few days - at least, that's how the media could frame it.
Most readers here would probably accept the logic that, statistically, a vastly reduced overall road death rate would be a price worth paying for this kind of lottery. But picture a situation where a young pregnant single mother (a nurse) and her little girl are killed by an AI on their way to school, leaving her other three children orphaned. Can you imagine how the media would frame it? How public opinion could change overnight? I'm pretty convinced that without a completely controlled environment to operate in, autonomous vehicles would be dead in the water after a single incident like this, no matter how much effort had gone into dealing with the liability issues. People would stop buying them, and companies would stop selling them.
How much would it cost to create controlled environments for the exclusive use of automated vehicles (fenced roads with controlled junctions for crossings, etc.)? In urban areas we don't have the space to give many roads over exclusively without a huge uptake in automated vehicles (before which public opinion might turn against them in any case). Maybe it makes sense for freight transport, but then is it more cost-effective to upgrade and improve autonomy in rail systems instead?
The automotive industry should probably concentrate on automated, enhanced safety systems long before thinking about fully self-driving cars.