Personally, I don't think fully autonomous vehicles will ever cope with crowded city driving in the UK. They might have a chance in a city with a grid layout, or if every road had helpful beacons every few metres, but I'm not confident. When a bus breaks down during rush hour and you have to risk poking the nose of your car out into the opposite lane so you can see oncoming traffic, what would an AI make of that? Do I want an algorithm taking that risk with my life?
If autonomous vehicles reduced deaths on the road by 90%, the remaining 10% still have families who know that a computer killed their loved ones. Can they jail the algorithm for dangerous driving? I don't trust this government, but they might be right that 'smart' motorways are safer. It doesn't matter, though: every accident attributed to the lack of a hard shoulder is heavily publicised, and the court of public opinion has ruled them dangerous and unwanted. Every minor incident with a self-driving car will be front-page news.
I love technology, but I don't want a self-driving car. I like driving, and I can't put my life and my passengers' lives in the hands of a computer (even Notepad crashes sometimes). I don't even want an automatic gearbox, like the majority of people in the UK (around 60% of UK cars are manual, versus about 3% in the US). I'll be upset when there are no longer any manual-gearbox cars for sale, but I'll probably suck it up. I'll never buy, or ride in, a car without a steering wheel. I do see that a driverless car would be life-changing for disabled or infirm people, though.
They say it won't be long until a hacker actually kills someone. That seems a lot more likely once we're sitting in computers moving at 70 mph.