
My self-driving cars may lead to human driver ban, says Tesla's Musk

DrXym

Re: Not a problem solved

"That's a bold claim."

No it isn't.

"It's far more likely to do it reliably and regularly than a human. It will take statistically proven decisions, and it will do that from a much broader array of sensing inputs than a human could."

The problem is that the things you encounter during a drive are far from regular.

"I struggle with seeing how people who work in computing could see this as unsolvable. "

It's called experience. See aforementioned voice recognition. Or OCR. Or AI. Or robotics. All began with lofty claims, and then turning the analog world into something a computer understands turned out to be damned hard.

"It's simply an engineering problem - the right inputs processed at the right time, matched against a statistically driven decision tree. How is any of that impossible?"

Not one problem, an infinite set of problems, many of which are intractable.
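To put it another way, here's a deliberately crude sketch, entirely my own naming and nothing to do with any real driving stack, of what a "statistically driven decision tree" boils down to: it only covers the situations somebody thought to enumerate in advance, and everything else falls through to nagging the driver.

```python
# Hypothetical, deliberately naive "decision tree" driving policy.
# Every branch is a situation an engineer thought of in advance;
# anything else lands in the catch-all.

from enum import Enum, auto

class Action(Enum):
    PROCEED = auto()
    STOP = auto()
    ASK_HUMAN = auto()   # i.e. nag the driver

def decide(scene: dict) -> Action:
    # "scene" is whatever the perception layer claims to have recognised
    if scene.get("traffic_light") == "red":
        return Action.STOP
    if scene.get("traffic_light") == "green" and not scene.get("obstacle_ahead"):
        return Action.PROCEED
    if scene.get("pedestrian_in_road"):
        return Action.STOP
    # Lights out at a crossroads? A policeman waving you on? A loony waving
    # you on? A bus that may or may not be at a stop? None of it matches a
    # branch, so:
    return Action.ASK_HUMAN

print(decide({"traffic_light": "green"}))        # Action.PROCEED
print(decide({"man_directing_traffic": True}))   # Action.ASK_HUMAN
```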

Here are some trivial problems your hypothetical self-drive car would encounter:

- The lights are out at the crossroads ahead. Does your car know how to negotiate the crossroads in a safe way which gives priority to other drivers according to the time they arrived and the prevailing traffic? Can it establish basic signals to other drivers to indicate intent? Or does it just nudge out like an asshole and hope for the best? Or does it annoy the driver by giving up? How does it know when to give up? Naturally it would have to do the right thing however many lanes, rights of way, trucks, buses, bicycles, motorbikes and cars (self-drive and otherwise) there were.

- A man is standing in the road by the traffic lights. A policeman. How does your car know to obey his signals instead of the traffic lights?

- A man is standing in the road by the traffic lights directing traffic. This man is a loony. How does your car know NOT to obey his signals instead of the lights?

- A big truck ahead is stopped and a guy hops out to halt traffic each way so the truck can reverse into some entrance. How far away does your car stop from this? How does it know not to try and overtake this obstacle?

- Your car encounters a stationary bus in your lane. Is the bus broken down? Is the bus stopped at a bus stop or stopped at lights? If it's stopped at a bus stop, how long is it likely to be there picking up passengers? When, if ever, is it safe to pull into the oncoming lane to overtake this obstacle?

- The road has a big pothole in it. Can your car see this? Can it see it when it's filled with water? Or does it just smash straight through it?

- A road is closed and there is a diversion in place. Does your car follow the signs or just keep driving until it falls into a hole the council just dug?

- You're going up a country lane. 50m ahead you see an oncoming car. Does your car know it has to pull into the verge NOW because there is no verge ahead?

- Your car goes into a place with terrible radio coverage, or no GPS: a tunnel, an underground car park or simply a built-up area. What does it do? Dead reckoning (a rough sketch follows this list)? Revert to the driver? What?
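For what it's worth, dead reckoning itself is the easy half of that last bullet. Here's a rough sketch (all names and numbers are mine, purely illustrative): integrate speed and heading while the GPS is gone. The hard part is that the error accumulates every second until you get a fix again, which is exactly when you'd rather the car knew where the tunnel walls were.

```python
import math

# Illustrative dead-reckoning sketch (not any vendor's code): integrate speed
# and heading from wheel/IMU samples while GPS is absent. Small sensor errors
# accumulate on every step with nothing to correct them.

def dead_reckon(x, y, heading_rad, steps):
    """steps: iterable of (speed_m_s, yaw_rate_rad_s, dt_s) samples."""
    for speed, yaw_rate, dt in steps:
        heading_rad += yaw_rate * dt
        x += speed * math.cos(heading_rad) * dt
        y += speed * math.sin(heading_rad) * dt
    return x, y, heading_rad

# Example: 60 seconds in a tunnel at ~14 m/s with a tiny 0.001 rad/s gyro bias.
samples = [(14.0, 0.001, 1.0) for _ in range(60)]
print(dead_reckon(0.0, 0.0, 0.0, samples))
```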

I could go on, but the point is there are too many variables, particularly in urban and country environments, for it to possibly do the right thing all of the time. If it's constantly nagging the driver to intervene because it doesn't know what to do, then it will become annoying and useless. I expect that even when it does appear in closed-loop environments, there will still be some guy in a booth there to remotely extricate the car if it gets confused or confounded by something.
