so can i drink and get driven home in this?
Google's self-driving car snags first-ever license in Nevada
The Nevada Department of Motor Vehicles has issued the first license plates that will allow Google's autonomous cars onto public highways. Nevada is the first state to devise licensing procedures for autonomous vehicles, and Google is one of the leaders in that field, having hired some of the top talent that took part in …
-
-
Tuesday 8th May 2012 20:30 GMT bazza
"so can i drink and get driven home in this?"
No. The person sitting in the driver's seat will still be responsible for the actions of the car. Basically it's a way for Google to say that the software is in beta, and any bugs, and the consequences arising from them, are not its fault. So if the car has an accident because of a bug while you're pissed or on the phone, it's your fault.
Personally I think it is madness. What's the point of all that tech if it can't drive you home from the boozer? And would you really trust Google's software with your life?
-
Tuesday 8th May 2012 21:14 GMT Anonymous Coward
Well
1) It'll get better as it goes on
2) It gives you time to do things that aren't driving. Like using the internet from your in-car display and looking things up on--oh, what's that website again?
The more I think about this the more it feels like Android in your car: a way to give you more time in your life to spend using the internet.
-
-
Wednesday 9th May 2012 12:54 GMT Asgard
@"I can see a future where ..."
@"I can see a future where manually driving a vehicle will be illegal."
I think you are right long term, but that could be many decades from now.
But even in the shorter term, with cars driving themselves, it'll make Johnny Cabs from the film Total Recall entirely possible. Some airports and city projects are already experimenting with pod-like cars similar to Johnny Cabs.
But longer term that could even make the need to own a car obsolete for most people. It could also mean the end of buses and coaches and even trains.
Plus lorries could drive themselves, so almost all goods could be shipped around the country entirely by machines, and even deliveries to homes could in time become automated, with postal robots working off the backs of delivery vehicles operating out of automated regional delivery centers.
There's also a lot of good that could be achieved with network-wide traffic flow management, where robot vehicles cooperate with each other to ease traffic jams and keep traffic moving. Road traffic made up entirely of robot-controlled vehicles would also, for example, allow emergency vehicles like ambulances and fire engines to order all cars ahead of them to make way, so the emergency vehicles could get through much faster, which would save lives.
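The emergency-vehicle idea above can be sketched in a few lines. This is purely an illustrative toy: the `Car` model, the `clear_path` message-passing step and the 500 m horizon are all invented here, and no real vehicle-to-vehicle protocol is implied.

```python
# Hypothetical sketch: networked cars clearing a lane for an ambulance.
from dataclasses import dataclass

@dataclass
class Car:
    car_id: str
    lane: int
    position_m: float   # distance along the road

def clear_path(cars, emergency_lane, emergency_pos_m, horizon_m=500.0):
    """Ask every car within `horizon_m` ahead of the emergency vehicle,
    in its lane, to move over; return the IDs of cars that yielded."""
    yielded = []
    for car in cars:
        ahead = 0.0 <= car.position_m - emergency_pos_m <= horizon_m
        if car.lane == emergency_lane and ahead:
            car.lane += 1          # pull over to the adjacent lane
            yielded.append(car.car_id)
    return yielded

cars = [Car("a", 1, 120.0), Car("b", 1, 900.0), Car("c", 2, 150.0)]
print(clear_path(cars, emergency_lane=1, emergency_pos_m=0.0))  # ['a']
```

Only the car 120 m ahead in the ambulance's lane yields; the car 900 m away is outside the horizon and the car in the other lane is unaffected.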
-
This post has been deleted by its author
-
-
-
-
Tuesday 8th May 2012 19:04 GMT Cliff
On the one hand...but on the other...
Planes have been fly-by-wire for a long time, the pilot hints to the onboard computer what he wants it to do, it polls its friends, and if they all agree, they do what the pilot asked.
On the other hand FUUUUUUCCCKKK! NOOOOOOO!! How terrifying is that to let loose? Aviation works because the rules are strictly followed (different flight levels for flying N-S than S-N, for instance), and those rules were built up over decades of accidents, improving each time; and because the planes cost millions in avionics and robust Ada.
Google, gorblessit, is hell-bent on everything being a 'cloud service' with always on-ness to try to simulate a clunky spreadsheet in a browser. They tell us this is the only way, that it is the future. Want a car that polls the server before braking? OK, of course it will have to be autonomous, but seeing as how many of Google's offerings either publicly carried the 'Beta' tag for years or crashed and burned, the corporate culture doesn't seem one I would want to trust with anything life-threatening.
-
-
-
Tuesday 8th May 2012 20:02 GMT Dave 126
Re: On the one hand...but on the other...
>With how many human interventions during those 140000 miles?
I don't know, but I imagine that there were no serious injuries, else we would have read about them here... unless the car was intelligent enough to reverse back over the victim and then mercilessly pursue any witnesses, a la Duel (1971).
But seriously, Google must be pretty confident of it not hurting anyone, since that doesn't make for good press.
I can see the biggest issue with automated cars being the effect on the part-time human drivers... like finding yourself driving on the right two days after getting back to Blighty from the continent, people might daydream and think that the car is in control.
-
Tuesday 8th May 2012 20:17 GMT Anonymous Coward
@Dave 126
You misunderstood me, I said interventions, not injuries.
As the article says the cars were always operating with a human driver behind the wheel. I wonder how many overrides and corrections the human driver had to make to avoid accidents.
Also where was the car driven? I can't really see a computer driven car interpreting UK roads built around the 60s. Hell I can't even believe it will manage a busy roundabout.
-
Tuesday 8th May 2012 22:15 GMT JDX
Re: @Dave 126
I would expect that the number of times a human has to suddenly jump in to ensure safety is VERY low, otherwise it would not be deemed safe for use. I'd love to see information too, but I think we're all underestimating how smart the car is in the same way we stereotype computer speech recognition as being the way it was in the 90s.
I would not be too surprised if this is Google's 2nd "big product" after Google Search in terms of revenue.
-
Wednesday 9th May 2012 01:37 GMT Ole Juul
Interventions
Also where was the car driven? I can't really see a computer driven car interpreting UK roads built around the 60s. Hell I can't even believe it will manage a busy roundabout.
I suspect these are city cars for local use only. Around here, and many places outside of cities, you would have to trust that the car knows when to cross the yellow line because of rocks in slide areas. It would also need to make decisions based on how steep the cliff beside the road is and how close you can reasonably get. I'm not saying a computer can't learn to second guess wide load logging trucks travelling at high speed and taking up the whole road, but the driver/steering-wheel combination probably isn't going to disappear any time soon.
-
Wednesday 9th May 2012 06:20 GMT Shakje
Re: @Dave 126
At the end of the day, what do statistics on interventions actually matter? It's presumably something that can be improved, but if they cause significantly fewer accidents than human drivers (as I would expect to be the case) then it's got to be a good thing. I suspect that the biggest barrier to these is overselling it as a safe car, because as soon as it has its first "big" accident, which is inevitable, people will go overboard to criticise it.
-
-
Wednesday 9th May 2012 10:27 GMT James Micallef
Re: @Dave 126
Emotions don't give a damn about statistics. If the number of people run over by computer-driven cars (per car-mile driven) is less than 10% of the number of people run over by human-driven cars, there will still be an outcry, because people being run over by human-driven cars is a current, known issue, while being run over by a computer-driven car is something new.
One of humanity's deepest emotional responses is (unfortunately) anger followed by a desire for revenge (in modern society thinly disguised as 'justice'). If I run someone over, or I crash and my passenger dies, I can go to jail for manslaughter. Who goes to jail when a Google car kills someone? The programmer? They don't send my driving instructor to jail if I screw up (granted, it's not an exact analogy). Also, there are whole teams of programmers on this thing. What about the project managers? Top executives at Google? (Yeah, right!)
I think the only way it would be solved is if the owner has ultimate responsibility (same as I would be legally responsible if my hypothetical dog hypothetically attacks someone). But then again, what if the 'owner' of the vehicle is a cab company or a trucking company??
This is ultimately not just about cars: what are the legal implications of (semi-)autonomous, independent actors that behave in a conscious manner but do not possess what we would define as consciousness?
-
Wednesday 9th May 2012 11:44 GMT Aeternum
Re: @Dave 126
I think your first comment is actually quite prophetic.
Last year, there were around 2,500 road deaths in the UK.
Let's say that driverless cars are introduced, and they manage to reduce the road toll to 500 per year. But the occasional bug kills that kid you were talking about. Well, we've saved 2,000 people in the year, but the Daily Mail is still going to run a headline of "Road killing machine slaughters child".
-
Wednesday 9th May 2012 12:18 GMT Paul Bruneau
Re: @Dave 126
> "It can be improved", try telling that to a couple who've just lost one of their kids after being run over by a driverless car.
> "It's okay, we found the bug and have installed service pack 1".
As opposed to the zero parents who have been told "we're sorry, but your child was killed in an auto accident" today?
-
-
-
-
-
Wednesday 9th May 2012 00:16 GMT Thorne
Re: On the one hand...but on the other...
What this article fails to mention is that the car has already clocked up 140,000 miles with no accidents in California...
Actually there was one accident, but that was a human driver driving up the rear end of the Google car.
The vehicles see 360 degrees around them, see in the dark, never get sleepy, never get distracted, never speed, never run red lights and have perfect reflexes.
Yes, someone might step out in front of a Google car and get run over, but human drivers run over stupid pedestrians all the time. Self-driving vehicles will just run over fewer.
The trucking industry will be the first to replace human drivers. Taxis will be second.
It's not a case of if but when. The only real problem will be security. You don't want a couple of tons of metal travelling at 100 km/h infected with the Carmageddon virus.
Perfect for hitmen. Reprogram the computer to drive off a cliff. Just before impact change the log to state brake failure and reset the firmware. One accidental death.
-
-
Tuesday 8th May 2012 19:41 GMT Dave 126
Re: On the one hand...but on the other...
Would you be happy with it if it were shown, after umpteen thousand hours of testing, to be safer than a human driver?
A human driver might suddenly fall ill, be distracted by their domestic issues, be drunk, be an asshole, not be very good at driving in the first place, drives a grey car in fog and doesn't turn their lights on, falls asleep, is trying to impress his mates, sees an attractive shop window, sees an attractive member of the opposite sex, is wearing high heels, be texting their mates, be listening to a Hendrix guitar solo, gets a fly in their eye, sneezes, drives in the middle of the motorway while not actively overtaking somebody, drops a fag on their lap...
-
Wednesday 9th May 2012 07:20 GMT Giles Jones
Re: On the one hand...but on the other...
Humans have intelligence, and driving requires it at times. I imagine the car is using some sort of route planning, so what about inaccurate route data? Suddenly you're going down a one-way street the wrong way because the direction has been changed. A human would realise their mistake; a self-driving car isn't going to know.
People don't give the human body enough credit, it's far more sophisticated and reliable than a computer at doing many things. Computers are great at processing numbers but they are not good at making decisions where there isn't a very clear answer.
It won't be long before this car encounters a problem it hasn't been programmed to deal with and they'll wonder why it was granted a licence.
-
-
Wednesday 9th May 2012 15:39 GMT MrZoolook
Re: On the one hand...but on the other...
Having not followed the link (those kinds of GPS fail stories are ten a penny nowadays) I'm not sure what the intent of your post was. You seem to have posted it with the intention of promoting autonomous driving over human driving. You actually ended up doing the opposite. The GPS was at fault (either through poor software or incorrect street data) and the driver was at fault for presuming the GPS was infallible.
If the driver had not been relying on a GPS that was in fact faulty, he would have seen he was driving into (insert choice of pending disaster here) and taken corrective action. In a fully driverless car, the on-board systems would inherently be programmed to assume all systems were functioning completely correctly (else it wouldn't be advertised as completely driverless). This would take away the extra layer of awareness that a human driver gives.
Computer systems fail, as do humans, but humans have an ability to realise they are failing. A computer system of this kind would have to be programmed to assume it is infallible when it is not. And THAT is why it's a bad idea.
-
-
-
Wednesday 9th May 2012 07:56 GMT umacf24
Re: On the one hand...but on the other...
That is a truly excellent list, and they all apply to me (though not necessarily while I'm actually driving, and the heels were a LONG time ago), and I have a licence to drive a car completely unsupervised by a computer or anything else. Really, if that doesn't terrify you, it should.
The way to look at this is to imagine how we will feel in fifty years time about the selfish fucks who endanger everyone by driving their cars in manual on the PUBLIC HIGHWAY FFS! And if you think that an automatic system can't really drive, you've obviously never completed a routine journey and found that you've got no memory at all of the last hour...
Personally I can't wait. If I'm travelling, I want to read or sleep, not jiggle knobs and levers.
-
-
Tuesday 8th May 2012 21:52 GMT bazza
Re: On the one hand...but on the other...
@Cliff
To extend your comparison (if I may) with autopilots in planes, another key difference lies in what the two systems actually do.
Aircraft autopilots are performing relatively simple flight dynamics calculations to keep the plane in the sky. They're processing data from reliable sensors (air speed, altitude, attitude, GPS, etc). They operate in an environment where there's nothing to hit except other aircraft, which are easily spotted (radar systems / transponders work really well), and those aircraft are similarly controlled, either by an autopilot or by a pilot following the rules of the sky. Missing the ground is generally straightforward provided everything is working. They're not artificial intelligence systems; their rules are actually quite straightforward. In short, the problem is well specified, bounded and comparatively easy to test.
This system of Google's has to make complex decisions based on data that is not wholly reliable (image recognition systems never are) in a complex and highly varied environment. How, for example, does it cope with rain, fog, a patch of smoke, debris in the road, etc? It is a classic artificial intelligence situation in that the rules are not tightly defined. In short the problem is not well specified, is poorly bounded and is a nightmare to exhaustively test. Not a good place to be if one is contemplating offering this to the general public…
Your point about Google's corporate culture is valid. They've not got a tradition of developing safety critical systems. I very much doubt that there's anything about the development of this system that an avionics engineer would recognise as being appropriate and sufficient given the scale of the problem. Personally I'd prefer to spend my driving time looking after what the car's doing myself. I'd rather not spend the time wondering if and when another of Google's bugs will occur.
-
Wednesday 9th May 2012 07:26 GMT Giles Jones
Re: On the one hand...but on the other...
Fly-by-wire is dead simple, as you're flying through air to a nice curvy line. All a fly-by-wire system does is look at where the plane is relative to the plan and make little adjustments to move it back to where it should be.
No people in the way, planes are kept very far apart.
Perhaps if driverless cars were controlled by a driving control centre and pedestrians were banned from crossing the road then it would be a safe system.
-
-
Tuesday 8th May 2012 19:35 GMT Cucumber C Face
Only in Nevada
You must hand it to them - they walk their libertarianism like they talk it.
They have decriminalised marijuana, legalised prostitution and gambling. It's also home to gun ownership laws which stop just shy of allowing juveniles to pack tactical nuclear weapons.
Now driverless cars - excellent!
It's also one of the States with the lowest population density. Perhaps it's just as well.
Still it's high on my list for a fun holiday. See you at the Burning Man maybe.
-
Wednesday 9th May 2012 02:23 GMT Brad Ackerman
Re: Only in Nevada
Nevada's gun laws are about as restrictive as you can get while still being a shall-issue state. Since there are only ten states that aren't shall-issue (plus DC, VI, PR, GU, AS, MP), that's pretty far down on the liberal-gun-laws list.
Black helicopters, because there's got to be some state that will let me own an attack helicopter.
-
Tuesday 8th May 2012 19:39 GMT Anonymous Coward
See, this is what I like about Google
It's not lost sight of its main role, but it's spread itself out: core tasks like search, maps and mail, extensions of that like Chrome and Android, and way-out-there stuff like self-driving cars and solar panels. It treats itself like a university research department, not like a company out to make the maximum profit right now. It really does care about R&D. I really do hope it manages to turn that into amazing, unexpected businesses in future rather than having this stall as an experiment that never quite came off. God knows some drivers right now could be replaced by computers and we'd be better off for it.
-
-
Wednesday 9th May 2012 00:23 GMT Thorne
Re: See, this is what I like about Google
I agree completely, as long as you don't have to create a Google+ account before it lets you drive it.
Forget the google car, it's the Apple car and M$ car you need to worry about. The Apple car only drives you to Apple approved destinations and the M$ car turns blue before crashing. If you see a blue car... RUN!
-
-
-
Wednesday 9th May 2012 11:18 GMT GrahamT
Re: See, this is what I like about Google
The Microsoft car only works if at least one window is open, but slows down if you open too many.
To stop it, you press on the start button. Every so often on a journey the car will become unstable, so you have to stop the car and restart it, if closing and re-opening the windows doesn't work.
If it goes crazy and can't be controlled, you have to press three widely spaced buttons simultaneously to be presented with a dialogue box asking if you want to stop, restart, monitor the performance or lock the doors.
-
-
-
Wednesday 9th May 2012 15:51 GMT MrZoolook
Re: See, this is what I like about Google
"The Apple car only drives you to Apple approved destinations"
I bet the on board internet won't let you watch flash video!
"and the M$ car turns blue before crashing."
But it does come with updates that will ask you to pull over in the middle of the freeway to re-boot.
-
Friday 11th May 2012 00:03 GMT Thorne
Re: See, this is what I like about Google
"But it does come with updates that will ask you to pull over in the middle of the freeway to re-boot."
What do you mean pull over? It'll do it in the middle of the freeway and won't move until it's downloaded three gigs of updates and any attempt to abort it or move the vehicle will result in the vehicle never starting again.
-
-
-
-
Tuesday 8th May 2012 20:09 GMT Anonymous Coward
Re: See, this is what I like about Google
> It really does care about R&D.
Only to maximise profits, duh. Obviously they hope these self driving cars will replace their fleet of StreetView cars and drivers. Drivers that don't need breaks or sleep...
Their existing Streetview cars already have most of the necessary - and expensive - sensors anyway.
-
Tuesday 8th May 2012 22:18 GMT JDX
Re: See, this is what I like about Google
Of course they want to make profit, they'd be a useless company if they didn't. That takes nothing away from the fact that spending their vast profits on many areas of R&D is a good move... historically a lot of our science was done by the rich.
Playing the "they're only doing it to make money" card just shows a total lack of understanding of the world...
-
Wednesday 9th May 2012 11:45 GMT Aeternum
Re: See, this is what I like about Google
In a sense you're right, but compared to other companies I think they can be far more ethical.
For example, long before data centre power usage was a concern being printed in the mainstream press, Google was forging ahead trying to run their data centres off renewables. I think they also do a fair bit for charity.
-
-
-
-
-
-
Wednesday 9th May 2012 08:49 GMT Michael H.F. Wilkinson
Speed limit on Autobahn? Only in some places. On most stretches they allow you to hit Mach 2. I once saw a video of a Renault Espace fitted with an F1 engine hitting 200 mph on a track. Perfectly OK for the German Autobahn.
The funny thing is that most Germans support a 130 km/h speed limit, but the car industry lobbies very successfully against it.
-
-
-
-
Tuesday 8th May 2012 19:53 GMT Dave 126
Computer driver, human navigator...
I once asked Google Maps to plot a route from Birmingham to Truro... which it did fine, except for including a return trip to Merthyr Tydfil before finding its way back to the M5. 'Never ascribe to malice what could be attributed to incompetence', so I won't suggest that the operators of the Severn Bridge and the Welsh Development Agency hold advertising accounts with the Chocolate Factory.
(Sorry to any readers unfamiliar with the South West of the UK mainland... it's the equivalent of Q>T>B>T>P when you wanted Q>P on your keyboard)
-
-
-
Wednesday 9th May 2012 16:17 GMT MrZoolook
Re: Why?
This is an interesting question. Hypothetically, if I buy a driverless car and instruct it to travel from where I am just inside the state line to somewhere else just inside the state line. The car comes across an obstacle (broken bridge or whatever), and decides to plot an alternative route. The only available routes will take it just over the state line. From here, two scenarios could happen.
1. The car realises it cannot continue the journey, and will either just stall where it is (I will give it credit enough to pull over to a safe parking position if there is one) and somehow inform the driver, police, towing service etc. Not terribly convenient if the owner is a few hundred miles away. Who will bear the brunt of the cost? The owner? The computer developers? The neighbouring state for not allowing the car to travel there?
2. The car disregards the (I hope pre-programmed and not changeable) instructions about travelling over the state line, and thus ends up driving illegally in the other state for a portion of the journey. Who would be liable here? The owner didn't plot a course to cause the infraction, so would be immune from prosecution; he may not even be a qualified driver. The software developers for the car itself, for not enforcing point 1 above? They will just palm it off on the GPS programmers for not sending a specific signal to the car that the journey should be aborted. What if the car does go over the state line, then just stalls because it THEN detects that it isn't in a 'driverless car friendly' area, for safety's sake? What if it does that on a connecting bridge or a one-car-wide road to the other state? How much chaos is that going to cause?
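The two scenarios above boil down to route planning under a geofence constraint: replan around the obstacle, but refuse any road outside the licensed region, and stop if no legal route exists. A toy sketch, with the road graph, the `LICENSED` region and the search entirely invented for illustration:

```python
# Toy replanner: find a route that avoids blocked roads and stays
# inside the licensed (state-line-bounded) region, or give up.
ROADS = {
    "start":  ["bridge", "detour"],
    "bridge": ["goal"],          # the direct route
    "detour": ["out_of_state"],  # the alternative crosses the state line
    "out_of_state": ["goal"],
}
LICENSED = {"start", "bridge", "detour", "goal"}

def plan(src, dst, blocked=frozenset()):
    """Depth-first search for a route that stays inside LICENSED."""
    stack = [(src, [src])]
    while stack:
        node, path = stack.pop()
        if node == dst:
            return path
        for nxt in ROADS.get(node, []):
            if nxt in blocked or nxt not in LICENSED or nxt in path:
                continue
            stack.append((nxt, path + [nxt]))
    return None   # scenario 1 in the comment: no legal route, so stop

print(plan("start", "goal"))                      # ['start', 'bridge', 'goal']
print(plan("start", "goal", blocked={"bridge"}))  # None
```

With the bridge broken, the only alternative leaves the licensed region, so the planner returns `None`: the car pulls over rather than committing the infraction of scenario 2.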
-
-
Tuesday 8th May 2012 20:21 GMT ShaggyDog
BetaDrive
No thanks very much. Google might have shit hot programmers and even plenty of good engineers, but trusting them with my life and the lives of others? Not on your nelly.
Plane fly-by-wire systems have multiple, redundant systems, all implementing the same specification but written in clean-room conditions (a different company for each, to avoid cross-pollination).
I just don't trust any web company that deeply. It seems more important in today's world to get something out today and fix the problems in v2+.
I daresay it will come in time and it might be good in "convoy" mode (minimal spacing to save petrol), but bloody scary.
-
Tuesday 8th May 2012 20:26 GMT TangD
Re: BetaDrive
Given the lack of even basic driving skill displayed here (CT) on a daily basis I'm quite keen to see these become mandatory. Or at least maybe they could automatically report other drivers who do stupid f*cking things like tail gating and cutting others off. Except of course if they were all reported and acted on the courts would fail to meet the demand and no 18 wheelers would ever enter CT again for fear of being impounded.
-
Tuesday 8th May 2012 21:05 GMT bazza
Re: BetaDrive
"...I'm quite keen to see these become mandatory".
Well that'd be interesting, especially if its use (not just it being standard equipment) was mandatory.
If the local legislature were to consider such a move then it would also, provided it had an ounce of common sense, transfer liability for accidents caused by a system failure to the manufacturer (or possibly to itself). It would be highly unreasonable for drivers (passengers?) to be held responsible for failures in a system which they are obliged by law to use.
So if that were done, that would mean Google (for example) would be responsible for every single accident caused by their system failing. That would potentially be one hell of a liability; one trivial bug could cost many millions of dollars in pay outs / fixes / etc.
Still, if it evolved to the point where mandatory use was truly backed up by proven reliability, perhaps then it would also be acceptable to allow it to drive us home from the pub when drunk!
-
Wednesday 9th May 2012 06:52 GMT TeeCee
Re: BetaDrive
Yup. In Belgium (where else?) yesterday, I was disturbed from my reverie in slow-moving traffic by a cacophony of horns to the rear. That would be a Belgian performing a "Belgian lane change" (indicate and go, look straight ahead throughout, ignore mirrors) into the half-a-car's-length gap between me and the extremely pissed off articulated truck behind.
-
-
-
Wednesday 9th May 2012 07:22 GMT bazza
Re: BetaDrive
@Thorne,
" you're already trusting your life to programmers and engineers every time you get into a car now"
Not really, at least not in the same sense, and not to the same extent. The mechanical design can be analytically proven to be safe. The car's electronics and software do not yet have total authority over the car's actions. The driver still has direct mechanical / hydraulic control of the car, with the electronics and software just there to help.
The exception is air bags. It is highly important that they do *not* go off until you crash. However, they sometimes do - it happened to friends of mine whilst they were driving along the motorway, nearly killed them. Big fail, Ford. Other exceptions thus far are I think the brake-by-wire system on some Mercedes, and perhaps adaptive cruise control, neither of which I believe are wholly reliable.
With Google's self-driving car it's different. Google's line is that you can cede control of the car and relax, in which case the car is now in control, not the driver. But the fact that the driver will remain legally responsible means that not even Google really trusts its own system. So why should the driver? After all, it'll be their crash, not Google's.
-
Wednesday 9th May 2012 07:36 GMT Giles Jones
Re: BetaDrive
Yes, but the software in ECUs is very simple. It is a lot easier to test for problems.
At least if you are driving and there's a problem you can usually deal with it. How are you going to stop the car driving off the cliff if you can't disengage the autopilot?
Remember Toyota? They couldn't even design an accelerator pedal that didn't get stuck. A simple pivot. But you could at least pull the pedal back with your foot.
-
-
Wednesday 9th May 2012 08:37 GMT TheOtherHobbes
Re: BetaDrive
It's only scary because humans are control freaks and don't understand statistics well enough to appreciate that software can be better at mission critical tasks than they are.
There are around 2000 road deaths a year in the UK, and aside from the occasional mechanical failure, most are caused by human error.
GoogleCar is going to have to clock up a few billion miles without an accident, to prove it's at least as good as manual driving. (The UK accident rate averages less than a handful of events per billion vehicle miles.)
But once that happens - why the hell not?
If software is safer, then software is safer. End of.
(Unless the car serves you ads while it drives you. *That* would be annoying.)
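The "few billion miles" claim above is easy to sanity-check. Taking the commenter's own rough figure of a handful of fatal events per billion vehicle miles (an assumed, illustrative rate, not an official statistic):

```python
# Back-of-envelope check of the statistics argument: at ~5 fatal
# events per billion miles (assumed), how many would we expect?
human_rate = 5 / 1e9   # fatal events per vehicle mile (illustrative)

def expected_events(miles, rate=human_rate):
    """Expected number of fatal events over a given mileage."""
    return miles * rate

# Google's ~140,000 test miles: even an average human driver would be
# expected to have essentially zero fatal events over that distance...
print(expected_events(140_000))       # ≈ 0.0007
# ...so demonstrating parity really does take billions of miles.
print(round(expected_events(2e9)))    # 10
```

In other words, an accident-free 140,000 miles tells us almost nothing about fatality rates, which is exactly why the comment says a few billion miles are needed.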
-
Friday 11th May 2012 00:19 GMT Thorne
Re: BetaDrive
"It's only scary because humans are control freaks and don't understand statistics well enough to appreciate that software can be better at mission critical tasks than they are."
Google can't stop accidents because shit happens. What it will do is remove human error from the equation. Yes, a stupid animal might leap out in front of you and cause a crash, but as a human driver could you avoid it? The Google car has perfect reflexes and 360-degree night vision.
No more speeding. No more drunk drivers. No more idiots doing burnouts in the middle of the street. No falling asleep at the wheel. No more old guys in hats driving at half the speed limit pissing off everyone behind them. No more road rage.
Over 90% of accidents are caused by humans. Software can't possibly do any worse.
Only problem I can see is all the unemployed highway pigs joining the dole queue. They'll have to take to hurting small animals when they don't have drivers to torment anymore.
-
-
-
Tuesday 8th May 2012 22:44 GMT Filippo
I don't see the point of the "can't trust software with my life" comments. Accident rates are just numbers. They can be calculated and compared. Saying that you can't trust a computer-controlled car is just as stupid as saying that you can. Just test the thing; if it gets into fewer accidents than a human, the rational thing is to use it; if it doesn't, the rational thing is not to use it and keep working on it until it does. I mean, it's not a philosophical question, it's an eminently practical one.
-
Wednesday 9th May 2012 02:20 GMT Ole Juul
@Filippo
Accident rates are just numbers.
You are correct and the question is indeed a practical one. However, I don't want to be one of those numbers. It's not that I don't trust computers, it's that I don't trust the motor vehicle authorities with them.
As an example I would point out that in several provinces in Canada, we have many deaths from collisions with wildlife on the road. Around here we see groups of deer along the highways. As you approach them, one or two will usually wait until you get right up to them, then they will panic and run right in front of the car. No problem: a human sees a deer and slows down or stops until it is clear that none of them are going to run. Perhaps a computer can visually tell the difference between a deer or moose and people, but it will probably be difficult, and the driverless car would logically not be applied to this area. In reality what is likely to happen is that we would get more auto-related deaths in this area than in cities, and that it would be averaged out. So we end up with more deaths here and far fewer in the cities. Such are the ways of bureaucracy, where "accident rates are just numbers".
BTW: Just so you can visualize the reality of wildlife problems. Many deaths caused by deer are from people swerving to avoid hitting the animal. This does not work when there is a 200 foot drop beside you.
-
Wednesday 9th May 2012 04:55 GMT Richard 12
Re: @Filippo
That last part implies a computer driver would be safer, because it would not drive off the cliff.
It might still hit the deer, but it should be better at braking and steering accurately than an average driver so is more likely to avoid an obstacle as long as it can detect it.
Probably not Stig quality, but most drivers are not that good.
-
Wednesday 9th May 2012 20:13 GMT Ole Juul
Cliff vs Deer and logical loops
That last part implies a computer driver would be safer, because it would not drive off the cliff.
It might still hit the deer, but it should be better at braking and steering accurately than an average driver so is more likely to avoid an obstacle as long as it can detect it.
Probably not Stig quality, but most drivers are not that good.
That's a good point. In this case it is indeed the human aspect which fails. But now you've got me worried about animal rights activists lobbying for "improved" software. :) Seriously, it is the differentiation between deer and people that has me worried. Hence my conclusion that this system will be generally an improvement in city driving, but we're still going to have human intervention - if nothing else to prevent logical loops hanging you up on your way.
-
-
Friday 11th May 2012 00:31 GMT Thorne
Re: @Filippo
"Around here we see groups of deer along the highways. As you approach them, one or two will usually wait until you get right up to them, then they will panic, and run right in front of the car. No problem, a human sees a deer and slows down or stops until it is clear that none of them are going to run. Perhaps a computer can visually tell the difference between a deer or moose and people, but it will probably be difficult and the driverless car would logically not be applied to this area"
Why is it difficult? A thermal camera sees in the dark and picks out the shape and size. It can see the deer hiding in the bushes. It knows it's in a high animal-strike area and was driving slower to compensate anyway. The size of the heat source shows it to be a deer, so it slows down more, and maybe it even blasts out an ultrasonic noise to pre-emptively scare the deer so it runs before the car reaches it.
All the time the human in the car doesn't see or hear a thing.
OK, let's assume the Google car hits a deer. It knows there's been a strike, so it will brake and park safely instead of slamming on the brakes and skidding out of control.
Even with animals, software will be safer. In fact, Google's footage of night driving shows a deer leaping out in front, which the car successfully avoids.
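Thorne's detect-and-react chain can be sketched as straightforward decision logic. All thresholds, speeds, and the sensor reading format here are invented for illustration; nothing about Google's actual system is known from the article:

```python
# Hypothetical sketch of the chain described above: thermal detection,
# size-based classification, then a graded response. Every threshold
# and the (height, distance) sensor format are made-up assumptions.

def react(heat_sources, in_strike_zone, base_speed_mph=60):
    """Return (speed, actions) for a list of (height_m, distance_m) readings."""
    # Already driving slower in a known high animal-strike area
    speed = base_speed_mph - 10 if in_strike_zone else base_speed_mph
    actions = []
    for height_m, distance_m in heat_sources:
        if 0.7 <= height_m <= 2.5 and distance_m < 100:  # deer/moose-sized
            speed = min(speed, 30)                       # slow down more
            actions.append("slow")
            if distance_m < 50:
                actions.append("ultrasonic_warning")     # scare it off early
    return speed, actions

print(react([(1.2, 40)], in_strike_zone=True))
```

The human in the car need never notice any of this happening, which is the comment's whole point.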
-
-
-
-
Wednesday 9th May 2012 04:34 GMT Steven Roper
Re: Potential cash cow?
Not only will they slow down on passing advertisers' billboards and shops, the GPS will "mysteriously" plot a circuitous and serpentine route from A to B that will take you slowly past every advertiser in the country.
Birmingham to London via Leeds, Norwich and Exeter anyone?
-
-
-
Friday 11th May 2012 00:43 GMT Thorne
Re: Since this is in America...
If a Google-driven car rams into you, how many billions can you sue Google for?
None. The car's black box will show footage of you doing something stupid like stepping out in front of the car or you pulling out suddenly without indicating or any of a million possible stupid things.
Courts will suddenly have plenty of camera evidence, as all cars will have 360-degree footage. This will be for all crimes. Big Brother has wheels now. Your getaway car is now a police informant. Damn you, Audi stool pigeon.
-
-
Wednesday 9th May 2012 07:24 GMT b166er
So if you're in traffic and you see a lorry come piling in from the rear (driver asleep or whatever) does the GDriver take evasive action, or does it let the lorry pile into the back of you?
This is one of the reasons why I'm always in the lane closest to the hard shoulder in traffic, because it serves as my escape lane should I need it.
I'd like to set up some simulations for the GCar and see what it does in those scenarios.
Maybe Google could put up some videos showing examples of the car taking evasive action.
-
Wednesday 9th May 2012 12:39 GMT Anonymous Coward
err
Is your logic flawed? Using your theory, the safest place would be the fast lane, as lorries aren't allowed in that lane (in the UK anyway). How quickly do you think you could take the handbrake off, engage gear and pull onto the hard shoulder before the lorry piled into you? You might have about 1 second to achieve it. Plus, how would you know that it's safer to go that way, when the lorry might swerve that way at the last minute?
I think you are proving the point that humans think they are better drivers than they are...
-
-
-
Friday 11th May 2012 00:50 GMT Thorne
Re: Just wondering
how much of the general public's wifi data the car slurps as it drives along?
Bugger that! Google knows you went to the Asian brothel "Love You Long Time" last week, so now it gives you Asian porn pop-up ads.
It also knows you were only there for five minutes, so it offers Viagra ads as well.
All part of the service
-
-
Wednesday 9th May 2012 09:36 GMT Dan 10
Anyone read Daemon?
They'll kill us all...
Seriously though, Google don't have a culture of developing safety-critical embedded systems; it remains to be seen how good these are.
Also, taking the responsibility of driving away from me, but leaving me with the responsibility if it crashes is a cop-out. If the thing runs for miles without incident, do the authorities really expect you to pay attention constantly when there is no incentive to do so? Already there is talk that drivers are less attentive when besieged with myriad driver safety aids (lane deviation sensors, stopping distance sensors linked to auto-braking, etc etc) - if I buy one of these it's because I want to use the travel time more productively.
-
Wednesday 9th May 2012 10:01 GMT Leigh Geary
Actually...
How about a slimmed-down version here in the UK? Remove the overly stupid thing on the roof and restrict these to just the hard shoulder of Britain's motorways and BOOM, you've got a moving bed.
Imagine, you drive yourself to the motorway, flick a switch and an alarm wakes you up when you reach the right junction.
I've got an even cheaper solution involving a lasso and a Polish truck.
-
Wednesday 9th May 2012 11:35 GMT E_Nigma
"Google's fleet will have red Nevada license plates with a Greek infinity symbol, intended to alert other drivers that a computer has control of the vehicle."
So, the meaning of the plate is essentially "Warning! Incompetent driver!" (as otherwise there would be nothing to be alerted about)? Great idea! Now all they need to do is start giving them out to people and they can abolish driving schools and exams; think how much money that will save and how much needless greenhouse gas emission we'll spare our planet. IMO, if it can't pass at least the same test humans do (as individual humans are expected to improve as they gain experience, and this car, unless it has some serious supercomputer and AI on board, won't until it's replaced or specifically updated), it's not fit for the road.
-
Wednesday 9th May 2012 12:12 GMT Anonymous Coward
There are a lot of things out there that cause accidents, but we don't stop them altogether, as the benefits outweigh the cons (though death is a pretty big con, I guess); otherwise we would never have allowed the use of planes, cars, trains, ships, mobile phones (for those that believe they cause cancer), gas appliances, anything electrical... The list could go on!
Some will then say "I don't use those things so I'm not at risk", but though you might not use them you could still be injured by them: a plane falling out of the sky onto your house, the house next door having a gas explosion, a train derailing and hitting you in the face as you're walking on the pavement next to the track.
Let's all stop moaning and get on with life instead of fighting against it; you'll be a lot happier!
-
Wednesday 9th May 2012 12:39 GMT davtom
Autopilots
It's interesting that a number of people have made comparison to autopilots here.
From an air transport point of view, autopilots are simply taking load off the pilot(s) who do not have to maintain control over the aeroplane to execute the six manoeuvres that occur in normal flight: straight-and-level flight, level turns, climbing straight, climbing turns, descending straight and descending turns.
There are other systems still in play. Of course, there are the pilots, who can (usually) override the autopilot and fly manually if required for any reason. There is air traffic control, which modifies flight plans, e.g. by passing vectors and altitudes to the pilots, who are then responsible either for flying those vectors or for setting the autopilot to fly them.
What the autopilot is doing is reducing the workload of the pilot.
There are similar systems for cars that already exist, one of which is of course cruise control, which attempts to maintain a constant speed by varying the throttle position automatically. The Google car is just a very advanced form of this kind of system.
In the aircraft situation, the pilot is ultimately responsible for the safety of the flight, and I can't see it being any different for a car situation: the driver will be ultimately responsible for the safety of the trip.
Autopilots do increase safety (certain procedures can be flown by autopilots that are not allowed by pilots alone) and I have no doubt that this car "autopilot" will also increase safety.
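The cruise control davtom mentions is a classic feedback loop: vary the throttle in proportion to the speed error. A minimal toy sketch, with every constant (gain, drag, step size) invented for illustration:

```python
# Toy model of cruise control as a proportional feedback loop: throttle
# is set in proportion to the speed error, each step integrates the
# resulting acceleration. All constants here are invented.

def cruise(target_mph, speed_mph, steps=200, gain=0.5, drag=0.02, dt=0.1):
    """Step a toy car model toward the target speed; return final speed."""
    for _ in range(steps):
        error = target_mph - speed_mph       # how far off we are
        throttle = gain * error              # proportional response
        accel = throttle - drag * speed_mph  # throttle minus drag
        speed_mph += accel * dt
    return speed_mph

# Settles just below the target: proportional-only control leaves a
# steady-state error against drag.
print(f"settled speed: {cruise(70.0, 50.0):.1f} mph")
```

That residual error is why real cruise controls add an integral term (PI control) rather than using pure proportional feedback.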
-
Thursday 10th May 2012 01:59 GMT jake
Re: Autopilots
The difference between autopilots in aircraft and cars is that, for the most part, autos move in any direction, and at any speed, in a single plane, with all kinds of obstacles to be avoided[1].
Whereas aircraft (except around airports) are all moving in the same direction & at similar relative speed at any given altitude, with few obstacles[2].
Autopilots for cars are contraindicated ... unless they are on a separate conduit system from the rest of us. Which ain't gonna happen for local driving. Might happen for long distance, but probably not before my grand-daughter is married (she's a year and a half, or thereabouts).
[1] Cats, dawgs[3], bicycleistotards, iFad wielding pedestrians, impaired drivers, folks from blighty in a left-hand-steer car for the first time (and vice-versa, of course!), mopeds, squids on rice-rockets, suicidal teenagers on skates/blades/boards/razors, the postwo/man, delivery drivers unfamiliar with the area, children not properly stowed by their parents, OAPs with pension check in hand driving their one day per month, and (here in Sonoma, anyway) tourists with blinkers on who aren't looking for anything but The Mission or The Barracks or General Vallejo's house or a specific tasting room ... and etc.
[2] The un-avoidable (bird-strike), and avoidable (weather).
[3] I killed a dawg the other day ... The idiot owner was talking to a friend on a corner. The dawg was on one of those fucking awful retract leads. The dawg spotted a squirrel/bunny/cat/whatever on the other side of the road and took off after it. It bounced off the right side of my front bumper with enough force to pull the idiot owner off her feet. I stopped. So did the Cop who was behind me. The idiot ex-dawg-owner wanted the Cop to arrest me. He declined, telling her that it was her fault, not mine. She complained about her own bruises, "caused by [me]". The Cop said "stupidity should hurt!", and I went on my way ... If you take anything from this story: If you have a retract lead, throw it away & get a proper lead. Ta.
-
-
Wednesday 9th May 2012 15:19 GMT Anomynous Coward
Derail
There have been many suggestions in the past that if the railways were removed and replaced with tarmac roads, lorries using them would be more efficient than trains for freight.
If these systems are good enough this might genuinely be a workable idea - separate from the human-controlled road system you have a robotised road network with fixed routes running a flexible mix of freight and passengers.
I'd like to know the amount of man-hours spent by individuals driving themselves around - it's an unproductive activity in itself; people obviously aren't always where they need to be, but requiring them to be in control of the process of moving themselves around is a waste if there's a machine that can do it more efficiently.
-
Wednesday 9th May 2012 16:43 GMT MrZoolook
Re: Derail
"I'd like to know the amount of man-hours spent by individuals driving themselves around - it's an unproductive activity in itself"
Spoken like a true company director... "So what if we're sending you on a 400-mile trip on Sunday to get to a meeting at the other end of the state. We still expect a full day's work out of you!"
-
-
Wednesday 9th May 2012 18:02 GMT Andus McCoatover
Did the article mention...
T O Y O T A ???
Whose cars spectacularly failed a simple 'throttle-by-wire' 'test'???
Not in my lifetime. I'm still apprehensive to get on an Airbus with its tiny joystick. Christ, before long the pilots will be controlling it with a Wii - whatever it's called - soon enough. (Imagine, about to land, and the captain sneezes...)