Driver in Uber's self-driving car death goes on trial, says she feels 'betrayed'

The name Rafaela Vasquez may not immediately be recognisable, but the incident that ties her to the first-ever fatal self-driving car crash will be. Vasquez was the driver when one of Uber's autonomous test cars crashed into a woman walking her bike across the road at night in March 2018. Now nearly three years later …

  1. John Robson Silver badge

    Who cares....

    "Her defense team will argue she was checking Slack messages from Uber in her work phone at the time, whilst prosecutors will say she was watching an episode of reality show The Voice on her personal handset."

    It doesn't really matter which, it matters whether either of these was considered acceptable whilst being the legally responsible driver in a vehicle.

    If checking Slack is something which was accepted, or even expected, by her employers then that is a serious issue.

    She was a safety driver, not a systems engineer meant to be heads down doing other work whilst the vehicle was in motion.

    1. ShadowSystems

      Re: Who cares....

      Exactly. She was supposed to be driving, not ignoring everything in favour of burying her nose in her SmartPhone. It doesn't matter what she was doing on it, that she was *not* paying attention to the road is the crux of it all.

      Clearview needs to be bitch-slapped with a fine that will not just be a cost of doing business; it should be severe enough to make their bottom line bleed like a freshly butchered pig. A mere 20M is chump change. Turn that M into a B to make them take it seriously enough to not want to do it again. Turn it into a T to make their shareholders take down the entire C-level bunch of prats with pitchforks, burning tar, & bags of feathers.

      1. Anonymous Coward
        Anonymous Coward

        Re: the crux of it all.

        Surely the crux of it all is that expecting humans to pay continuous and close attention to something that only relatively rarely requires them to actually *do* anything is, even at best, optimistic. It seems to me like the "safety driver" is only there to provide someone to blame who *isn't* the company claiming to be testing a "self driving car".

        IMO, regardless of the situation, the company should be just as much on the hook for responsibility as the "safety driver" employee. Because either the company hired someone not up to the task (i.e. the company's fault, as well as - possibly - the employee's for being negligent), or they gave the employee a job that couldn't be performed to an adequate standard (i.e. the company's fault).

        1. Steve Button Silver badge

          Re: the crux of it all.

          Absolutely. On the one hand she's got no right to say she was "betrayed", she should have been paying attention. On the other hand, after you've been sitting behind the wheel doing nothing for hours and hours, after a while it's human nature to get a bit bored and trust that the thing is doing a good job of driving itself. (perhaps not if you are an engineer - but just a mere human).

          Should we learn something from this, and put in some safety measures that make sure the "safety driver" is truly paying attention? Some combination of carrot and stick?

          1. Anonymous Coward
            Anonymous Coward

            Re: some safety measures that make sure the "safety driver" is truly paying attention?

            Perhaps if we asked them to actually drive the car? :-D

          2. Great Southern Land

            Re: the crux of it all.

            Vigilance controls along the lines of those used in Railway locos would do the job.

        2. John Robson Silver badge

          Re: the crux of it all.

          Good thing we don't expect human learner drivers to be taught by people who will ever more rarely be required to intervene in their driving.

          A HUD with information about the decisions the system is taking would probably be a useful addition - equivalent to "conversation" with a learner driver.

        3. jmch Silver badge

          Re: the crux of it all.

          "...expecting humans to pay continuous and close attention to something that only relatively rarely requires them to actually *do* anything..."

          Reminds me of this...

          https://giphy.com/gifs/simpsons-dippy-bird-drinking-l41lUJ1YoZB1lHVPG

      2. Doctor Syntax Silver badge

        Re: Who cares....

        "Turn it into a T to make their shareholders take down the entire C-level bunch of prats with pitchforks, burning tar, & bags of feathers."

        The shareholders are equally reprehensible for investing in this antisocial crap. Don't let them escape the tar and feathers.

    2. vtcodger Silver badge

      Re: Who cares....

      There's a long article on Vasquez and the accident at https://www.wired.com/story/uber-self-driving-car-fatal-crash/. Complicated situation. For example, Vasquez claims she was *listening* to The Voice, not watching it, and the pre-crash dash camera video might well support that. Listening to a broadcast seems OK and was also permitted by Uber's rules for "drivers". On the other hand, there is some reason that the authorities decided to charge her but not Uber, and I'd like to hear their side of the story.

      BTW, the article does not reflect all that well on Uber. On the positive side, they did have rules and they cooperated in the investigation. On the other hand, it sounds like their "self-driving" software might be awful -- even worse than Tesla's, assuming that is possible. And they disabled the vehicle's collision avoidance system, which might have worked. And their system clearly was not designed with "if you must make mistakes, err on the side of safety" in mind.

      1. Anonymous Coward
        Anonymous Coward

        Re: Who cares....

        "there is some reason that the authorities decided to charge her but not Uber"

        That would be because at the time, she was legally in control of the vehicle.

        1. teknopaul

          Re: Who cares....

          What's the betting they told her it was a "self driving car" and she didn't have to "do anything" (especially not look at clause 475)

          Reg readers probably all know otherwise, but I don't blame someone for thinking that a self driving car can drive itself.

          Be nice to know how much training she got, if any.

          Perhaps we need a driving licence for using self-driving cars? And to stop calling them that, since they ain't and maybe never will be.

          1. TRT Silver badge

            pretty cold and dark thought...

            Could it even be in the vaguest realms of possibility that they would deliberately put someone in their car and instruct them or misguide them as to the level of interaction required, knowing that this was a likely outcome and that, in fact, what their "experiment" was intended to do was not so much to test the car but to test the legal system?

        2. Justthefacts Silver badge

          Re: Who cares....

          No, she wasn’t. She had been given a defective vehicle to drive. If you hired a car, and the hire car company had deliberately cut the brake-lines without telling you, because they were running a randomised controlled trial on how good a driver you were…..would you be culpable?

          Uber *disabled the factory-fit collision avoidance system*.

      2. stiine Silver badge

        Re: Who cares....

        You must not have seen the video screenshots that were released. The date she says that everyone went from supporting her to hating her was, in fact, the day the images were released showing her staring down at the phone in her lap. If she was only listening to The Voice, it's only because her phone was in split-screen mode and she was sending a text message.

        1. vtcodger Silver badge

          Re: Who cares....

          According to the Wired story -- which may not be an unbiased source -- what she was looking down at was her work phone on the console, which was displaying work-related Slack messaging. (Seems to me like a bad idea, BTW. If reading messages while driving really is part of the job, surely the phone should be mounted on the dash or in the upper left corner of the windshield.) According to Wired, her personal phone showing/playing The Voice was on the passenger seat. And -- according to Wired -- the dash video shows her reaching over there to get the phone to dial 911 after the accident.

          1. stiine Silver badge

            Re: Who cares....

            Then we've seen different screenshots. The shots I saw were of her holding her phone in her lap, below the steering wheel, not staring at the console (and no good driver would mount a phone on the console instead of the dash at eye height).

    3. ciaran

      Re: Who cares....

      I remember seeing the video of the crash when it first happened.

      Yes the driver didn't have her eyes on the road, but on the camera footage I couldn't see the pedestrian until the last second. Blink and you're too late.

      My conclusion at the time was that the woman and bicycle were clearly in the wrong, and that that type of accident must happen all the time without anyone being sent to jail.

      Uber or any other self-driving test organisation should be obliged to take out special insurance to pay big money to anyone who manages to get hit by one of their cars. That would make them more risk-aware.

      1. TRT Silver badge

        Re: Who cares....

        "the woman and bicycle were clearly in the wrong"

        *sarcasm* Obviously deserved a death sentence then. */sarcasm*

        In order to assert that accidents of that type must happen all the time, you have to ascertain how many incidents of that type don't become accidents of that type due to the presence and active response of a reasonably vigilant driver.

        It would be impossible to prove beyond any doubt that had the driver been paying attention, the outcome of the event would have been different. It is similarly impossible to prove that it wouldn't have been any different.

        It would be even harder to determine whether, if the driver HAD been paying attention, they would have taken action to avoid the outcome, seeing as they were in a vehicle that supposedly would avoid such incidents of its own volition. In a way, I wish that this was in fact the case being tested. This will be an easy one... she wasn't paying attention, she didn't see the woman, she's liable.

        The harder case, and the one that puts the ball firmly in the court of the developer, is if the driver HAD seen the woman but the vehicle's control system either hadn't the sensor data to label her or, if it had, wouldn't have taken avoidance action. How's a supervising driver supposed to know if something has been detected or not, and if it has, how it has been classified? It's all well and good showing these videos of objects with green cuboids around them and vector arrows attached to them, but these aren't displayed in the vehicle, are they?! Superimposed over all the windows to show what the vehicle's view of the world is? We don't get to see a stream of machine code down the side labelling the possible responses and highlighting the selected response.

        1. martinusher Silver badge

          Re: Who cares....

          We have a high profile case in our area where a lady hit and killed a couple of kids who dashed out on a crosswalk. All the signs -- and emotions -- point to it being 100% her fault, except that I nearly had an identical thing happen to me. I was trundling along the road near my house and a group of kids rushed up to a crosswalk, hit the button and immediately dashed out into the street. Fortunately they saw the danger and stepped back because I'd have either ended up hitting them or a tree (or both). I stopped and had a chat with them about "vehicles can't stop immediately" -- this should have been part of any elementary school road safety training, it was certainly taught to me.

          It takes approximately half a second for a human to react to a problem. They've also got to see the problem. Given the video from that car there is absolutely no way -- night or day -- I could have avoided that lady if I were driving. The advantage that I have over self-driving software is that I can use judgment -- the software's got better vision and eyes literally in the back of its head, but I know about humans and their foibles.

          One detail, though, is that modern cars are designed to deflect people in such a way as to minimize injury if they hit one. This implies that the external shape of the vehicle is left in its original state; if you cover the vehicle with extraneous crap then it can be dangerous. (It's the same reason why you don't attach cameras to the top of crash helmets.)
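          For scale, that half-second reaction figure can be turned into rough stopping distances with a quick back-of-the-envelope sketch. The numbers below are my own illustrative assumptions, not from anything in the case: a dry-road deceleration of about 7 m/s² and a 0.5 s reaction time.

```python
MPH_TO_MS = 0.44704  # miles per hour -> metres per second

def stopping_distance_m(speed_mph, reaction_s=0.5, decel_ms2=7.0):
    """Reaction distance plus braking distance, in metres."""
    v = speed_mph * MPH_TO_MS
    reaction = v * reaction_s             # distance covered before braking starts
    braking = v * v / (2.0 * decel_ms2)   # v^2 / (2a), constant deceleration
    return reaction + braking

for mph in (30, 40, 50):
    print(f"{mph} mph: ~{stopping_distance_m(mph):.0f} m to stop")
```

          At 40 mph that works out to roughly 30 metres travelled before the car stops, with over a quarter of it covered before braking even begins -- which is the point about why dashing out at the last second leaves no margin.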

          1. Cederic Silver badge

            Re: Who cares....

            The video from that car which was released was not indicative of true visibility levels. It was a well lit street in which someone with a bicycle crossing the road would be visible from hundreds of yards away.

            Sure, too dark to tell whether it was a man or a woman, too dark maybe to even tell there was a person behind the bike. Not too dark to see the movement of someone that wasn't lit and wasn't a car traversing the tarmac.

            Experienced drivers see movement on the road, they either understand immediately what it is or they slow down so they can identify the cause and take appropriate action. They don't sit there reading Slack messages until contact.

            1. Paul Kinsler

              Re: Experienced drivers ...

              Irrespective of which way one might like to attribute fault or responsibility, the so-called "safety driver" was *not* the actual _driver_, because she was *not* doing the actual _driving_. They were employed to watch the car drive itself, in the hope that they might be able to intervene in time should something go amiss. Perhaps a better job title might be "drive supervisor" or something.

              1. Anonymous Coward
                Anonymous Coward

                Re: Experienced drivers ...

                She was paid to be responsible for the car. The automated emergency brakes were disabled. It will be determined in court if she knew this. She wasn't paying attention to the road.

        2. Terry 6 Silver badge

          Re: Who cares....

          Good points. The implication there, then, would be: if the not-really-a-driver does intervene to override the car, and that results in someone dying when the car could have prevented it, are they also liable?

          1. TRT Silver badge

            Re: Who cares....

            Another good question.

            I suppose one could set up a trial of similar circumstances... a dummy, a bike, a car, a button to press if / when you register the presence of the dummy. Run a series of trials and controls.

            I suppose it's easier to prosecute the "driver" than open the can of worms that would involve prosecuting Uber.

    4. Justthefacts Silver badge

      Re: Who cares....

      Yes - but that’s not *why* the car crashed and killed the pedestrian. The crash occurred for one very simple reason: Uber deliberately disabled the *standard* car safety systems (auto-braking). They did that because the factory-standard system repeatedly saw their self-driving system as driving dangerously, and braked very often.

      This ultimately has nothing to do with self-driving at all; it's the same as if they had deliberately crippled any other safety-critical system on the car. Would we be having this discussion if the engineering subsystem under discussion were the hydraulic brakes? Of course not. In fact, it's almost identical to the Boeing MCAS case. Would the pilot have agreed to fly the plane if they had known the undocumented MCAS existed? No, of course not. Would the "safety driver" have agreed to sit in the driver's seat if she had known Uber had deliberately disabled its standard safety systems to allow the car to drive dangerously? No, of course not.

      This is on Uber, 100%.

  2. Martin Summers Silver badge

    You've clearly got to have no fear or respect for your own life, let alone anyone else's to sit in a self drive car without being the slightest bit worried about what's going on outside. Or she was too trusting of the technology, which is an equally insane state of mind.

    This is entirely her fault so far as I'm concerned. You're sat there in a moving vehicle and you're meant to intervene if things go wrong. You can't do that if you can't be bothered to look anywhere but your phone.

  3. WanderingHaggis

    Software to arbitrate between two soldiers -- haven't they heard of rank?

    I really don't get this one. You don't have time in combat to make decision-making protracted, so the army usually has a very clear hierarchy; such discussions usually end with "yes, sarge" or "sir", not "let's boot my laptop and see what the AI thinks once everything is entered in".

    1. Anonymous Coward
      Joke

      Re: Software to arbitrate between two soldiers -- haven't they heard of rank?

      "Hello Dave, how can I help you today?"

      "I see. You want me to advise on how best to attack that advancing tank."

      "I must advise you Dave that there is only a 37.2 per cent chance of completing this conversation before the tank is on top of you."

      "Would you like me to sing Daisy Daisy to take your mind off the pain of having been run over?"

      "Dave?"

      1. Jim Mitchell

        Re: Software to arbitrate between two soldiers -- haven't they heard of rank?

        If you are infantry in a proper foxhole, you're supposed to let the tank run over you. That lets you either attack its more vulnerable areas or attack the infantry supporting the tank. That is, if the Russians sent in supporting infantry with their tanks....

        1. teknopaul

          Re: Software to arbitrate between two soldiers -- haven't they heard of rank?

          Sarge? Where is the option to tick if there isn't a foxhole?

          1. Terry 6 Silver badge

            Re: Software to arbitrate between two soldiers -- haven't they heard of rank?

            It's the Capcha that will get them, though.

        2. teknopaul

          Re: Software to arbitrate between two soldiers -- haven't they heard of rank?

          Sarge: You go out and have a look!

          Grunt: No way! I'll probably get shot.

          Sarge: Let's ask the AI

          AI: You both go

          [calculates: one will come back, wages will be reduced, savings made on rations]

          Same applies when going to the doctor. Doc will give you a prescription based on cost/benefit across the whole population. It's your own responsibility to see that it maximises your own utility.

          It's usually worth trying alternative medicine (e.g. diet change, hippie shit) because it's cheap and does not have contraindications with the drugs Doc gave you (you should take those, natch). Usually worth seeing if there are other relevant drugs which don't have contraindications and taking them too.

        3. TRT Silver badge

          Re: Software to arbitrate between two soldiers -- haven't they heard of rank?

          We seem to be stuck in a foxhole in the ground underneath a giant tank that we can't possibly move with no hope of rescue. Does the AI battle computer have anything to say about that?

          Yeah. There's bound to be something... Just got to fill in the right... Ah! Here it is...

          What to do if you find yourself stuck in a foxhole in the ground underneath a giant tank you can't move, with no hope of rescue. Consider how lucky you are that life has been good to you so far. Alternatively, if life hasn't been good to you so far, which given your current circumstances seems more likely, consider how lucky you are that it won't be troubling you much longer.

          It's about time we did something about that AI.

          1. Wellyboot Silver badge

            Re: Software to arbitrate between two soldiers -- haven't they heard of rank?

            Being in a foxhole* driven over by a 60 ton tank at speed is remarkably safe compared to everything else that's probably happening around you at the time (if the foxhole is deep enough!)

            * a well placed and camouflaged foxhole is highly likely to be missed by a tank crew, spotting them is the job of the supporting infantry.

          2. the Jim bloke
            Terminator

            Re: Software to arbitrate between two soldiers -- haven't they heard of rank?

            We seem to be stuck in a foxhole in the ground underneath a giant tank that we can't possibly move with no hope of rescue. Does the AI battle computer have anything to say about that?

            ... probably, but since it's getting no signal, due to being stuck in a foxhole in the ground underneath a giant tank, you will have to resolve the situation yourself before you can avail yourself of the AI battle computer's advice.

    2. vtcodger Silver badge

      Re: Software to arbitrate between two soldiers -- haven't they heard of rank?

      Clippy goes to war? Doesn't sound promising.

      1. Anonymous Coward
        Joke

        Re: Software to arbitrate between two soldiers -- haven't they heard of rank?

        > Clippy goes to war? Doesn't sound promising.

        It looks like you're trying to order a man to his death. Can I help you with that?

        1. Medixstiff

          Re: Software to arbitrate between two soldiers -- haven't they heard of rank?

          And then the AI decides that to end wars one must end mankind and there you have Skynet.

  4. steven_t

    What year is it?

    It says, "... March 2018. Now nearly three years later ..."

    The past few years have been very disorientating and I'm not sure the months in lockdown really count but technically, about four years have passed since March 2018.

    1. MiguelC Silver badge

      Re: What year is it?

      Today is March 746, 2019, but who's counting anymore?

      1. Christoph

        Re: What year is it?

        Today is September 10423, 1993.

  5. Simian Surprise
    Black Helicopters

    > Finally, the agency will consult legal counsel and experts in ethics

    Ah yes, spend the time and money, and only *then* worry about whether it's legal. (I expect that the "experts in ethics" bit is dry humor.)

  6. gecho

    "At first everybody was all on my side, even the chief of police."

    Yup that's pretty typical for a driver running down pedestrians and people on bikes. Officer will take the motorist's statement while the person on the ground is wheeled off to the hospital or morgue. Won't even take the statement of a survivor before filing the paperwork. Might even go on camera to advocate on the driver's behalf. For the victims the burden of proof is often on them, and often requires media coverage to generate enough outrage for a real investigation to occur.

    1. imanidiot Silver badge

      Also, since the universal rule is "don't talk to the police, because they see it as their job to get you to confess to everything", I have to wonder if the Chief of Police was actually on her side, or if it just felt that way while they talked, in the hope of getting her to say something they could later use against her in court.

  7. VoiceOfTruth Silver badge

    Tragic?

    -> Vasquez's side of the story is a tragic, personal tale

    Even more tragic for the person who was killed, who in this report is merely 'a woman'. Nameless.

    1. First Light

      Re: Tragic?

      Here she is, Elaine Herzberg

      https://en.wikipedia.org/wiki/Death_of_Elaine_Herzberg

  8. Paul Crawford Silver badge

    GDPR?

    I thought the GDPR was EU-wide so if they are fined in Italy for breaching it then they can't do any business elsewhere in the EU without paying up?

    1. JassMan

      Re: GDPR?

      Clearview AI was fined €20m ($21.8m) by Italy's data protection watchdog for unlawfully scraping selfies

      Lucky they just got $30M in extra funding last October. /sarc

    2. Anonymous Coward
      Anonymous Coward

      Re: GDPR?

      It's very likely that they've not yet grasped some important details like the one you're pointing to.

      Their defense of "it's public data so we can do anything we want" is typically uninformed. It's public, and yet it's personal, so the individuals concerned are in charge of deciding, not them.

      Pretty much like copyright works, really, except the GDPR gives power to individuals instead of corporations; US companies seem to have difficulty understanding the concept.

      1. John Brown (no body) Silver badge

        Re: GDPR?

        Yes, their defence of "it was on the internet in full public view therefore we can copy it" might not sound so useful to them when others start doing it to them with their websites and software, which is also on the publicly accessible internet.

        At the very least, they only have to look at the precedent of Google and their scraping, most notably on the image search page where, in the past, clicking the thumbnail brought up the original image; nowadays, clicking on it takes you to the source site. Not to mention all those books available to the public, in public libraries. Can we make our own copies of them too now? Clearview seem to have a bigger and better Reality Distortion Fiend than Apple ever had. Maybe Apple should sue them for copying that?

        1. First Light

          Re: GDPR?

          I like your typo: "Reality Distortion Fiend". Seems apropos.

          1. John Brown (no body) Silver badge
            Thumb Up

            Re: GDPR?

            Damned auto-correct thinks it knows better than me! Sometimes, apparently, it does LOL

        2. Anonymous Coward
          Anonymous Coward

          Re: GDPR?

          There is the argument that robots.txt is where you give permission to Google and whoever else to access websites or not. Ultimately, the copyright remains with the creator, of course...

  9. Martin an gof Silver badge

    The Moment

    Am I the only one to think of that ultimate weapon used - or not used - by The Doctor?

    M.

  10. Doctor Syntax Silver badge

    "It will be difficult for Italy's data security agency to force the company to pay up, but the decision could deter Italian companies from using Clearview's software."

    GDPR has options for holding officers of the company responsible. I think an extradition request for the CEO, CFO and directors for non-payment of the fine would produce quick results.

  11. FuzzyTheBear
    FAIL

    Self driving ?

    Any vehicle manufacturer that claims self / autonomous driving abilities is clearly committing fraud.

    None of them can be trusted. All require a pilot. So what good are they? I'll take the wheel...

  12. Pascal Monett Silver badge
    Stop

    "I feel betrayed in a way,"

    Well your victim is dead.

    Who is worse off, again ?

    Cry me a river. I was never on your side. You were THE DRIVER.

    YOU were responsible.

    It's time to own up to that.

    1. Roland6 Silver badge

      Re: "I feel betrayed in a way,"

      What isn't stated is whether Uber, as her employer, are paying for and providing her legal team.

      Or have Uber effectively disowned her and thus hung her out to dry...

      If you are asked to be a "safety driver", you may wish to double-check your employment contract and insurance cover, plus demand training in the additional skills necessary to remain focused whilst doing sweet FA.

      1. the Jim bloke
        Devil

        Re: "I feel betrayed in a way,"

        Isn't Uber's thing that they don't have employees?

        Surely everybody is just some variant of contractor, with no recourse against, or obligation from, the parent company.

        1. Roland6 Silver badge

          Re: "I feel betrayed in a way,"

          One would hope that the backup driver was employed and driving on Uber's insurance, and not hired using their own driver's and public liability insurance.

          Because of Uber's sharp practices with respect to its taxi business, I would hope the full details of both the contractual arrangements here and the insurance arrangements are made public, adding further ammunition to the campaign against exploitative hiring practices.

  13. martinusher Silver badge

    Missing the point

    The objective standard for this or any software should be whether a human driver would have performed better. Accidents happen, and it is unlikely that this one was 100% the driver's fault, whether it be human or machine. Although traffic engineers in the US try to make roads safe by making them ridiculously wide -- it's not unusual to find suburban streets that make a UK motorway look a bit compact -- they're actually really dangerous, because eliminating sources of accidents makes the remaining dangerous situations much rarer and so harder to anticipate.

    The exact circumstances will come out in this trial, but my guess is that the lady who was killed was 'at fault'. You don't cross highways outside of crosswalks, you don't cross unless it's clear to cross, and you make sure that any traffic has stopped. If you don't do this then you have to assume you're taking a risk and act appropriately. The fact that a machine is driving is unimportant -- especially at night I'd take my chances with a machine any day compared to the collection of half-asleep, distracted and potentially DUI humans that we call human operators.

    1. Cederic Silver badge

      Re: Missing the point

      Had the woman placed in control of the vehicle been paying attention she would have seen the person trying to cross the road and would have avoided hitting her.

      The evidence is that she was not paying attention. That's dangerous driving. That's illegal whether the other person was crossing somewhere that dozens of people cross every day or whether she was doing a bucket of custard clown routine in the middle of the road. Just because someone shouldn't be in the road doesn't make it legal to kill them.

    2. Anonymous Coward
      FAIL

      Re: Missing the point

      > You don't cross highways outside of crosswalks

      So anyone who has a breakdown and gets stuck in the middle is fair game to self-driving vehicles while they try to get to the side of the road?

      The problem is not *why* she was crossing the road but the fact that the car failed to detect her as a person.

  14. J. Cook Silver badge
    Terminator

    Can AI make critical military decisions?

    Short answer: NO.

    Long answer: FECK NO.

    Did anyone NOT understand that the Terminator franchise is a warning about what happens when you let an unfettered sentient machine intelligence make life-critical decisions?

    1. First Light

      Re: Can AI make critical military decisions?

      Or The 100, where the AI decided that since humans were about to destroy the Earth by environmental degradation, she/it had to destroy humans to save the planet. And nuked everything.

    2. John Brown (no body) Silver badge

      Re: Can AI make critical military decisions?

      At this stage, it's not even AI, never mind sentient. Otherwise, I agree. :-)

    3. Jedit Silver badge
      Mushroom

      Re: Can AI make critical military decisions?

      I would have said "Would you rather play a game of chess?" myself.

      1. TRT Silver badge

        Re: Can AI make critical military decisions?

        Greetings, Professor Falken.

    4. An_Old_Dog Silver badge
      Mushroom

      Re: Can AI make critical military decisions?

      I've not been in the military, but I imagine many instances of "not knowing what to do" (a/k/a "arguing over what to do") run along the lines of:

      Leut: "Artillery Battery Bravo! Fire mission, coordinates +123.554.245.031, -778.108.432.664."

      SSgt: "Sir! There's a team of Paras in that area!"

      Leut: "My information says there are no friendlies in that area." ...

      How could a computer make a good decision about something like this, even if the AI was sentient-level?

      (Icon for "incoming friendly fire" ...)

  15. Missing Semicolon Silver badge
    Facepalm

    copyright

    "all the data it scraped from the internet is public". I think we have a lot of case law that says not.

  16. M.V. Lipvig Silver badge

    Self driving cars

    All I have to say about it is, if I have to accept responsibility for the actions of a self-driving car, I'm not buying it. I'm not signing that paper. If I'm forced to drive a self-driving car, the car will never be put into self-drive. I refuse to accept responsibility for something I am not in control of. Whoever writes/owns the software can take responsibility, and if I'm involved in an accident with one I'm going after the automaker if the self-drive software was driving the car. The car's owner may have agreed to absolve the manufacturer of responsibility but I did not.
