Amazon delivery staff 'denied bonus' pay by AI cameras misjudging their driving

AI cameras inside Amazon’s delivery trucks are denying drivers' bonus pay for errors they shouldn’t be blamed for, it's reported. The e-commerce giant installed the equipment in its vehicles earlier this year. The devices watch the road and the driver, and send out audio alerts if they don't like the way their humans are …

  1. Cybersaber

    Too soon

    I'm struggling with this whole 'safety monitor' concept. Work with me here:

    It's day 32 of this initiative; I've been sitting behind the wheel for roughly 300 hours cumulatively, and the thing hasn't malfunctioned yet. But today, a drunk driver loses control in front of me and swerves into my lane, and is about to cause a collision. My attention is all over the place. Maybe I'm bored to death and that boredom is making my attention wander, maybe not.

    Let's start a time loop, and ask which of these scenarios might happen:

    Iteration 1:

    The safety driver has been instructed to let the machine drive, and intervene if it doesn't handle it appropriately.

    T + 4.03 seconds, the driver has decided that the machine isn't reacting appropriately

    T + 4.3 seconds, the driver tries to react but it's too late to prevent a tragic accident.

    Iteration 2:

    T + 1 second, the driver sees the drunk is about to do something stupid and decides his life is more important than AI training.

    T + 1.23 seconds, the driver reacts and prevents the accident, but now the AI hasn't actually proven whether it would have recognized the threat and responded appropriately, thus defeating the point of the initiative.

    Iteration 3:

    Like iteration 1, but the driver was bored to tears because no mortal human can maintain that level of attention for hundreds of hours behind a wheel they're not even steering, so he never sees the drunk coming and tragedy ensues.

    This is merely one of several major categories of problems with AI drivers. If they're not already competent to drive without human minders, they shouldn't be on the road.

    1. oiseau
      WTF?

      Re: Too soon

      I'm struggling with this whole 'safety monitor' concept.

      Hmm ...

      I'm struggling to understand just why federal regulatory bodies allow this.

      Wait ...

      Yes, now I get it.

      Maybe it's really not regulatory body stuff ...

      Bozos now not only has to recover the cost of his divorce.

      He also has to recover (and make a profit) from his New Shepard joyride, which must have also cost him a bundle.

      How can he do that without gouging his employees just a bit more than usual?

      ie: first employing them via contractors to avoid unionisation and then screwing them over for good measure.

      Malignant asshole.

      O.

    2. Youngone Silver badge

      Re: Too soon

      You've misunderstood the Amazon story. This is not an AI driving the van; it is a camera system monitoring the driver so that Amazon can refuse to pay the driver what she or he is owed.

      It is Orwellian, true, but in a different way.

      1. W.S.Gosset

        Re: Too soon

        You've misunderstood his comment. He's referring to the final sub-story, not the first.

    3. doublelayer Silver badge

      Re: Too soon

      So, in your view, once they've had the car driving without people at their testing location, tried various ways of making it crash, and it's working, what is the next step? I realize that your options are possible, but the only options I see are to try this, ban self-driving vehicles permanently, or allow them without testing. Which one is your plan?

      As for the boredom possibility, I think it can be solved by reducing the hours that the testers are driving the cars. If they've worked 300 hours by day 32, that's a very long day, over nine hours behind the wheel, even assuming it's 32 days of working rather than 32 calendar days. Testing for two hours per day and swapping people out to do something else means they're less likely to be bored and faster on reaction time.

      1. Cybersaber

        Re: Too soon

        Or make special AI-only lanes to control for the amount of human unpredictability. That's how industrial robots have traditionally worked - keep the space around them controlled.

        Re the amount of hours, I was thinking of the hours I imagined real teamsters driving cargo work per day. You're right that for testing they could reduce the schedule, but then they'd not be testing against real-world conditions.

        (Disclosure: I am not a teamster and have never driven a commercial vehicle on cargo routes, so I might be off in my assumptions.)

        1. doublelayer Silver badge

          Re: Too soon

          As for time limits, testers definitely shouldn't work as long as drivers because they're doing different jobs. The testers are there to make sure that, if it turns out the cars are not safe enough to drive on their own, nobody gets hurt, or as few people as possible. If they're successful, they eventually leave and the cars drive alone with more public confidence in their safety. That takes a slightly different skill set than a normal driver who already knows that they have to control everything all the way through.

          The AI-specific lanes would work, but I don't think that's going to happen. It's too expensive to install those in most places.

    4. ecofeco Silver badge

      Re: Too soon

      There is a flaw in your argument.

      It's not 4 seconds; it's more like two seconds.

      So the scenario is: drunk driver makes a bad move. Second two, crash.

      1. Cybersaber

        Re: Too soon

        If a professional trucker hasn't already noticed a drunk driver and begun to plan defensively two seconds beforehand, well, that's not a display of great driving skill.

        I was trying to think of a fair scenario - if it was a totally out-of-the-blue threat, then it changes my post somewhat. I was envisioning a situation where an alert driver could have marked a dangerous vehicle and be actively watching how it developed.

        For an OH **** moment, I mean the driver would already be reacting to save lives and property, and we'd have something like scenario #2 above. The whole point is testing the safety of the systems, so I tried to think up a scenario that theoretically has the driver alert and ready to react, but still tests the machine to see how it responds to an emergency.

    5. Flocke Kroes Silver badge

      Re: Too soon

      There is a way to train an AI driver to handle rare and dangerous situations. Pick one of the ghost towns that was bought up specifically to train AI drivers. Replace the safety driver with a crash test dummy. Replace the drunk driver with a professional stunt driver. Stage near misses repeatedly until the AI can handle them.

      That only deals with one class of problem. Next up is pot-holes. I know where they are on my most common journeys. I expect on-coming vehicles in the middle of a narrow road where there is a pot-hole on the other side. After rain I still know which puddles hide pot-holes big enough to wreck a slow-moving car. There is no way I would let an AI loose on roads that are not in excellent condition.

      1. hoola Silver badge

        Re: Too soon

        This entire self-driving thing is being driven by profit and the handful of geeks who insist on having every latest gadget.

        Ask yourself why all the companies that are heavily involved with driverless vehicles are investing all this money.

        Is it to "make roads safer"?

        Is it to be able to replace expensive drivers with cheap "monitors"?

        Is it so that in the end there will be no human in the vehicle?

        The only beneficiaries of these sorts of things are the big corporations that push it.

        If Amazon can do away with all their drivers, just look at all the extra profit. They already take the mickey on tax, so if you end up with big tech forcing ever more people into unemployment or the dregs of the job chain, and at the same time paying less tax, what happens?

        Eventually society breaks down; the trouble is that the likes of Bezos and Musk just don't give a toss.

        The next wave of automation is going to do untold damage to society because so many jobs will go with nothing to replace them. Then the only option is for governments to pick up the social welfare tab on ever-decreasing tax revenue. People argue that it has happened before; yes it has, but the alternative jobs are not there now.

        1. doublelayer Silver badge

          Re: Too soon

          It's being pushed by people who build such things and hope to sell them, just like every other product. That's what companies do, and this isn't unusual. In addition, though the companies want it to succeed for their own profit, those things you mentioned are actually possible benefits we would get from having it done.

          As for "it has happened before", yes it has and it was really the same. You could make arguments like this for literally any technological advancement and people did. That is what progress looks like, and pain is inevitable in it. When computers automated lots of administrative actions, some people lost their jobs. Yet we still benefited quite a lot from them and, since you're posting here, I suspect you have benefited more than most. Whether the drivers of delivery vehicles lose their jobs to self-driving vehicles, more train transport, or drones, it's going to happen if the technology is efficient enough. Instead of trying to hold it back in the hopes that nothing changes and we don't have to care about the negatives of the current situation, we should plan for what we're going to do when progress happens.

        2. Mike 137 Silver badge

          Re: Too soon

          "Is it so that in the end there will be no human in the vehicle?"

          Welcome, the riderless motorbike.

      2. Cybersaber

        Re: Too soon

        The problem is that real life is messy. You can't think up and simulate every kind of disaster on the road in a ghost town. Common, easy stuff like bad drivers, sure. What about fires on the roadside? What about birds or insects obscuring vision?

        There are just too many scenarios to test for. The other recently posted article about the police crash lawsuit highlights that. Emergency vehicle lights are uncommon, but not all that rare, and even they defeated Tesla's Autopilot.

        They don't have to use ghost towns either. Have the same AI system set up, but in reverse. Have human drivers work in pairs on the roads - one drives, and the AI observes his actions and habits in response to unpredictable events. The partner would be responsible for 'markup' of the recording (a rough sketch of what such a record could look like follows below), along the lines of:

        0:18 - Nothing happened, but that's *because* the driver recognized this basketball bouncing across the road meant that there was likely a kid about to run after it and he stopped.

        0:20 - This is the kid that would have been flattened if the driver hadn't stopped.

        1:35 - Driver slowed in response to seeing another car weaving across the lines and having trouble with lane keeping

        1:43 - Driver changed lanes twice to get around the drunk and prevent him from being a danger.

        3:22 - Driver was unable to see lane markings due to faded white lines and low angle of the sun washing out vision.

        etc.

        etc.

        Do that on those same Dallas highways they're putting those cargo trucks on - the ones I drive all the time - and you'll have LOTS of data points about bad drivers and hazardous conditions.
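
        To make that concrete - and this is purely an invented format, not anything Amazon or anyone else actually uses, with every field name below made up - each markup entry only needs a timestamp, what the observer saw developing, and what the driver did about it. A minimal sketch in Python:

            # Hypothetical annotation record for the 'human drives, AI observes' idea above.
            # The field names and CSV layout are invented for illustration only.
            import csv
            from dataclasses import dataclass

            @dataclass
            class Annotation:
                timestamp_s: float    # offset into the recording, in seconds
                hazard: str           # what the observer saw developing
                driver_action: str    # what the driver did (or deliberately didn't do)

            log = [
                Annotation(18.0, "basketball bounced across road", "stopped: a kid was likely to follow it"),
                Annotation(95.0, "car weaving across lane markings", "slowed, then changed lanes to get clear"),
                Annotation(202.0, "lane markings faded, low sun washing out vision", "slowed and used the kerb line as a reference"),
            ]

            # Write the annotations out so they can sit alongside the video for later review or training.
            with open("route_annotations.csv", "w", newline="") as f:
                writer = csv.writer(f)
                writer.writerow(["timestamp_s", "hazard", "driver_action"])
                for a in log:
                    writer.writerow([a.timestamp_s, a.hazard, a.driver_action])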

    6. O RLY

      Re: Too soon

      A version of Iteration 3 has already happened in real life: in Arizona in 2018, an Uber autonomous car killed a pedestrian while the human operator was not paying attention. El Reg reported the human may have been streaming TV:

      https://www.theregister.com/2018/06/22/uber_fatal_crash_driver_distracted_police_report/

      Crash story:

      https://www.theregister.com/2018/03/19/uber_self_driving_car_fatal_crash/

      Prosecutors charged the human, not Uber, with homicide, but the case appears to be unresolved:

      https://www.theregister.com/2020/09/16/self_driving_uber_homicide_charge/

      https://www.phoenixnewtimes.com/news/uber-self-driving-crash-arizona-vasquez-wrongfully-charged-motion-11583771

      1. Mike 137 Silver badge

        Re: Too soon

        I just submitted a response to the DCMS Digital Regulation consultation, and in the section on AI and autonomous vehicles I stated:

        "Where lives and livelihoods are at stake, even in the name of progress the public can not legitimately be considered an involuntary test bed for systems development".

        Anyone disagree?

  2. Eclectic Man Silver badge
    Unhappy

    AI driving assessment

    Denying bonus payments to drivers who are 'cut up' by other drivers seems harsh. A lot of vehicles are fitted with dash cams nowadays, though, so there ought to be an effective appeals process. Of course, if the car cutting them up is a Tesla on autopilot ...

    I remember riding a motorbike around Leeds when I was a student. The number of times I was cut up by careless car and van drivers was scary. 'Sorry mate, I didn't see you.' Really? It is broad daylight, I am wearing a bright yellow reflective tabard and my headlamp is on. How much more visible do I have to be? Eventually I gave up, as I considered that other road users made it too dangerous.

    1. Snowy Silver badge
      WTF?

      Re: AI driving assessment

      You cannot see what you're not looking for; they did not see you because they were not looking for you.

      1. Eclectic Man Silver badge

        Re: AI driving assessment

        Oh, you mean like how many times the players in white passed the basketball?

        https://www.youtube.com/watch?v=vJG698U2Mvo

        1. Falmari Silver badge

          Re: AI driving assessment

          @Eclectic Man Wow, that's brilliant. I was chuffed I got 15 right, then they mentioned the gorilla! Never saw it!

        2. yetanotheraoc Silver badge

          Re: AI driving assessment

          That basketball experiment was the first thing I thought of when Snowy commented, and Falmari confirms it - I don't even need to click on your link. I upvoted your earlier "gave up" comment. Equally, I gave up bicycling on the roads. The drivers are just too horrible; if they are not looking at their phones then they are looking at the kids in the back seat. The suggestion above about training AI/ML using human drivers assumes, I guess, professional drivers. Using average human drivers would just train the AI/ML to run over motorcycles, bicycles, pedestrians, first responders, potholes, etc.

    2. The Man Who Fell To Earth Silver badge
      FAIL

      Re: AI driving assessment

      "So there ought to be an effective appeals process."

      It's Amazon. Appeal is futile.

      Ever try to deal with them outside of their script, like you ordered one of something but 4 showed up? They have no mechanism to return the 3 you don't want. None. Nada. Because their algorithms say it can't happen. Even when it does.

      I'm the one with $200 of Amazon stuff I didn't order, wasn't charged for, and can't return because there isn't a mechanism to even report to them that it happened. (If they had only sent two when I ordered one, I could have returned one for credit and kept the other for free.)

      1. Not Yb Bronze badge

        Re: AI driving assessment

        This happened to me a few times, though not with anything more than about $30. Called them up to find out what happened, and someone I've never heard of had sent the stuff to me as a gift.

        One was a clearly unlicensed Harry Potter wax seal... the other a voice changer with headset line-in inputs. Random, and nothing was charged to my credit cards either.

      2. elsergiovolador Silver badge

        Re: AI driving assessment

        You are probably the victim of a "brushing" scam. Sellers create Amazon accounts using your personal details and then buy their own products in order to be able to review them.

        Indeed, Amazon is useless when it comes to reporting this. My suggestion would be to file a police report, but they will likely tell you to go to Amazon...

      3. Eclectic Man Silver badge

        Re: AI driving assessment

        In the UK*, if someone sends you an unsolicited item and you tell them, they have six weeks to arrange return/collection or it becomes yours. If you don't tell them, you have to wait six months before it is legally yours*. Generally, for small items the cost of return and processing is so much that it is uneconomic.

        *Please check with your local Citizen's Advice Bureau before taking the word of an internet commentator/troll on any matter of law.

    3. ThatOne Silver badge
      Devil

      Re: AI driving assessment

      > The denying of bonus payments to drivers who are 'cut up' by other drivers seems harsh

      Why, that was the whole point of installing cameras: Reducing bonus payments using the good old "computer says".

  3. Snowy Silver badge
    Facepalm

    10 year plan

    Is it 10 years so that by the time it fails you're announcing a new plan? That makes it two times better than a 5-year plan.

  4. elsergiovolador Silver badge

    Salesmen

    To me it sounds like someone tricked the managers and sold them the dopamine-inducing idea of total control of their staff.

    It probably went like this: "If you give us part of your budget, we will create a system that will discipline all those lazy drivers and make sure they work like robots. You will be the prince of the Amazon and Jeff himself will give you a ticket to space." - all sprinkled with funnels, AI, machine learning, control, efficiency, growth, dashboards and more control.

    You must have an evil mind and zero morals to do something like this.

    1. Chris G

      Re: Salesmen

      I think Bozos already has the evil mind; Amazon couldn't be as bad as it is with employees if the culture wasn't okay with the C-suite.

      Denying bonuses helps to maximise profit and every cent helps.

      1. Neil Barnes Silver badge

        Re: Salesmen

        MBA thinking. Everything that is being counted is being controlled, and only the bottom line matters.

  5. Anonymous Coward
    Anonymous Coward

    If the AI was any good...

    ...It'd be driving the van.

  6. Anonymous Coward
    Anonymous Coward

    Amazon

    21st century version of slavery?

    What next, charging the drivers for the fuel if they’re forced to detour?

  7. ecofeco Silver badge

    Quit

    Seriously. Quit and find another job. Any job. This is micromanaging bollocks designed to steal money from the drivers. And damned dangerous as well. NOBODY needs a backseat driver.

    1. marcellothearcane
      Unhappy

      Re: Quit

      All too often, that's not an economically sound option.

      A lot of the drivers are not financially stable enough to go without a job, and you can be sure that Amazon work you hard enough and for long enough hours to prevent you being able to find other opportunities.

      And even then, I'd expect drivers to consider the devil they know to be better than the devil they don't.

    2. Sherrie Ludwig

      Re: Quit

      That is what has been happening in the USA. Every business has "we're hiring!" signs out, and people aren't applying. On the other hand, I can get my hair cut and styled, my yard work done, help cleaning out my garage, light housekeeping, my tires changed, etc., all by local jobbers who accept cash. People are working, just not on payrolls.

  8. Coastal cutie

    Mirror, signal, manoeuvre

    I presume by side mirror El Reg means wing mirror - so if I understand this correctly, a driver checks their mirrors to make sure they don't squash something overtaking, or a foolhardy vehicle/cyclist/pedestrian coming up on the inside just as they are about to turn right (this is the USA, remember), and gets penalised for it? What are they supposed to do, stare rigidly straight ahead and stuff everything else around them?

    1. hoola Silver badge

      Re: Mirror, signal, manoeuvre

      Presumably yes, because that is what the chimps that set up the entire solution are expecting.

      1. Cybersaber

        Re: Mirror, signal, manoeuvre

        Sounds like a common agile failing:

        Client story: Drivers are being found inattentive on the road, causing accidents.

        Client Requirement: Write software that detects when the driver looks away from the road such as at the radio or a phone. Send a signal if the driver looks away (so that we can penalize them for inattentiveness)

        Agile developer does exactly what is required in the specification, no more, no less.

        The people asking for the change are trying to solve one problem; the dev is doing the agile thing. Everyone is doing the 'right thing' from their narrow point of view, but bad things result. (A toy sketch of the difference is below.)
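
        Purely to illustrate the point - a hypothetical sketch, nothing to do with whatever Amazon's system actually does, and every name and threshold below is invented - implemented to the letter, the requirement collapses to something like the first function here, which flags a mirror check exactly the same as a phone glance. The second adds the exception nobody put in the spec.

            # Hypothetical illustration of the 'literal spec' failing -- not real code from any vendor.
            MIRROR_ZONES = {"left_mirror", "right_mirror", "rear_view_mirror"}   # assumed gaze labels
            PENALTY_THRESHOLD_S = 2.0                                            # made-up limit

            def flag_inattention_literal(gaze_events):
                """Spec as written: flag any look away from the road longer than the threshold."""
                return [e for e in gaze_events
                        if e["zone"] != "road" and e["duration_s"] > PENALTY_THRESHOLD_S]

            def flag_inattention_sane(gaze_events):
                """Same rule, but a mirror check doesn't count as inattention."""
                return [e for e in gaze_events
                        if e["zone"] not in MIRROR_ZONES | {"road"}
                        and e["duration_s"] > PENALTY_THRESHOLD_S]

            # A driver doing exactly what defensive driving asks of them:
            events = [
                {"zone": "road", "duration_s": 30.0},
                {"zone": "left_mirror", "duration_s": 2.5},   # checking before a turn
                {"zone": "phone", "duration_s": 4.0},         # actual inattention
            ]

            print(len(flag_inattention_literal(events)))  # 2 -- the mirror check gets penalised too
            print(len(flag_inattention_sane(events)))     # 1 -- only the phone glance

        The gap between those two functions is exactly where the driver's bonus goes.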

  9. This post has been deleted by its author

  10. Anonymous Coward
    Anonymous Coward

    Crap AI

    We've got these things in our work vehicles.

    A few months ago I got called into the manager's office and accused of driving without a seatbelt. The spy system was guessing based on what the camera showed and ignoring the in-vehicle seatbelt detector.

    I saw the video, and at the time I was driving into the sunset on a cloudless day. The images were overexposed and you couldn't see any details. The image quality was so bad it showed the seats as being a pinky sort of colour when they're actually a dark grey. The manager insisted the AI was right.

    I contacted the spy company about the incident; they agreed with me and said that "feature" was still in beta testing and shouldn't be counted on.

    I went back to the manager with this info and was told ... it's not important ... drop it ... What an asshole: if it's not important, then why did he bring it up (in an unfriendly way)? It was then that I decided to leave the company at my first opportunity.
