Transport watchdog's patience wears thin as Tesla Autopilot remedies may not be enough

The US National Highway Traffic Safety Administration (NHTSA) has written to Tesla as the automaker's electric cars keep crashing despite a recall to fix problems with the Autopilot software. The NHTSA opened a recall query to investigate the remedy's effectiveness for the earlier recall in December, which affected more than …

  1. BartyFartsLast Silver badge

    Me? A cynic?

    A cynical person might believe Tesla are full of crap and Musk is a snake oil salesman who lurches from crisis to crisis with ever more outlandish, hyperbolic claims.

    1. tony72

      Re: Me? A cynic?

      Teslas are objectively the safest cars on the planet, as measured by safety agencies around the world, including Euro-NCAP and IIHS. Now even the safest cars are going to have some percentage of accidents, and as there are a lot of Teslas on the road, that very small percentage still adds up to a noticeable absolute number. Of course those accidents need to be investigated, and even the safest car on the planet can still be improved, but the existence of that handful of accidents doesn't invalidate the objective facts of how safe Teslas are.

      1. Fruit and Nutcase Silver badge
        Alert

        Re: Me? A cynic?

        Teslas are objectively the safest cars on the planet...

        A similar view has been expressed in the case of the "Smart Motorways" in England...

        National Highways - the agency in charge of smart motorways - says the latest data shows that "smart motorways are our safest roads".

        https://www.bbc.co.uk/news/av/uk-68865188

        They opened up the hard shoulder/breakdown/emergency lane of what were the safest roads in the country and put technology in to alert drivers of obstructed lanes - yep, that's going to work

        1. LenG

          Re: Me? A cynic?

          I live near one of these aberrations. They are "safe" due to the well known effect that if you make something sufficiently and obviously dangerous, people actually take more care.

          1. Fruit and Nutcase Silver badge
            Alert

            Re: Me? A cynic?

            if you make something sufficiently and obviously dangerous people actually take more care.

            There was an episode of Top Gear a very long time ago where Clarkson said words to the effect of: the way to get people to drive slowly was to put a great big spike on the steering wheel where the airbag is - which would force the vehicle to be driven very slowly in order for the driver to avoid getting their head impaled on it.

            1. veti Silver badge

              Re: Me? A cynic?

              I had a teacher at school who said that, long before Clarkson came to TV.

            2. Bebu
              Windows

              Re: Me? A cynic?

              the driver to avoid their head getting impaled on it

              And that from he who seemed to have a problem with the idea of being shot in the face.

      2. Andy 73 Silver badge

        Re: Me? A cynic?

        The issue at hand seems not to be the safety record, but the failure mode. It's great that your car doesn't often hit people, but if, when it does hit people, it tends to kill them, then that's still a problem.

        It's even more of a problem if these are actually avoidable accidents and there is a critical flaw in the safety systems that should be able to prevent them.

        It's even more of a problem if your CEO has been claiming that those cars can drive themselves autonomously without harm or damage. This is a (fairly crucial) test of his ability to meet the requirements of responsible corporate behaviour when flaws clearly still exist in the vehicles.

        1. cyberdemon Silver badge
          Stop

          Re: Me? A cynic?

          One way to avoid this would be to stop making enormous cars that weigh upwards of a ton..

          Every car these days looks as if it has been inflated like a balloon compared to previous models..

          There's a kind of 'arms race' mentality in car safety - You the driver are safer if your car is bigger and heavier than the car/person you collided with.

          And in America there's an even bigger issue with the legislation: Apparently most SUVs are considered to be "light trucks", and therefore get around much of the safety legislation designed to protect pedestrians from cars.

          1. veti Silver badge

            Re: Me? A cynic?

            I heard that cars are getting wider, on average, at a rate of about half an inch per year. Which is why they'll no longer fit in the garages or parking spaces of last century.

            Someone needs to put a hard limit on that, unless we want to be condemned to widen everything every couple of decades, forever.

            1. Fruit and Nutcase Silver badge

              Re: Me? A cynic?

              I heard that cars are getting wider,

              Occupants are getting wider too

      3. aerogems Silver badge

        Re: Me? A cynic?

        The number of cars Tesla has sold in its entire span of existence compared to just the number of Honda Civics or Toyota Corollas sold in the same time period is probably barely a rounding error.

      4. heyrick Silver badge

        Re: Me? A cynic?

        "Teslas are objectively the safest cars on the planet"

        Downvote because you're conflating two entirely different things.

        Quote from elsewhere on the web: The Model 3 is rated with the best possible crash rating from NHTSA in the USA - and the IIHS said it was the safest car in the world.

        So what this is saying is that the model 3 is the car you're most likely to be able to walk away from if you are unfortunate enough to be involved in a crash. In that way, perhaps it is indeed a very safe vehicle, for the occupants.

        However driving assistance with a tendency to go awry and kill other people... does not in any way make it a safe car. Quite the opposite. Usually, with cars, accidents that lead to fatalities are due to mechanical problems or something the driver messed up (or both, if they neglected servicing). In this particular case, if the intelligence goes wrong then it's the car itself, and it doesn't matter one bit how wonderfully safe the car is for the people inside if it ends up inadvertently killing people outside.

        The reference to Musk being a snake oil salesman is because while all this autopilot stuff is impressive, it's maybe not yet ready for public roads and it needs to be hammered into people's heads that it is a driver aid, not a driver replacement. Which means, for the love of Christ, stop calling it stuff like "Autopilot" and "Full Self Driving", because - at the current state of technology - that's still an outlandish claim.

        1. aerogems Silver badge
          Joke

          Re: Me? A cynic?

          People think that by buying a safe car it excuses them the responsibility of learning how to drive the fucking thing! First you learn to drive, then you buy your safe car!

          -- George Carlin

        2. DancesWithPoultry
          Facepalm

          Re: Me? A cynic?

          > However driving assistance with a tendency to go awry and kill other people.

          Lane keeping assist......

          The technology that, when steering out to pass a cyclist on an otherwise empty country road, tried to swerve me into the poor bugger.

          1. David Hicklin Bronze badge

            Re: Me? A cynic?

            >> Lane keeping assist......

            That encourages the use of indicators!

            1. DancesWithPoultry
              Megaphone

              Re: Me? A cynic?

              On an "otherwise empty road".

              Who should I indicate to? The squirrels in the trees?

              Regardless, lane keeping assist is a menace on narrow country roads where one routinely needs to cross the centre line..... and turning it off involves pissing about with a touch-screen menu every time you start the car ('cos it won't stay switched off).

      5. Kevin McMurtrie Silver badge

        Re: Me? A cynic?

        Tesla drivers are the worst. They bought a Tesla because they can't/won't drive safely, and are counting on Autopilot to do all the work. Musky says it works and all the advertising says it's amazing. These people get on the road, engage Autopilot, and stop paying attention to driving because Autopilot doesn't require them to. Now you have Teslas crashing into everything because Autopilot is a sham.

        1. aerogems Silver badge
          Facepalm

          Re: Me? A cynic?

          Damn. I had to take care of something else before I could finish my above post before you posted yours. It would have fit much better under yours. Oh well.

        2. Bebu
          Windows

          Re: Me? A cynic?

          Tesla drivers are the worst. They bought a Tesla because they can't/won't drive safely

          Volvo drivers relegated, then? "Lights are on but no one is home" was said of Volvo drivers, at least in AU. :)

      6. DS999 Silver badge
        Flame

        Teslas are among the LEAST safe

        https://www.thedrive.com/news/tesla-drivers-have-the-highest-crash-rate-of-any-brand-study

        Read and weep, fanboy

        1. aerogems Silver badge
          Holmes

          Re: Teslas are among the LEAST safe

          To be fair, you're talking about two different things.

          OP is talking about what happens if you get into a crash. You're talking about how shitty drivers buy Teslas.

          1. DS999 Silver badge

            Re: Teslas are among the LEAST safe

            No he's not, he said "the existence of that handful of accidents doesn't invalidate the objective facts of how safe Teslas are"

            Now sure, since Teslas like most EVs are heavier than an equivalent ICE vehicle they are relatively safer for the occupant - same reason moms like to buy oversized SUVs in the US to protect their children at the expense of the children in the sedan they run into.

            Having the highest accident rate is almost certainly because of Musk's marketing around Autopilot/FSD making owners believe it is more capable than it really is. The article from the link I posted made that clear with the large number of accidents resulting from "autopilot disengaging" (i.e. it was being misused by someone not paying attention who was not ready to take control when it encountered a situation it could not handle)

    2. Lee D Silver badge

      Re: Me? A cynic?

      He's a rather mediocre techy guy acting as the absolute worst salesman in the world, masquerading as his idea of a "cool guy", propped up only by bankrolling otherwise-failing companies (Tesla was near bankruptcy several times) based on a MERGER with PayPal decades ago that made him an accidental billionaire.

      I have never found any redeeming feature in the man.

    3. hoola Silver badge

      Re: Me? A cynic?

      How about just removing the certifications from the cars so they cannot be sold in the normal way.

      They become "kit cars" (Q Plates in the UK)

      Or just disable the entire steaming pile of shite that is "Autopilot".

      There really does come a point when these tech companies (and Tesla is a tech company) need to understand that responsibility actually does exist.

      If these issues were with an established car manufacturer this would not be happening.

      I would also go one step further and mandate that any critical control has to be a manual switch with tactile feedback. I am sick of all this touch screen shite where you cannot adjust where the blower is blowing because you have to press a touch screen that requires LOOKING AT.

      Maybe I am just a grumpy old fart......

      1. Paul 195

        Re: Me? A cynic?

        Touchscreen controls are the worst. I've had several rentals where the heating/cooling controls were only accessible by touch screen, which is not where you want them when you are hurtling down the autobahn and you've realised that you need more heating. They're a nasty cost-saving measure and objectively less safe than old fashioned knobs, switches and levers which you can work by touch alone.

        None of which stopped another Tesla fanboy on a social media thread I was in from insisting that anyone who objected to them was a luddite who didn't understand technology.

        I've got Tesla's new marketing slogan:

        I came for the car, but I stayed for the cult.

        1. David Hicklin Bronze badge

          Re: Touchscreens

          A recent article I read in a magazine had this little snippet:

          "Euro NCAP has finally recognised that confusing and complicated touchscreens can cause drivers to take their eyes off the road for too long so, if car makers want to earn a 5-star rating, then from 2026 some key functions will have to be controlled by classic switches and knobs instead."

          Sounds like the Tesla will need a redesign

        2. Anonymous Coward
          Anonymous Coward

          Re: Me? A cynic?

          "I came for the car, but I stayed for the cult"

          Typo, shouldn't that be an N in the last word?

          1. Paul 195

            Re: Me? A cynic?

            I had a little bet with myself that someone would suggest an alternative spelling for cu*t

  2. aerogems Silver badge
    Holmes

    Who knew that getting rid of LiDAR in favor of much less reliable optical scanning would have such a deleterious effect on safety? Aside from anyone who has at least an average level of deductive reasoning skills that is.

    1. MachDiamond Silver badge

      "Who knew that getting rid of LiDAR in favor of much less reliable optical scanning would have such a deleterious effect on safety?"

      I don't think that Tesla vehicles ever had LiDAR. They did have RADAR, but that was discontinued a while ago and was even removed from the car in some cases when people brought their car in for service.

      Since the Wizard never gave Elon a brain, he doesn't understand how they work and why humans rely on vision primarily and can get away with it. We still use our other senses to feel out our environment. If somebody enters a room we are in but isn't in our line of sight, we can sense their presence and infer who they are through hearing and even smell. Even if we don't hear them, we might feel them if they block air movement. There was a study on hearing where people put on headphones and blindfolds and a "hole" was moved through a sonic environment. It was very creepy and I have to wonder why that hasn't been tried in a scary movie. It also points up that we don't understand precisely all of the ways we comprehend our environment.

      1. aerogems Silver badge

        I'm reasonably sure it was LiDAR, but I also don't really give enough of a shit to look it up because it doesn't change the fact that they had something superior to optical cameras and made a conscious choice to move away from it.

        1. Jan 0 Silver badge
          Headmaster

          LIDAR

          In what way isn't LIDAR optical scanning?

          RADAR is scanning with lower frequency "radio" emissions.

      2. Bebu
        Windows

        Toto, I've a feeling we're not in Kansas anymore

        Since the Wizard never gave Elon a brain - I reckon the wiz should hand in his wand, but then I recall he was actually just a castaway shonky snake oil merchant. Anyway, his line in brains was probably the charcutier's rejects. As the Scarecrow, Musk must be the ultimate straw man and as such you would think he might desist from playing with fire.

        he [Musk] doesn't understand how they [brains] work and why humans rely on vision primarily and can get away with it.

        Musk and the legions of his betters that aren't familiar with neuro- or cognitive science are not alone in this. That includes most of the current AI circus.

        The image the retina captures and sends to various parts of the brain is pretty rough - The difference between that image and what we "see" both verges on miraculous and is terrifying at the same time.

        I don't know about anyone else but driving a vehicle is easily the task that for me requires the greatest concentration and thought and I have over 50 years of practice.

        Is any of Musk's FSD wet dreams ever going to detect a child's ball rolling onto the road from between parked cars and then slow down, preparing for emergency braking?

        Think of what is involved here - construction of a plausible model of the immediate future based on the unconscious or intuitive understanding of the human world - in a word imagination.

        1. Paul 195

          Re: Toto, I've a feeling we're not in Kansas anymore

          You've absolutely put your finger on the problem with self-driving software; it can't imagine or predict. A good many years ago on a country road I saw what looked like a couple of fireflies up ahead (it was a pitch-black night, no street-lighting or moonlight). The weird cyclic motion confused me, though, until I came up with the accurate hypothesis that I could see the reflectors on the pedals of a bicycle being ridden with no lights. I slowed down and was able to pass said cyclist safely. Presumably an FSD Tesla would have ploughed into him at 50 mph.

          And then legions of fan boys would have said the cyclist deserved it for not having any lights. Well, doubtless the cyclist has some responsibility for his own safety, but equally car owners have a responsibility not to trust control of their death machine to a half-witted piece of software demonstrably not up to the task.

          1. CrazyOldCatMan Silver badge

            Re: Toto, I've a feeling we're not in Kansas anymore

            A good many years ago on a country road I saw what looked like a couple of fireflies up ahead

            I've had a similar experience riding my motorbike at night - saw what looked like a motorbike coming towards me, so shifted left to ensure that we would pass safely (small country road, no streetlamps). Realised at the last minute that what I had assumed was a bike was in fact an older car (round headlights!) with only the nearside headlight working..

            Fortunately I still had enough time (closing speed was only about 30mph) to get right to the side of the road so that the mono-lamp car could go past. It didn't even move aside so I suspect that, despite my headlamp working properly, the driver hadn't even seen me. Given the time of night, I suspect the driver was on the way home from the pub.

  3. usbac

    The problem is that none of these enhancements to driver attention monitoring solves the real problem with this level of driver assistance (not actual "auto pilot", despite what Musk is trying to sell). The problem is that it takes time for the driver to react, and on roads and highways, that time just isn't there.

    In aircraft, autopilot works very well because there is much more time to react to an autopilot disengagement. You are typically not 1-2 seconds from colliding with something or someone when autopilot disengages. If you are, then you have big problems. Also, with aircraft, pilots are trained in human factors, and we understand how to handle the handover from automation to human control. Normal drivers are not trained for any of this, and reading a Tesla owner's manual is not proper training.
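    The arithmetic behind that "the time just isn't there" point is easy to sketch. The snippet below is a back-of-envelope illustration, not anything from the article: the 2-second takeover gap and the 7 m/s² deceleration figure are assumptions chosen for round numbers (handover studies often cite even longer gaps for a distracted driver).

    ```python
    # Back-of-envelope: distance covered during the human takeover gap
    # when a driver-assist system disengages, vs. the braking distance
    # that remains after the driver finally reacts.

    def takeover_distance_m(speed_kmh: float, reaction_s: float) -> float:
        """Metres travelled before the driver even begins to respond."""
        return speed_kmh / 3.6 * reaction_s

    def braking_distance_m(speed_kmh: float, decel_ms2: float = 7.0) -> float:
        """Metres needed to brake to a stop at a given deceleration
        (roughly 7 m/s^2 is a hard stop on dry tarmac)."""
        v = speed_kmh / 3.6
        return v * v / (2 * decel_ms2)

    for speed in (50, 100, 130):
        # 2 s is a generous assumption for an attentive driver;
        # real handover times can be considerably longer.
        gap = takeover_distance_m(speed, reaction_s=2.0)
        brake = braking_distance_m(speed)
        print(f"{speed} km/h: {gap:.0f} m lost to takeover, "
              f"{brake:.0f} m more to stop")
    ```

    At motorway speeds the car covers more ground during the takeover gap than it needs to brake to a stop - which is exactly why a 1-2 second warning is worthless.
    
    
    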

    1. aerogems Silver badge

      Air travel also has a dedicated ground crew working to coordinate traffic and will definitely be in touch if you start veering too close to anyone else.

      I sometimes wonder if we're doing autonomous driving wrong. Instead of every individual car trying to make independent decisions, maybe have a central system that acts similarly to air traffic control for airlines. Sensors in the road and car tell it where other cars are, how fast they're going, all that yummy telemetry, and then it can coordinate all the traffic on the roads. Then you don't have a case where cars with a SoC specced for having LiDAR sensors are suddenly expected to take on the additional processing required for optical only scanning, on top of everything else they manage. You just have one massive cluster where you can always add additional nodes if you need more processing power.
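      For what it's worth, the "air traffic control for cars" idea above can be sketched in a few lines. Everything here is invented for illustration - the telemetry fields, the 2-second headway rule, the single-lane simplification - but it shows the shape of a central coordinator that takes in position/speed reports and hands back target speeds:

      ```python
      # Toy sketch of a central coordinator for a single lane of traffic.
      # Each car reports telemetry; the coordinator returns a target speed
      # per car so that nobody closes inside a minimum time headway.
      from dataclasses import dataclass

      @dataclass
      class Telemetry:
          car_id: str
          position_m: float   # distance along the road
          speed_ms: float

      MIN_HEADWAY_S = 2.0     # assumed minimum time gap to the car ahead

      def assign_speeds(reports: list[Telemetry]) -> dict[str, float]:
          """Return a target speed per car, capped by the gap ahead."""
          ordered = sorted(reports, key=lambda t: t.position_m, reverse=True)
          targets: dict[str, float] = {}
          leader = None
          for car in ordered:
              if leader is None:
                  targets[car.car_id] = car.speed_ms  # front car keeps its speed
              else:
                  gap = leader.position_m - car.position_m
                  # Fastest speed at which the gap still spans MIN_HEADWAY_S
                  targets[car.car_id] = min(car.speed_ms, gap / MIN_HEADWAY_S)
              leader = car
          return targets

      demo = [
          Telemetry("A", position_m=200.0, speed_ms=30.0),
          Telemetry("B", position_m=150.0, speed_ms=33.0),  # closing fast
      ]
      print(assign_speeds(demo))  # B capped at 50 m / 2 s = 25 m/s
      ```

      The real problems, of course, are the ones raised in the replies below the fold: sensor maintenance, mixed human/automated traffic, and everything on the road that never sends telemetry at all.
      
      
      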

      1. Someone Else Silver badge

        You just have one massive cluster [...]

        But...but...but...isn't that the case now?

        1. aerogems Silver badge
          Trollface

          If you add a four letter word starting with the letter "f" after "cluster" above... then yes... yes it is.

      2. Anonymous Coward
        Anonymous Coward

        You just have one massive clusterfuck.

      3. MachDiamond Silver badge

        "Sensors in the road and car tell it where other cars are, how fast they're going,"

        Like that isn't going to go wrong. The Smart Motorways' sensors are often offline for days/weeks/months at a time, even though many don't require digging up the road to repair. If they can't be bothered to fix non-functional CCTV cameras watching the roads, how can it be expected that in-road sensors will be maintained?

        What you described is part of the spec for PRT (Personal Rapid Transit). The vehicles take care of operating themselves and keeping on the guideway while a central computer tells them how to dispatch, where to turn and how fast to go, but not exactly how to do those tasks. In a closed system, that can work nicely. Even a human driven service vehicle can be accommodated in the system.

        Before UltraGlobal wound down, they had plans that would allow private ownership of vehicles so somebody could drive to the city with the system and log into the computer for autonomous control. Parking, charging and pickup would all be taken care of. It made a lot of sense for dense downtowns.

        Not only could people be moved around, so could cargo. Vehicles could go into buildings so a load of dairy products being sent to a grocery store could stop in the warehouse area of the store, be unloaded, and the vehicle would be on its way while those items get moved quickly into the refrigerators/freezers. Passenger vehicles available to the public would not have access to those stops and portals could be secured. Employees of the store could be given a pass so they could be routed to the back room where they could get on and off in safety.

      4. katrinab Silver badge

        That would only work if every road user is automated, like, for example, on the Docklands Light Railway.

        1. Michael Wojcik Silver badge

          And even if they are, there are Really Bad Failure Modes if there's an equipment failure. That failure might be in the V2V communication, or it might be something purely mechanical like a part falling off or an axle shearing.

          Cars currently are not, broadly speaking, well-maintained. Many jurisdictions have periodic inspections to try to ensure a minimum level of upkeep, but that's not a perfect system, and other jurisdictions lack it. There's nothing currently stopping me from driving any old piece of junk from Michigan or New Mexico (where there are no inspections) to, say, Massachusetts (where there are).

      5. Paul 195

        The problem with this suggestion is that it assumes that all the self-driving vehicles only have to cope with other cars. But in the real world they need to cope with pedestrians, stray dogs, runaway prams etc. And the other problem with this suggestion is that lobbyists will argue that people, not cars, are the problem and we should have strict laws and segregation keeping them away from motor traffic. Which would make our cities even worse hellscapes than they already are.

        If that seems improbable, here is your reminder that the US has jay-walking laws, legislating against people being able to cross roads, because of lobbying by motor manufacturers.

    2. Anonymous Coward
      Anonymous Coward

      Simple answer: if it doesn't work, is not safe, or fails a risk assessment - recall and disable it until it can be evidenced to work.

    3. Paul 195

      An aviation expert will correct me if I'm wrong, but didn't some of the early Airbus planes lean too heavily into automation, and they had to dial it back because it caused the same problems as we see with Tesla software - it did so much that pilots disengaged, and then had problems resuming control when needed?

      Software like Tesla autopilot is guaranteed to be dangerous. It works most of the time so drivers disengage. And then when it says "hey, I don't know what to do, help me out human", the human hasn't been maintaining context and hasn't got time to rebuild it.

      1. David Hicklin Bronze badge

        >> early Airbus planes lean too heavily into automation and they had to dial it back

        It is a general problem (not just airbus) of "too much automation" where the pilots hardly ever fly the plane so when the automation does crap out their skills have eroded so much that they can't cope.

        1. Michael Wojcik Silver badge

          Air France 447 is one tragic example. The captain was sleeping after a long night; the other two crew members were inexperienced; the plane came out of autopilot after the computer lost airspeed information, and they kept trying to climb while the plane was stalling.

  4. Fruit and Nutcase Silver badge
    Alert

    Simple Test

    Musk stands in the path of a Tesla on "Autopilot" - let's start at 5 mph and then repeat at 5 mph increments. If and when the Tesla fails to detect Musk is in its path and runs him over, then that is the threshold speed the "Autopilot" can be certified for. Oh, and best have an ambulance/medevac helicopter on standby during the exercise

    1. usbac

      Re: Simple Test

      Except that the ambulance has to be driven solely by Tesla autopilot...

      1. aerogems Silver badge
        Coffee/keyboard

        Re: Simple Test

        Don't go giving him ideas! The last thing we need are Tesla ambulances and fire trucks on the roads.

        1. Someone Else Silver badge

          Re: Simple Test

          Hey, a Tesla fire truck isn't likely to run into the rear end of ... another fire truck.

          Right?

          1. MachDiamond Silver badge

            Re: Simple Test

            "Hey, a Tesla fire truck isn't likely to run into the rear end of ... another fire truck."

            IDK, the desire to mate is a strong instinct. What models will they be using to train these AI's?

            1. aerogems Silver badge
              Coat

              Re: Simple Test

              If Xitler didn't have some kind of thing about gear shifters he could have sold a dildo option. You know it would have sold well. And hey, no having to worry about your hand slipping off the shifter.

            2. CrazyOldCatMan Silver badge

              Re: Simple Test

              What models will they be using to train these AI's?

              Pornhub and 4Chan.

              A winning combination!

    2. aerogems Silver badge

      Re: Simple Test

      I like the general idea. Let the CEO literally stand behind the product - by standing in front of it. If he's not 100% confident that he'll be able to walk away completely unharmed, and willing to actually put his body on the line, then it's not ready to be sent out to punters.

      1. Noram

        Re: Simple Test

        And he has to do it with a randomly chosen example for every hardware revision of every model, under varying lighting conditions including with the car heading towards the sun when it's low.

        And then redo it with every software update that affects the system

        I like the idea of the snake oil salesman proving he trusts it, but I don't trust him to actually do it under conditions we know Teslas have issues with*, and with a vehicle that hasn't been specifically chosen for the task and checked over to make sure it's working correctly.

        *I seem to remember the Tesla cameras have issues with the sun being "wrong" and the tesla fans saying "well humans get dazzled as well", ignoring the fact that humans can adjust the position of their head/eyes and do things like drop the sunshade, and will typically slow down if dazzled.

        1. aerogems Silver badge

          Re: Simple Test

          *I seem to remember the Tesla cameras have issues with the sun being "wrong" and the tesla fans saying "well humans get dazzled as well", ignoring the fact that humans can adjust the position of their head/eyes and do things like drop the sunshade, and will typically slow down if dazzled.

          That's also part of the beauty of something like LiDAR. It doesn't get dazzled by the sun, so regardless of time of day, or other weather conditions*, if you're coming up on some other solid object, it can trigger a warning.

        * IIRC, LiDAR doesn't work so well in fog, but then neither do optical cameras or human eyeballs, and we're still talking about one use case where things are basically the same while all the other use cases are vastly improved.

      2. MachDiamond Silver badge

        Re: Simple Test

        "Let the CEO literally stand behind in front of the product."

        What? Like Jeff Bezos taking a flight in a Blue Origin Rocket with his brother and Wally Funk? Good luck with getting Elon to do something like that.

  5. Anonymous Coward
    Boffin

    Some statistics of electric car accidents

    Forbes Dec 2013: Tesla drivers had 23.54 accidents per 1,000 drivers. Ram (22.76) and Subaru (20.90) were the only other brands with more than 20 accidents per 1,000 drivers.

    1. Screepy

      Re: Some statistics of electric car accidents

      Interesting link, thanks.

      If you click through that link to the site where they got the data from there are some other interesting bits in there.

      RAM really not coming out looking very good - topping the charts in a lot of those results - more incidents (not accidents) than any other brand.

      BMW has the most drivers caught driving under the influence of alcohol.

      And, as you mentioned, Tesla, which are the most accident prone.

      Also, a small point but important, your subject line says it's electric cars, but that study covers both EV and ICE - which makes it a much more interesting dataset.

      1. MachDiamond Silver badge

        Re: Some statistics of electric car accidents

        "RAM really not coming out looking very good - topping the charts in a lot of those results"

        With a big, heavy vehicle, it does seem like you would see them in more accidents, as they are not nearly as maneuverable as a small compact car. They take longer to stop and have a higher center of gravity.

        I'm not surprised about the incidence of drink driving with BMW. The person bought the car thinking it would make them more desirable and interesting and instead just drained their bank account and the depreciation has driven them to drink. If I meet somebody, I really hope their dating me doesn't have anything to do with my car other than having one I can use to take them on dates. Have you seen what it costs for a pair of Rammstein concert tickets? If they would rather go see Wishbone Ash at the local club in the BMW..........Not that WA isn't a kickin' show.

      2. veti Silver badge

        Re: Some statistics of electric car accidents

        If you buy a car named RAM, your subconscious is bound to come to certain conclusions about how it should be driven.

        I mean, what's next? PILE? SMASH? MOW?

        1. Anonymous Coward
          Anonymous Coward

          Re: Some statistics of electric car accidents

          If you don't like the name RAM, then the car probably isn't for Ewe.... Tup be honest, I won't be flocking to buy one either. Given the price, I'd feel like I had been fleeced.

    2. veti Silver badge

      Re: Some statistics of electric car accidents

      I was going to point out that 2013 was a long time ago...

      ... but then I followed your link and saw the date stamp. You meant 2023.
