Criminal justice software code could send you to jail and there’s nothing you can do about it

American police and the judiciary are increasingly relying on software to catch, prosecute and sentence criminal suspects, but the code is untested, unavailable to suspects' defense teams, and in some cases provably biased. In a presentation at the DEF CON hacking conference in Las Vegas, delegates were given the example of …

  1. James 51
    Big Brother

    Sounds like responsibility laundering. It wasn't me what put that innocent person in jail, it was the software what did it. No, we won't be fixing it, that's the software company's job. And they just went into flat-pack bankruptcy and were bought by a nice Chinese firm with a line in social scoring.

  2. John G Imrie
    Big Brother

    Computer says ...

    if ($suspect{race} eq 'white') {
        $guilt -= 1;
    } else {
        $guilt += 1;
    }

  3. Anonymous Coward
    Anonymous Coward

    It's only a matter of time before they start targeting (Education? Segregation?) people based on an algorithm that determines how likely they are to commit a crime even when they haven't; it'll probably use social media data as well.

    1. monty75

      A handful of pre-cog psychics in a pool of water would work too

      1. earl grey
        Trollface

        A handful of pre-cog psychics in a pool of water would work too

        With a toaster.

    2. Anonymous Coward
      Anonymous Coward

      'It's only a matter of time before they start targeting... Segregation'

      You can bet Facebook / Google etc want in on this. 1st-round is matching / targeting ads. 2nd-round is hoovering up financial-transactions / patient-health info. 3rd-round is being involved in every transaction or event that has any kind of data aspect etc.

      Does your child get to have the surgery they need? Do they get a place on the college course they deserve? Do they get the job they long for? Do they get a loan to buy a dream home? Who do they get to date? Ask Zuckerberg, as we're all Suckerburgs now.

  4. Giovani Tapini
    Mushroom

    Trade secrets, pah

    The precise algorithms and weightings of parameters a trade secret?

    If it's not patentable, then it's probably only secret because it's obvious(ly flawed) and anyone with access to training data could have a go.

    The authorities should own the process IP to ensure it has some level of transparency, even to themselves. Leaving decisions to a corporation (recalling OCP for example) is outsourcing law enforcement to an unacceptable degree in my view.

    1. Herring`

      Re: Trade secrets, pah

      "(recalling OCP for example)"

      You have to wonder if they have their own Directive 4 in there somewhere.

      1. Androgynous Cupboard Silver badge

        Re: Trade secrets, pah

        What, "do not run across dance floors"?

    2. Mark 85 Silver badge

      Re: Trade secrets, pah

      Being a "trade secret" is a lot different than a patent. Trade secrets are generally very closely held by the company whereas a patent is published. For example, the recipe for Coca-Cola isn't patented. If it had been, the recipe would be in the wild and anyone could make the product.

    3. Anonymous Coward
      Anonymous Coward

      Re: Trade secrets, pah

      > If its not patentable, then its probably only secret because its obvious(ly flawed) and anyone with access to training data could have a go.

      If it's the current fashionable flavour of AI - i.e. neural network machine learning - then there is no algorithm to disclose.

      It's just a black box with a whole load of weights which were iteratively tuned based on training data.
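
      As a toy illustration (hypothetical code, nothing to do with the actual COMPAS internals): once trained, such a model is nothing but arrays of weights, so there is no human-readable rule to hand over.

```python
import numpy as np

# A "trained model" is just weight matrices tuned on training data;
# the only artefact available for disclosure is the numbers themselves.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(5, 8))   # 5 input features -> 8 hidden units
W2 = rng.normal(size=(8, 1))   # 8 hidden units -> 1 risk score

def risk_score(features):
    """Push one defendant's feature vector through the black box."""
    hidden = np.tanh(features @ W1)
    return float((hidden @ W2)[0])

score = risk_score(np.ones(5))
print(W1.shape, W2.shape)  # the whole "algorithm", as disclosed
```

      Nothing in those two arrays says why a given score came out the way it did.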

  5. alain williams Silver badge

    Open justice should mean open decisions

    which means that for such decision aids the code (and the data that it 'learns' from) should be open source.

    1. hplasm
      Big Brother

      Re: Open justice should mean open decisions

      'Open justice'? Where did you see that?

    2. JohnFen

      Re: Open justice should mean open decisions

      I agree, but I don't think you need to use the term "open justice". Unqualified "justice" suffices.

    3. joed

      Re: Open justice should mean open decisions

      What justice? It's just a law. By lawyers and for lawyers.

      Obey.

  6. This post has been deleted by its author

  7. Anonymous Coward
    Anonymous Coward

    Thinking about this, I can honestly see no reason why the way this determines sentence or parole shouldn't be disclosed. It's not like you can alter the parameters to game the system: you have already committed the offence, and any criminal past is taken into account. The only reason anyone would want to keep it secret is if it's doing something it shouldn't be, such as profiling by race.

    1. Doctor Syntax Silver badge

      "The only reason anyone would want to keep it secret is if it's doing something it shouldn't be such as profiling by race."

      A few other possibilities. One is that it's such a pile of crap that they wouldn't be able to sell it, or those that bought it would want their money back. Alternatively, it's such a pile of crap that the victims would take them for everything they've got and more in damages. More likely it's another of those AI things where nobody knows how it's arriving at conclusions, so it's not so much that they don't want to disclose anything, more a case of they can't.

      When NI had the judge-only courts the judge had to give a reasoned account of how he came to his decision (which, of course, is more than a jury has to do). If S/W were to be a tribunal of fact I'd expect failure to give a reasoned decision to be a basis for appeal against conviction. I'd also be interested in how S/W instructed itself in matters of law; with a jury trial this is always done in open court and can be a basis of appeal on the grounds that the judge made an error in law.

      If the S/W is determining sentence then I'd expect lack of explanation there to be the basis of an appeal against sentence.

  8. Geekpride
    Boffin

    Simple solution

    Someone needs to create a system to evaluate this kind of software and judge whether the training data was biased, meaning the outputs can't be trusted. I suggest something big and impressive with lots of flashing lights, maybe even a Jacob's Ladder or two. It can put on an impressive show, then just print out "BIASED TRAINING INPUTS. SOFTWARE CANNOT BE TRUSTED".
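
    Joking aside, a minimal version of such a checker is easy to sketch (all names and data below are made up): compare the "high risk" label base rates across a protected attribute and complain if they diverge.

```python
# Hypothetical training records: does the "high risk" label rate
# differ between two groups by more than a chosen threshold?
training_data = [
    {"group": "A", "label": 1}, {"group": "A", "label": 1},
    {"group": "A", "label": 1}, {"group": "A", "label": 0},
    {"group": "B", "label": 0}, {"group": "B", "label": 0},
    {"group": "B", "label": 1}, {"group": "B", "label": 0},
]

def label_rate(rows, group):
    members = [r for r in rows if r["group"] == group]
    return sum(r["label"] for r in members) / len(members)

disparity = abs(label_rate(training_data, "A") - label_rate(training_data, "B"))
if disparity > 0.2:  # arbitrary demo threshold; flashing lights not included
    print("BIASED TRAINING INPUTS. SOFTWARE CANNOT BE TRUSTED")
```

    The Jacob's Ladder is left as an exercise for the reader.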

  9. Anonymous Coward
    Anonymous Coward

    "The company behind COMPAS acknowledges gender is a factor in its decision-making process and that, as men are more likely to be recidivists, so they are less likely to be recommended for probation,"

    Male inmate interest in the trans movement to spike in 3-2-1...

    1. Anonymous Coward
      Anonymous Coward

      Acksuwally, the gaming has already been going on...

  10. Anonymous Coward
    Anonymous Coward

    A golden opportunity

    Any disproportionality in outcomes is proof of systemic racism. So, if law enforcement is going to use programming to do their job anyway, it's time to make sure the programming enforces equal sentencing outcomes for all races, regardless of any disproportionality in violation rates. Yes, this means whites will need to be incarcerated at rates higher than their proportional rate of infractions, but that's what we need to do to achieve a colour-blind society.

  11. Electronics'R'Us Silver badge

    Previous issues with 'trade secret' software

    In 2005, a judge ruled (upheld in US circuit court) that a DUI defendant had the right to have the breathalyzer source code revealed and reviewed.

    Perhaps this software might be subject to the same analysis.

    https://news.slashdot.org/story/09/01/15/195242/breathalyzer-source-code-ruling-upheld

    1. Alan Brown Silver badge

      Re: Previous issues with 'trade secret' software

      "In 2005, a judge ruled (upheld in US circuit court) that a DUI defendant had the right to have the breathalyzer source code revealed and reviewed."

      And it's notable how fast law enforcement agencies have sprinted from US courtrooms when use of Stingrays has been challenged by attempting to drag the technology into the open.

  12. Anonymous Coward
    Anonymous Coward

    what about the right to face your accuser?

    If your accuser is an algorithm, doesn't that mean source code on the table?

    1. Mark 85 Silver badge

      Re: what about the right to face your accuser?

      The software isn't accusing anyone, nor convicting them. All it's doing is providing the judge/court with sentencing guidelines after a conviction. I won't comment on the validity of said software as others have pretty much covered that ground.

      1. P. Lee

        Re: what about the right to face your accuser?

        >The software isn't accusing anyone nor convicting them.

        Not quite true, in effect.

        Parole is part of it, so it is effectively judging the likelihood of a future transgression and altering the sentence based on that.

        A longer non-parole period based on reoffending rates could be seen as effectively an extra conviction.

        Moreover, a judge may be questioned over his reasoning; the software goes unquestioned. And the fewer humans making the decisions, the less fresh training data is available, which creates a feedback loop.

        This needs to stop.

        1. Jtom

          Re: what about the right to face your accuser?

          The judge still decides. They are not bound to what this, or sentencing guidelines, suggest. A decent defense attorney can and will point out mitigating factors for the judge to consider.

          The alternative to these strategies is mandatory sentencing, which takes the judge completely out of the sentencing phase.

    2. DJO Silver badge

      Re: what about the right to face your accuser?

      If your accuser is an algorithm, doesn't that mean source code on the table?

      The source code may well be perfectly fine, but if the training data was biased it matters not an iota how good the code is.

      I suspect the code is just as crappy as the training data but it might not be.
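
      A sketch of that point with made-up numbers: even trivially correct, bug-free "training" code will reproduce whatever bias the historical labels contain.

```python
from collections import Counter

# Made-up history in which group "B" was labelled guilty more often.
history = [("A", 0)] * 80 + [("A", 1)] * 20 + [("B", 0)] * 40 + [("B", 1)] * 60

def train(rows):
    """Learn the majority label per group -- the code is fine;
    the bias arrives intact from the data."""
    by_group = {}
    for group, label in rows:
        by_group.setdefault(group, Counter())[label] += 1
    return {g: counts.most_common(1)[0][0] for g, counts in by_group.items()}

model = train(history)
print(model)  # {'A': 0, 'B': 1} -- group B is predicted "guilty" by default
```

      Auditing `train` line by line would find nothing wrong; only the data gives the game away.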

  13. Anonymous Coward
    Anonymous Coward

    The software is appropriately emulating the bias of the US judicial system

    The jury-based court decisions in the US judicial system are essentially a lottery, and studies have shown that even decisions by US judges without juries are regularly biased. Just upholding traditions! Americans are used to being screwed by the US judicial system, and this scheme ensures the wealth of 1.35+ million US lawyers. Fixing that system would be like declaring independence from the British government: it would obviate more than half of the current lawyers and reduce the income of the rest to 25% of their current earnings. Revolution!

    The US presidential elections could easily be fixed by making it a direct vote. But with campaigns traditionally focused on the "swing states", manipulation by the two parties is easier and more "cost-efficient" for campaigners. And this historic architectural flaw can be just as easily (ab)used by outsiders ...

    There are no "good" flaws. Neither in elections, nor in crypto, nor in judicial systems.

  14. Ken Hagan Gold badge

    At the risk of Godwin-ising the discussion on the first page (*)

    Police can say 'It's not my decision, the computer told me to do it,'

    I believe the actual phrasing you are looking for is "I was only obeying orders." and not only has this one been (quite famously, IMHO) shot down in court, it is plain embarrassing when the orders you are following have come from a machine rather than a superior officer.

    (* In fairness, it's a fairly high risk when the topic is "being a racist bastard and trying to pin the responsibility on someone else".)

  15. fidodogbreath Silver badge
    Terminator

    Clippinator: Judgement Day

    "You have been found guilty on all counts. Your fate will now be decided by the Microsoft Sentence 365 Condemnation Wizard."

  16. Ken Moorhouse Silver badge

    Maybe the algorithms are secret...

    That doesn't mean to say that the inputs used in a specific case are. Presumably the inputs used are recorded by the court.

    If the defending side wanted to argue fairness they should be entitled to call for a re-run of the algorithm but with each contentious input changed by request (I'm thinking a similar idea to that used for jury selection). If changing a date of birth, for example, had a significant impact on the outcome then you know that there is something untoward embedded in there.

    Using that as an example, let's say the authors of the system singled out their own exact dates of birth for special treatment (should their algorithm end up analysing them); nobody would be any the wiser, except by hitting on one at random.

    The legal system should dictate that the defense is allowed to make x attempts to game radically different results with inputs within certain limits of those used against them in order to undermine the validity of the algorithm used.

    One important point to note is that the algorithms used should be frozen for the length of the trial; otherwise a rerun, or amendments made to inputs during the trial, may produce different results due to the addition of further field data.
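
    The proposed re-run test can be sketched against a stand-in scorer (entirely hypothetical, since the real algorithm is secret): change one contentious input, hold the rest fixed, and flag a large swing.

```python
def secret_score(inputs):
    """Stand-in for the vendor's sealed model, with a planted flaw:
    it (improperly) keys on an exact date of birth."""
    score = inputs["prior_convictions"] * 10
    if inputs["dob"] == "1970-01-01":
        score += 50  # the hidden special case the defence wants to expose
    return score

baseline = {"prior_convictions": 2, "dob": "1970-01-01"}
base_score = secret_score(baseline)

# Re-run with a single contentious input changed, as proposed above.
probe = dict(baseline, dob="1970-01-02")
delta = abs(secret_score(probe) - base_score)

if delta > 10:  # a big swing from an irrelevant field is the tell
    print("something untoward embedded in there: swing of", delta)
```

    Each such probe would count against the defence's allotted x attempts.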

  17. a_yank_lurker Silver badge

    A serious logic error

    While the averages of the data may say this or that, the actual defendant is a human being, not some statistical average. For a valid justice system there has to be a way to factor in the human element, that being the defendant. Some, no matter what the results say, should receive harsh sentences, and others with the same 'results' should receive leniency. This task requires judgment from a real human who can size up the person; we call them judges for a reason.

  18. Sureo

    Some day you will be tried, convicted and sentenced by an AI computer. No need for those expensive judges and lawyers and scarce court rooms.

  19. Anonymous Coward
    Anonymous Coward

    So,

    "the code is untested, unavailable to suspects' defense teams, and in some cases provably biased."

    Working as designed then...
