Sounds like responsibility laundering. It wasn't me what put that innocent person in jail, it was the software what did it. No, we won't be fixing it, that's the software company's job. And they just went into flat-pack bankruptcy and were bought by a nice Chinese firm with a line in social scoring.
Criminal justice software code could send you to jail and there’s nothing you can do about it
American police and the judiciary are increasingly relying on software to catch, prosecute and sentence criminal suspects, but the code is untested, unavailable to suspects' defense teams, and in some cases provably biased. In a presentation at the DEF CON hacking conference in Las Vegas, delegates were given the example of …
COMMENTS
-
-
Monday 13th August 2018 19:04 GMT Anonymous Coward
'It's only a matter of time before they start targeting... Segregation'
You can bet Facebook / Google etc want in on this. 1st-round is matching / targeting ads. 2nd-round is hoovering up financial-transactions / patient-health info. 3rd-round is being involved in every transaction or event that has any kind of data aspect etc.
Does your child get to have the surgery they need? Do they get a place on the college course they deserve? Do they get the job they long for? Do they get a loan to buy a dream home? Who do they get to date? Ask Zuckerberg, as we're all Suckerburgs now.
-
Monday 13th August 2018 11:53 GMT Giovani Tapini
Trade secrets, pah
The precise algorithms and weightings of parameters a trade secret?
If it's not patentable, then it's probably only secret because it's obvious(ly flawed), and anyone with access to training data could have a go.
The authorities should own the process IP to ensure it has some level of transparency, even to themselves. Leaving decisions to a corporation (recalling OCP for example) is outsourcing law enforcement to an unacceptable degree in my view.
-
Monday 13th August 2018 17:37 GMT Mark 85
Re: Trade secrets, pah
Being a "trade secret" is a lot different than a patent. Trade secrets are generally very closely held by the company whereas a patent is published. For example, the recipe for Coca-Cola isn't patented. If it had been, the recipe would be in the wild and anyone could make the product.
-
Wednesday 15th August 2018 07:39 GMT Anonymous Coward
Re: Trade secrets, pah
> If it's not patentable, then it's probably only secret because it's obvious(ly flawed) and anyone with access to training data could have a go.
If it's the current fashionable flavour of AI - i.e. neural network machine learning - then there is no algorithm to disclose.
It's just a black box with a whole load of weights which were iteratively tuned based on training data.
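The "nothing to disclose" point can be made concrete with a toy sketch. Everything below is invented for illustration (a made-up task and a single hand-rolled sigmoid neuron, not any real sentencing product): after iterative tuning, "full disclosure" of the trained model amounts to a handful of floating-point numbers.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

random.seed(0)
# Toy training data: two inputs -> label (logical OR, a stand-in for any task)
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
w = [random.uniform(-1, 1) for _ in range(2)]
b = 0.0
lr = 0.5

for _ in range(2000):            # iterative tuning is the only "algorithm"
    for x, y in data:
        p = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
        err = p - y
        w[0] -= lr * err * x[0]
        w[1] -= lr * err * x[1]
        b -= lr * err

# "Source disclosure" of the trained model: an opaque weight vector.
print([round(w[0], 2), round(w[1], 2), round(b, 2)])
```

The training loop is ordinary, auditable code; the decision-making itself lives entirely in `w` and `b`, which explain nothing on their own.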
-
This post has been deleted by its author
-
Monday 13th August 2018 13:56 GMT Anonymous Coward
Thinking about this, I can honestly see no reason why the way this determines sentence or parole shouldn't be disclosed. It's not as if you can alter the parameters to game the system: you have committed the offence, or have a criminal past, and those convictions are taken into account. The only reason anyone would want to keep it secret is if it's doing something it shouldn't be, such as profiling by race.
-
Monday 13th August 2018 15:59 GMT Doctor Syntax
"The only reason anyone would want to keep it secret is if it's doing something it shouldn't be such as profiling by race."
A few other possibilities. One is that it's such a pile of crap that they wouldn't be able to sell it, or those that bought it would want their money back. Alternatively, it's such a pile of crap that the victims would take them for everything they've got, and more, in damages. More likely it's another of those AI things where nobody knows how it's arriving at its conclusions, so it's not so much that they don't want to disclose anything, more a case of they can't.
When NI had the judge-only courts, the judge had to give a reasoned account of how he came to his decision (which, of course, is more than a jury has to do). If S/W were to be a tribunal of fact, I'd expect failure to give a reasoned decision to be a basis for appeal against conviction. I'd also be interested in how the S/W instructed itself in matters of law; with a jury trial this is always done in open court and can be a basis of appeal on the grounds that the judge made an error in law.
If the S/W is determining sentence then I'd expect lack of explanation there to be the basis of an appeal against sentence.
-
-
Monday 13th August 2018 14:45 GMT Geekpride
Simple solution
Someone needs to create a system to evaluate this kind of software and judge whether the training data was biased, meaning the outputs can't be trusted. I suggest something big and impressive with lots of flashing lights, maybe even a Jacob's Ladder or two. It can put on an impressive show, then just print out "BIASED TRAINING INPUTS. SOFTWARE CANNOT BE TRUSTED".
-
-
Monday 13th August 2018 16:56 GMT Anonymous Coward
A golden opportunity
Any disproportionality in outcomes is proof of systemic racism. So, if law enforcement is going to use programming to do their job anyway, it's time to make sure the programming enforces equal sentencing outcomes for all races, regardless of any disproportionality in violation rates. Yes, this means whites will need to be incarcerated at rates higher than their proportional rate of infractions, but that's what we need to do to achieve a colour-blind society.
-
Monday 13th August 2018 16:57 GMT Electronics'R'Us
Previous issues with 'trade secret' software
In 2005, a judge ruled (upheld in US circuit court) that a DUI defendant had the right to have the breathalyzer source code revealed and reviewed.
Perhaps this software might be subject to the same analysis.
https://news.slashdot.org/story/09/01/15/195242/breathalyzer-source-code-ruling-upheld
-
Tuesday 14th August 2018 08:04 GMT Alan Brown
Re: Previous issues with 'trade secret' software
"In 2005, a judge ruled (upheld in US circuit court) that a DUI defendant had the right to have the breathalyzer source code revealed and reviewed."
And it's notable how fast law enforcement agencies have sprinted from US courtrooms when the use of Stingrays has been challenged by attempts to drag the technology into the open.
-
-
-
-
Tuesday 14th August 2018 04:32 GMT P. Lee
Re: what about the right to face your accuser?
> This isn't accusing anyone nor convicting them.
Not quite true, in effect.
Parole is part of it so it is effectively judging the likelihood of a future transgression and altering the sentence based on that.
A longer non-parole period based on reoffending rates could be seen as effectively an extra conviction.
Moreover, a judge may be questioned over his reasoning. Not only does the software go unquestioned, but the fewer humans making the decisions, the less training data there is that isn't itself part of a feedback loop.
This needs to stop.
-
Tuesday 14th August 2018 14:44 GMT Jtom
Re: what about the right to face your accuser?
The judge still decides. They are not bound by what this, or the sentencing guidelines, suggest. A decent defense attorney can and will point out mitigating factors for the judge to consider.
The alternative to these strategies is mandatory sentencing, which takes the judge completely out of the sentencing phase.
-
-
-
Tuesday 14th August 2018 10:22 GMT DJO
Re: what about the right to face your accuser?
If your accuser is an algorithm, doesn't that mean source code on the table?
The source code may well be perfectly fine, but if the training data was biased it matters not an iota how good the code is.
I suspect the code is just as crappy as the training data but it might not be.
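The "correct code, biased data" point lends itself to a small sketch. The groups, numbers, and the toy "risk model" below are all made up for illustration: the function is a perfectly correct frequency estimator, yet it faithfully reproduces whatever bias is in its training records.

```python
from collections import Counter

def train_risk_model(records):
    """records: list of (group, reoffended_bool). Returns group -> rate.
    Deliberately simple and bug-free: just per-group frequencies."""
    totals, hits = Counter(), Counter()
    for group, reoffended in records:
        totals[group] += 1
        hits[group] += int(reoffended)
    return {g: hits[g] / totals[g] for g in totals}

# Ground truth: both groups reoffend at the same 30% rate. In the biased
# set, group B is policed more heavily, so twice as many of its incidents
# get recorded as "reoffences" (the biased-label problem).
fair_data   = [("A", i % 10 < 3) for i in range(100)] + \
              [("B", i % 10 < 3) for i in range(100)]
biased_data = [("A", i % 10 < 3) for i in range(100)] + \
              [("B", i % 10 < 6) for i in range(100)]

print(train_risk_model(fair_data))    # both groups score 0.3
print(train_risk_model(biased_data))  # B now scores double, same code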
-
-
Monday 13th August 2018 17:39 GMT Anonymous Coward
The software is appropriately emulating the bias of the US judical system
The jury-based court decisions in the US judical system are essentially a lottery scheme, and studies have proven that even decisions by US judges without juries are regularly biased. Just upholding traditions! Americans are used to being screwed by the US judical system, and this scheme ensures the wealth of 1.35+ million of US lawyers. Fixing that system would be like declaring independence from the British government, obviate more than half of the current lawyers, and reduce the income of the leftovers to 25% of their current earnings. Revolution!
The US presidential elections could be easily fixed, by making it a direct vote. But by traditional focusing of campaigns on the "swing states", manipulation by the two parties is easier and more "cost-efficient" for campaigners. But this historic architectural flaw can be just as easily (ab-)used by outsiders ...
There are no "good" flaws. Neither in elections, nor in crypto, nor in judical systems.
-
Monday 13th August 2018 18:35 GMT Ken Hagan
At the risk of Godwin-ising the discussion on the first page (*)
Police can say 'It's not my decision, the computer told me to do it,'
I believe the actual phrasing you are looking for is "I was only obeying orders." and not only has this one been (quite famously, IMHO) shot down in court, it is plain embarrassing when the orders you are following have come from a machine rather than a superior officer.
(* In fairness, it's a fairly high risk when the topic is "being a racist bastard and trying to pin the responsibility on someone else".)
-
Monday 13th August 2018 18:50 GMT Ken Moorhouse
Maybe the algorithms are secret...
That doesn't mean to say that the inputs used in a specific case are. Presumably the inputs used are recorded by the court.
If the defending side wanted to argue fairness they should be entitled to call for a re-run of the algorithm but with each contentious input changed by request (I'm thinking a similar idea to that used for jury selection). If changing a date of birth, for example, had a significant impact on the outcome then you know that there is something untoward embedded in there.
Using that as an example, let's say that the authors of the system singled out their exact dates of births for special treatment (should their algorithm end up analysing them) then nobody would be any the wiser, except hitting on one at random.
The legal system should dictate that the defense is allowed to make x attempts to game radically different results with inputs within certain limits of those used against them in order to undermine the validity of the algorithm used.
One important point to note is that the algorithms used should be frozen for the length of the trial, otherwise a rerun, or amendments made to inputs during the trial may produce different results due to the addition of further field data.
-
Monday 13th August 2018 23:57 GMT a_yank_lurker
A serious logic error
While the averages of the data may say this or that the actual defendant is a human being who is not some statistical average. For a valid justice system there has to be a way to factor in the human element that being the defendant. Some, not matter what the results say, should receive harsh sentences and others with the same 'results' should receive leniency. This task judgment from a real human who can size up the person; we call them judges for a reason.