Just wondered...
Does anyone in government give a shit about these reports or is this time filler for MPs?
Increased use of algorithms in decision-making risks disproportionately affecting certain groups, MPs have said, urging the government to boost transparency and tackle bias - but not forget the value of public data. In a report on algorithmic decision-making, published today, the House of Commons Science and Technology …
Yes, the govt does care about reports like this. Think it through. It's only a matter of time before some huge scandal breaks over public sector data, and when that happens the parties will take turns to explain how they followed these rules more faithfully than the other lot.
Cynicism is fine, but don't let it lull you into simple-mindedness.
It added that the data held by public authorities could be a potentially lucrative asset, urging the government to ensure it negotiates good deals for its use by private companies.
Why is the government exempt from the GDPR?
I provide information to the government, when I have to, purely so that the government can provide me with a particular service. That's it. The government has no authorisation to sell that information to any third party, anonymised or not.
Stuff 'lucrative assets' - the Tories would sell (and probably have sold) their grandmothers for cat food.
Particularly when it comes to NHS data this could be dangerous. Imagine people not going to the doctor because they don't want their health details shared with Google, Facebook, Experian, etc. If such behaviour catches on it could help turn a small virus outbreak into a national pandemic!
Governments used to sell our names and addresses from the electoral register. Then somebody sued and won.
So now we have the real register, with all entries, for use by government and by political parties and candidates. Then there is the commercial register, containing only those who have not opted out, for sale to anybody.
We need a similar scheme for other data, with an additional requirement. Everybody is entitled to be individually paid 50p or whatever for their data, and the costs of those payments must be borne by the purchaser of that data.
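Something like that could be dead simple to implement, too. A minimal Python sketch of the two-register idea, assuming a flat per-record fee (the 50p comes from the post above; the names and numbers are made up for illustration):

```python
# Two-register scheme: government and candidates see everything,
# commercial buyers see only those who haven't opted out, and the
# buyer pays each individual for their record.

FEE_PER_RECORD = 0.50  # 50p per individual, borne by the purchaser

# Illustrative entries only; a real register would hold far more fields.
full_register = [
    {"name": "A. Voter", "opted_out": False},
    {"name": "B. Voter", "opted_out": True},
    {"name": "C. Voter", "opted_out": False},
]

# The commercial register contains only those who have not opted out.
commercial_register = [r for r in full_register if not r["opted_out"]]

def payments_owed(records):
    """What the purchaser owes the individuals, on top of the data's own price."""
    return len(records) * FEE_PER_RECORD

print(f"Records sold: {len(commercial_register)}")
print(f"Owed to individuals: £{payments_owed(commercial_register):.2f}")
```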
It's not just the UK; it's happening in Google/Facebook 'EU-Land' too:
------------------------
https://www.irishtimes.com/business/technology/wary-of-the-public-services-card-you-have-good-reason-to-be-1.3351106
https://www.irishtimes.com/business/technology/privacy-rights-it-s-natural-facebook-would-choose-ireland-1.3400531
One of the core tenets of truly fair and democratic government is that decisions concerning an individual can be defended and reviewed.
Which seems to run directly against any kind of machine learning, as even the researchers who create it can't explain why a given decision has been taken.
How can anyone challenge a decision when everything behind it is lurking inside the model, not in any code a human can read?
That's the trouble with 'deep learning' style heuristics: they are utterly inscrutable, because the entire 'algorithm' (if you can even call it that) is completely dependent on the ENTIRE DATASET it was trained on. That often amounts to petabytes and cannot possibly be audited. If the dataset is skewed (e.g. in the same direction that our society is skewed) then the resulting 'algorithm' will be skewed the same way. So it solves precisely nothing.
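To make that skew-in, skew-out point concrete, here's a toy pure-Python "learner" with made-up numbers. It is nothing like a real deep network, and that is rather the point: a real one buries the same bias in billions of uninspectable weights.

```python
# Toy illustration: a "learner" that just memorises historical approval
# rates per group. Train it on skewed decisions and it reproduces the
# skew, faithfully and automatically.

from collections import defaultdict

# Hypothetical past decisions: (group, approved). Skewed against group "B".
history = [("A", 1)] * 80 + [("A", 0)] * 20 + [("B", 1)] * 30 + [("B", 0)] * 70

def train(data):
    """'Learn' each group's approval rate, i.e. absorb the data's bias."""
    totals, approvals = defaultdict(int), defaultdict(int)
    for group, outcome in data:
        totals[group] += 1
        approvals[group] += outcome
    return {g: approvals[g] / totals[g] for g in totals}

model = train(history)

def decide(group, threshold=0.5):
    """Approve whenever the learned rate for the group clears the threshold."""
    return model[group] >= threshold

print(model)                      # {'A': 0.8, 'B': 0.3}
print(decide("A"), decide("B"))   # True False: the skew, faithfully reproduced
```

At least this toy version can be read and challenged; the deep-learning equivalent absorbs the same bias with no table anyone can print out.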
That said, I am quite worried about the "age of accountability" affecting well-meaning humans (e.g. teachers, doctors, police) who are compared against machines and expected to act like robots. The last time I dealt with the police, they were scarily algorithmic, probably because they themselves were scared for their own jobs if they didn't follow their programming.
As for GDPR - it limits the use of deep learning by "accountable" authorities like the police/HMRC/you/me/etc., but anyone already operating outside the law or outside the domicile is unaffected. So the likes of Facebork will get away scot-free, selling your data to all and sundry, mining it to determine your political preferences, and brainwashing you with Brexit/Trump/WWIII propaganda.
All those luvverly unintended consequences just waiting to happen, due to the ignorance of those letting this particular monster out of the box, who were foolishly overconfident that competence and good intentions would save us from harm.
"As a result, the algorithmic decision may disproportionately discriminate against certain groups, and are as unacceptable as any existing "human" discrimination."
Or as acceptable as any existing discrimination, depending on whatever a government chooses to do at any point in the future. Oh well, judging by the naivety displayed here by politicians who talk as if algorithms are natural phenomena rather than instructions created by people, I'm sure the average Mr Buttle or Mr Tuttle in the street will not be too worried.
"As a result, the algorithmic decision may disproportionately discriminate against certain groups, and are as unacceptable as any existing "human" discrimination."
Yes, see I read the paragraph you've identified as meaning "The algorithm might forget to incorporate any of the stupid racist/sexist 'positive' discrimination we've spent decades trying to enforce". Positive discrimination being much like an HIV test - you really don't want a positive result.
"Only then will we benefit (profit) from the enormous value of our health data. Deals are already being struck without the required partnership models we need."
By selling personal data to third parties? I haven't signed up for that. And deals are already being struck? FOI request please, Reg?
Part 2, Chapter 2, section 16 of the UK Data Protection Act 2018 - "Power to make further exemptions etc by regulations" - in principle allows ministers to create any exemptions they like "for the performance of a task in the public interest or in the exercise of official authority".
Consequently, government is practically speaking allowed to do anything it likes with our personal data by invoking these purposes (particularly the latter purpose), and such regulations are not subject to the full Parliamentary scrutiny that would be applied to primary legislation.
This is interesting (and disquieting), since the fundamental origin of data protection law is Article 12 of the 1948 Universal Declaration of Human Rights, a declaration specifically drafted to protect the public from governments. Unfortunately, as generally adopted into law, the right to privacy is not one of the absolute rights.
Long live democracy!
...Algorithms, in looking for and exploiting data patterns, can sometimes produce flawed or biased "decisions" — just as human decision-making is often an inexact endeavour...
...just use the Climate Change algorithms?
Everyone knows that these are perfect, settled science, and incapable of error....
One way or another the Tories will get the deficit and national debt down, as we are not out of the woods just yet, but I also suspect some people will stop using public services like the NHS. Those that can't avoid public services will just have to put up with being a product, much as some other businesses view those who interact with them.
Governments are just the ultimate monopoly business; your taxes are just the cost, so find a different provider where possible if it can save money. The quality of service will be variable though: some will be better, some worse, as the government is heavily resourced in some areas but not others.
It's interesting that there are concerns about skewing or bias in the data sets. Sure, it can exist in the data sets, so being a minority in a country may not be the best place in the world to be with some algorithms, given genetic differences and the possibility of rare conditions not being spotted, but it also depends on the "remit" of the algorithm.
Most algorithms will be performing important but boring tasks, like looking for 2D (X-ray) or 3D (MRI scan) pixel patterns to determine whether someone has lung or breast cancer: a highly boring but highly paid job. These algorithms could be highly effective, as there is no boredom or lapse in attention to detail to contend with, and they offer the government a way to reduce medical negligence claims even if they are only used to highlight suspect scans for further human investigation and/or confirmation (a rough sketch of that triage idea follows below). So it's a cost-saving exercise in one way, whilst helping to direct money into areas of research that can further enhance our collective knowledge and reduce potential suffering.
However, I can't help but be reminded of this: https://www.youtube.com/watch?v=zUQgthIs7pM
especially if the algorithms and/or equipment sensors are not good enough compared to a human with greater intuition that is not yet captured in the algorithm and/or data set.
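For what it's worth, the triage idea above amounts to something like this minimal Python sketch. The scoring function, scores and threshold are all invented stand-ins for a real trained classifier; the only real point is that the machine flags and a human decides.

```python
# Scan triage sketch: the model never makes the final call, it only
# routes suspect scans to a radiologist for human confirmation.

SUSPECT_THRESHOLD = 0.30  # set low, to err on the side of human review

def model_score(scan_pixels):
    # Stand-in for a trained classifier over X-ray/MRI pixel data;
    # here it just averages the (made-up) pixel values.
    return sum(scan_pixels) / len(scan_pixels)

scans = {
    "patient_001": [0.10, 0.20, 0.10],
    "patient_002": [0.60, 0.70, 0.80],
    "patient_003": [0.20, 0.50, 0.40],
}

for patient, pixels in scans.items():
    score = model_score(pixels)
    if score >= SUSPECT_THRESHOLD:
        print(f"{patient}: score {score:.2f} -> flag for radiologist review")
    else:
        print(f"{patient}: score {score:.2f} -> routine queue")
```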