Re: Nobody can sensibly deny that this is a moral imperative
Nice of the home secretary to openly admit that she thinks I am nobody. I am shocked at her honesty and fully expect her to be pressured by her peers into a prompt resignation.
Tech companies could be fined $25 million (£18 million) – or ten percent of their global annual revenue – if they don't build suitable mechanisms to scan for child sex abuse material (CSAM) in end-to-end encrypted messages and an amended UK law is passed. The proposed update to the Online Safety bill [PDF], currently working …
"Moral Imperative" = "of highest importance"
Nobody can deny that preventing child (or indeed, any) sex abuse is a moral imperative.
Nobody can deny that allowing people to communicate privately is a moral imperative.
The Home Sec's job, like that of many politicians, is to balance dozens of moral imperatives against each other. That's why it's a hard job, and not one that should be assigned to fuckwits.
Incidentally, also...
Nobody can deny that children having a roof over their heads is a moral imperative
Nobody can deny that children having enough to eat is a moral imperative
etc
See if the current government gives a flying f**k about any of that
She's just making a blanket false claim there: "Brits will share child porn if we cannot spy on everyone". There is no moral imperative for a fiction she created.
She's variously changed her tune from "Terrorists" to "National Security" and now to "Pedos" as the reason for backdooring end-to-end encryption.
The best response to "Think of the children!" is "Jimmy Savile always did!"
The worst predators tend to operate in plain sight, usually posing as stalwart pillars of the community.
After all, you're NOT going to entrust your kids to the dirty raincoat brigade or a bunch of heavily tattooed gangbangers - but you probably won't think twice about letting them hang out at a church social group, etc
(Ironically, the heavily tattooed harley-riding gangbangers are likely to be extremely protective of kids, etc - as are almost all "screaming queens" I've known in my life)
"Ironically, the heavily tattooed harley-riding gangbangers are likely to be extremely protective of kids, etc"
Don't judge a book by its cover and all that. Any large enough group of people, whether that be tattooed bikers, football fans, a church social group, the Rotary Club, etc.*, is sure to contain a fair number of decent people, a few truly excellent dudes/dudettes, and a handful of obnoxious wankers.
*except parliament where the proportion of obnoxious wankers is rather higher
It's that they seem to keep getting it backwards.
"Masks are now a personal choice, we trust the public will do the right thing", loads of people stop wearing masks immediately, even when places ask them politely to keep doing so.
"If we don't spy on everyone, they will all immediately do <insert vile act here>", yet almost no one will do it, because they just won't.
They would simply demand access to the file before encryption via a backdoor to PGP (or your encryption of choice)
And the fact that you are attempting to bypass the government's right to all your data marks you out as an obvious evil-doer... lock him up, immediately!
"We, and other child safety and tech experts, believe that it is possible to implement end-to-end encryption in a way that preserves users' right to privacy, while ensuring children remain safe online." They believe this, but refuse to say what leads them to believe this. Open source implementations of E2E encryption have been around for ages, if it was possible then they could easily demonstrate it.
Indeed. Monitor everything everyone does so that scanning the actual communication being sent becomes moot; they'll already know everything.
You'd think they weren't already tapping all the telemetry sent to the OS mothership.
Edit: someone disagrees with the captain's accurate summing up!
Actually, she hasn't got a clue what any of it really means at all...
They stopped trying to be rational when they kept getting the "this won't work" response, so now they just want to bully everybody into compliance without having to provide a solution... "It's the LAW!"
Given the current debacle in parliament, how she has the cheek to talk about "moral imperative" I cannot fathom.
Client-side encryption, plus not sending or receiving without further action any messages that are deemed illegal, and a way for the user to review and still send something they believe has been flagged incorrectly. Like a picture of the Virgin Mary and Baby Jesus that could easily be mistaken for something else.
>They believe this, but refuse to say what leads them to believe this.
Box ticked, parents can sleep whilst the children surf the web.
Which immediately identifies the flaw in this statement: the assumption that the first part, i.e. end-to-end encryption, has any meaningful impact on children being safe online.
End-to-end encryption won't stop what happened at Disney's Club Penguin.
"Boxed ticked, parents can sleep whilst the children surf the web."
Problem #1: over 1/3 of detected sexual offenders are under the age of 18 and equally distributed between genders
Yes, really
Let's not forget Jamie Bulger. For all the outcry, that type of case isn't _particularly_ unusual when you look at history; it has only become rarer more recently.
> won't stop what happened at Disney's Club Penguin
Had no idea what that was, had to look it up on the BBC news website: Disney forces explicit Club Penguin clones offline. The original website was designed to specifically target children aged 6 to 14 - I wonder why The Walt Disney Company needs to keep on shutting these websites down *ponder*
Club Penguin - online: 2005-10-24, offline: 2017-03-30, was replaced by Club Penguin Island
Club Penguin Island - online: 2017-03-29, offline: 2018-12-20, created a vacuum (quickly filled by clones) when shut down.
It is trivial: you encrypt one copy of the E2E message with your private key and the recipient's public key. Then, to comply with the law, you encrypt a second copy of the message with your private key and a personal GCHQ/CSAM/government public key, and send that copy along too (which they would then get computers to scan automatically, using neural networks trained on existing CSAM; a human would only be allowed to access any messages with an actual court order issued by a judge).

Of course this would only work if people had locked-down devices that could only execute the government-mandated E2E communication application(s) and had no ability to run any unsanctioned applications (no matter how trivial they may be to create; in case someone reading this post does exchange CSAM, I'm not going to explain how).

The mentally damaged individuals who own and send CSAM to each other would obviously use the government-mandated E2E communication application(s), because they are severely mentally damaged individuals? Just like the people in the government who created the Online Safety Bill.
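FWIW the dual-encryption half of that really is only a few lines; the locked-down-device half is where the fantasy lives. A toy sketch with PyNaCl, every key generated on the spot purely for illustration:

```python
# Toy sketch of the "second copy to an escrow key" scheme above, using
# PyNaCl (pip install pynacl). The escrow key is entirely hypothetical.
from nacl.public import PrivateKey, Box

sender = PrivateKey.generate()
recipient = PrivateKey.generate()   # the other party
escrow = PrivateKey.generate()      # stand-in for a GCHQ/government key

message = b"entirely innocent message"

# Copy 1: ordinary E2E encryption (sender private key + recipient public key).
to_recipient = Box(sender, recipient.public_key).encrypt(message)

# Copy 2: the same plaintext encrypted to the escrow key, sent alongside
# copy 1 to satisfy the hypothetical mandate.
to_escrow = Box(sender, escrow.public_key).encrypt(message)
```

Nothing stops anyone from simply not sending copy 2, which is the whole problem.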
Maybe the solution is to start simple: implement the application for governments to test first for, say, 50 years. If anyone in government is caught not using the application, they can serve some jail time. And every message sent by everyone in government is decrypted and made publicly available after, say, 20 years.
OK, I'll bite.
There is already a child abuse image content list available which includes hashes of child porn images. To be compliant, all you'd have to do on the client end when somebody attaches or receives an encrypted image is check the image's hash against the list of known child porn hashes and, if a match is found, flag it up to the police (sketched below).
That would be totally compliant with this law; it would only inconvenience people attaching images that are on the child abuse image content list to encrypted messages, and it leaves end-to-end encryption intact.
In fact the only potential for scope creep that I can see would be the police asking to keep a list of hashes attached to messages, so that after they've raided a paedophile and got an extra few hundred/thousand images to add to the list, they could retrospectively check it to pick up anybody else sharing the same material. Even if this were done, a list of MD5 hashes presents quite a limited threat to privacy or freedom of expression.
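A minimal sketch of that client-side check, assuming a plain cryptographic-hash list (KNOWN_BAD_HASHES and report_to_police are made-up names for illustration; as others note below, real deployments use perceptual hashes rather than MD5):

```python
# Minimal sketch of a client-side hash-list check. KNOWN_BAD_HASHES and
# report_to_police() are hypothetical names for this illustration.
import hashlib

KNOWN_BAD_HASHES: set[str] = set()  # would be populated from the published list

def check_attachment(image_bytes: bytes) -> bool:
    """Return True if the attachment's MD5 matches a known-bad hash."""
    return hashlib.md5(image_bytes).hexdigest() in KNOWN_BAD_HASHES

# In the client: if check_attachment(data): report_to_police(...)
```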
Any solution is trivial to circumvent by pre-encrypting the image before you send it over an E2E channel.
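To put numbers on "trivial": the pre-encryption step is a couple of lines, sketched here with PyNaCl's SecretBox (sharing the key out of band is assumed):

```python
# The pre-encryption dodge: the "attachment" handed to the E2E messenger is
# already ciphertext, so any client-side hash check matches nothing.
# Assumes the symmetric key was shared out of band.
import nacl.secret, nacl.utils

key = nacl.utils.random(nacl.secret.SecretBox.KEY_SIZE)
image_bytes = b"...image bytes..."
attachment = nacl.secret.SecretBox(key).encrypt(image_bytes)
# 'attachment', not 'image_bytes', is all the messaging app (and any
# scanner inside it) ever sees.
```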
The point is to be seen to obey the letter of the law to avoid fines.
This is all performative nonsense by a bunch of politicians who don't understand mathematics or engineering, so I can't see anything other than a performative response.
Australia banned encryption that cannot be backdoored a while ago (basically telling engineers to 'nerd harder' when they said the mathematics would not allow it), but I have not seen anything to say this has ever been enforced - maybe I missed it?
Maybe somebody took a politician aside, smacked them across the head with a didgeridoo, and pointed out that not only does proper encryption not have back doors (kind of the point unless they want to try legislating new laws of mathematics), but actually enforcing their dumb law would essentially shut down online banking, purchasing, pretty much anything to do with money, and all the supposedly secure stuff on websites.
In other words, get a clue, galah.
It ended just fine: the bill never got out of committee.
And they didn't "try legislating new laws of mathematics". There was a bill to "recognize a contribution" to mathematics. The fact that said contribution (squaring the circle, of course) was rubbish was what ended up dooming the bill.
Now, it might have made it out of committee and to the floor had a Purdue professor not happened by and been invited to review it. And it might even have passed. Legislatures pass all sorts of rubbish no-effect bills like that: recognizing some personage of minor import, establishing State Whatever Day or Official State Nonsense, and so forth. These might be "laws" in a notional sense but have no real-world effect; they're just posturing.
(There are many discussions of foolish or odd laws. Unfortunately most of them are themselves rubbish, recounting anecdotes without any attempt at verifying them from primary sources. I recommend Underhill's The Emergency Sasquatch Ordinance as an exception to that unfortunate trend; he did the research and provides citations. Also he's a better writer than most of the others.)
The point is that if a law requires it to be done, that's a method of doing it.
OK, it's trivial to circumvent by editing the files so the MD5 hashes are different each time (a one-byte tweak will do; see the sketch below), or by a number of other methods. It still complies with the law. If you kept a list of the MD5 hashes, then when the police nick a paedophile and go through their stash of images, they get a bunch of new MD5 hashes which could be compared against the file-sharing history, and you then have a list of other paedophiles who'd shared the same files.
If I was a policeman, I think I'd probably be happy with that.
While you probably couldn't prevent anybody from circumventing the checks if they are done on the client side, you could probably detect that the child porn filter has been disabled by various methods; I can think of a few off the top of my head. One suspects that the National Crime Agency would be just as happy with occasional lists of people detected circumventing it, as that has to be reasonable cause for a search warrant.
I don't think that either the police or politicians expect perfection, just some good faith efforts.
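On the "editing the files" point above, it's worth spelling out quite how trivial that is against a cryptographic hash: appending a single random byte gives a completely unrelated digest.

```python
# One appended byte yields an unrelated MD5, so the file no longer matches
# any list of known digests.
import hashlib, os

original = b"...image bytes..."
tweaked = original + os.urandom(1)

print(hashlib.md5(original).hexdigest())
print(hashlib.md5(tweaked).hexdigest())  # completely different
```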
The Microsoft CSAM hash database doesn't use MD5 or any other cryptographic hash. It uses PhotoDNA hashes, which are intended to produce the same result under a variety of transformations.
That also reduces its precision and increases the false-positive rate, of course. You can't have it both ways. Nor is it proof against all transformations, and automating applying a series of transformations until you get a different PhotoDNA result is an obvious easy attack on the system.
There are other issues with using a large PhotoDNA hash database for client-side scanning, such as the size of the database and the computational requirements.
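PhotoDNA itself is proprietary, but a toy perceptual hash (a difference hash, or dHash) shows the general idea and the trade-off: near-duplicate images land close in Hamming distance, which is exactly what makes false positives possible. Requires Pillow; the file names are made up:

```python
# Toy dHash: a stand-in for a perceptual hash like PhotoDNA. Small edits to
# an image flip few bits, so near-duplicates stay close in Hamming distance.
from PIL import Image  # pip install Pillow

def dhash(path: str, size: int = 8) -> int:
    # Grayscale, shrink to (size+1) x size, then compare adjacent pixels.
    img = Image.open(path).convert("L").resize((size + 1, size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = pixels[row * (size + 1) + col]
            right = pixels[row * (size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# hamming(dhash("a.jpg"), dhash("a_rescaled.jpg")) stays small for
# near-duplicates, unlike comparing cryptographic digests.
```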
The whole idea is idiotic and typical political pandering.
To be compliant, all you'd have to do on the client end when somebody attaches or receives an encrypted ~~image~~ file of any type is to check the ~~image~~ file of any type hash against a list of known ~~child porn~~ files we're looking for hashes, and if a match is found then flag it up to the police.
"available which includes hashes of child porn images"
The problem with a hash is that it is a mathematical equivalence. Is this picture the same as that picture?
Well, couldn't that essentially be broken by scaling the image, say, 5% either way? Or compressing it a little more? Or gently messing with the colours? It wouldn't take much ingenuity at all to batch convert a bunch of images from known matches to unknowns.
Plus, with only a hash and no actual image to work with, how does one train a machine to recognise such a thing? It'll be like that judge who said he couldn't define pornography, but he'd know it when he saw it. Well, we would have to teach a machine to know, and given the hysterical responses a lot of people have (not to mention the malignant behaviour of the police these days), we would have to teach it to be accurate, with a low rate of false positives, yet protect children by catching everything that is bad. In other words: waffle-waffle-magic-waffle-done. There, that was easy, in government land.
Meanwhile, in reality...
The problem with a hash is that it is a mathematical equivalence
Aside from the special case of perfect hashes, no, it isn't. Lossy hash schemes (i.e., almost all of them) will, by definition, tolerate some change in the input. The hash currently used for this nonsense, PhotoDNA, is meant to tolerate things like scaling, compression, and relatively minor changes to color, cropping, and so forth.
How well it does so is one question, but there are far more interesting ones, of course. Like how generating PhotoDNA hashes and comparing them against a large database could be implemented efficiently on client devices, for example. (It can't.) Or what guarantees people flagged by false positives would have against excessive response. (None, that's what they'd have.) Or how we could trust client applications that have any mechanism for reporting anything to "authorities" somewhere. (We can't.) Or how much effect this would have on the problem. (Very little.)
believe that it is possible to implement end-to-end encryption in a way that preserves users' right to privacy
They are probably envisaging a "trusted third party" in the middle doing the scanning, with end-to-end encryption connecting both ends to the middle. This is fine as long as the "trusted third party" can be relied on to rigidly and securely perform the required task and no more. Unfortunately it can't be relied on to do anything like that. The level of security required to protect the users' right to privacy would make the trusted third party resemble an opaque box. Mission creep would then secretly extend the monitoring criteria, turning it into an "untrustable third party", at which point you might as well rename it a CESG monitoring point.
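Sketching what that topology actually is makes the problem obvious: two separate encrypted legs, with the plaintext necessarily sitting in the middle (PyNaCl again, all keys invented for illustration):

```python
# "Trusted third party" scanning: two E2E legs, plaintext in the middle.
# Every privacy guarantee now rests on what the middle box actually does.
from nacl.public import PrivateKey, Box

alice, ttp, bob = (PrivateKey.generate() for _ in range(3))

# Leg 1: Alice -> scanner
leg1 = Box(alice, ttp.public_key).encrypt(b"hello")

# At the scanner: decrypt, "scan", re-encrypt.
plaintext = Box(ttp, alice.public_key).decrypt(leg1)

# Leg 2: scanner -> Bob
leg2 = Box(ttp, bob.public_key).encrypt(plaintext)
```

It is "end-to-end" encrypted only if you accept the scanner as an "end".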
The obvious choice to do the monitoring would be the pron merchant that the UK government was going to use to verify age for access to <cough> 'adult' sites. (They probably wouldn't need to worry about using hashes.)
Hmm, wonder what happened to that bit of legislation... sorry, 'box ticking'
Come up with something so vile that nobody can question it, then destroy privacy in the name of stopping it. Once the capability is developed, it WILL be used to spy on any and all communications. While there is security to be had in anonymity, that only works until the powers that be decide to take an interest in you. All it takes to get someone interested is to cut the wrong person off in traffic.
Of course no dodgy user would ever avoid official implementations of encrypted chat. /s So eventually open source coders come up with multiple client-side implementations. Said coders live in a country that does not support general snooping, and remain anonymous for their own safety? Existing chat coders are in what legal position? Next, support for complete packet analysis looking for unapproved encrypted data streams? It's as if the TLAs have plans for big data retention and real-time analytics, and bought the right pollies. Regardless, asking Big Tech to snoop is another fox-guarding-the-henhouse scenario.