Tech world may face huge fines if it doesn't scrub CSAM from encrypted chats

Tech companies could be fined $25 million (£18 million) – or ten percent of their global annual revenue – if they don't build suitable mechanisms to scan for child sex abuse material (CSAM) in end-to-end encrypted messages and an amended UK law is passed. The proposed update to the Online Safety bill [PDF], currently working …


  1. Flocke Kroes Silver badge

    Re: Nobody can sensibly deny that this is a moral imperative

    Nice of the home secretary to openly admit that she thinks I am nobody. I am shocked at her honesty and fully expect her to be pressured by her peers into a prompt resignation.

    1. jmch Silver badge

      Re: Nobody can sensibly deny that this is a moral imperative

      "Moral Imperative" = "of highest importance"

      Nobody can deny that preventing child (or indeed, any) sex abuse is a moral imperative.

      Nobody can deny that allowing people to communicate privately is a moral imperative.

      The Home Sec's job, like that of many politicians, is to balance dozens of moral imperatives against each other. That's why it's a hard job, and not one that should be assigned to fuckwits.

      Incidentally, also...

      Nobody can deny that children having a roof over their heads is a moral imperative

      Nobody can deny that children having enough to eat is a moral imperative

      etc

      See if the current government gives a flying f**k about any of that

      1. Anonymous Coward
        Anonymous Coward

        Re: Nobody can sensibly deny that this is a moral imperative

        She's just making a blanket false claim there: "Brits will share child porn if we cannot spy on everyone." There is no moral imperative for a fiction she created.

        She's variously changed the tune from "Terrorists" to "National Security" and now to "Pedos" as the reason for backdooring end-to-end encryption.

        1. Alan Brown Silver badge

          Re: Nobody can sensibly deny that this is a moral imperative

          The best response to "Think of the children!" is "Jimmy Savile always did!"

          The worst predators tend to operate in plain sight, usually posing as stalwart pillars of the community.

          After all, you're NOT going to entrust your kids to the dirty raincoat brigade or a bunch of heavily tattooed gangbangers - but you probably won't think twice about letting them hang out at a church social group, etc

          (Ironically, the heavily tattooed harley-riding gangbangers are likely to be extremely protective of kids, etc - as are almost all "screaming queens" I've known in my life)

          1. jmch Silver badge

            Re: Nobody can sensibly deny that this is a moral imperative

            "Ironically, the heavily tattooed harley-riding gangbangers are likely to be extremely protective of kids, etc"

            Don't judge a book by its cover and all that. Any large enough group of people, whether that be tattooed bikers, football fans, a church social group, the Rotary club, etc.*, is sure to contain a fair number of decent people, a few truly excellent dudes/dudettes, and a handful of obnoxious wankers.

            *except parliament where the proportion of obnoxious wankers is rather higher

        2. James 139

          Re: Nobody can sensibly deny that this is a moral imperative

          It's that they seem to keep getting it backwards.

          "Masks are now a personal choice, we trust the public will do the right thing", loads of people stop wearing masks immediately, even when places ask them politely to keep doing so.

          "If we don't spy on everyone, they will all immediately do <insert vile act here>", yet almost no one will do it, because they just won't.

    2. Anonymous Coward
      Anonymous Coward

      Re: Nobody can sensibly deny that this is a moral imperative

      Another "problem" of minuscule size that requires a nuclear weapon dropped from orbit as if "it's the only way".

  2. heyrick Silver badge

    Amusing article

    There's no mission creep here, we're only interested in dealing with THINK OF THE CHILDREN.

    And the article ends with two other potential targets, evidence of creep and how such a scheme could easily be expanded for "subversive" content.

    Uh-huh.

  3. Anonymous Coward
    Anonymous Coward

    Irrelevant really though, isn't it ?

    Fucked if I'm letting "approved by Priti Patel" encryption handle anything of mine before I encrypt it myself.

    1. John69

      Re: Irrelevant really though, isn't it ?

      Exactly how widely that will be adopted is a question, but it will certainly be the MO of kiddie porn flingers.

    2. Trigonoceps occipitalis Silver badge

      Re: Irrelevant really though, isn't it ?

      "We, and other child safety and tech experts, believe that it is possible to implement end-to-end encryption in a way that preserves users' right to privacy ... "

      Says Priti Patel BA Economics (University of Keele)

    3. bombastic bob Silver badge
      Big Brother

      Re: Irrelevant really though, isn't it ?

      what would happen if you use end-end encryption to send encrypted files? Just keep adding layers until "they" throw their hands in the air and give up.

      1. Anonymous Coward
        Anonymous Coward

        Re: Irrelevant really though, isn't it ?

        They would simply demand access to the file before encryption, via a backdoor to PGP (or your encryption of choice).

        And the fact that you are attempting to bypass the government's right to all your data marks you out as an obvious evil-doer... lock him up, immediately!

  4. John69

    If they can do why do they not tell us how?

    "We, and other child safety and tech experts, believe that it is possible to implement end-to-end encryption in a way that preserves users' right to privacy, while ensuring children remain safe online." They believe this, but refuse to say what leads them to believe this. Open source implementations of E2E encryption have been around for ages, if it was possible then they could easily demonstrate it.

    1. Captain Hogwash

      Re: If they can do why do they not tell us how?

      Client side scanning prior to encryption is what she's talking about.

      1. Wellyboot Silver badge

        Re: If they can do why do they not tell us how?

        Indeed. Monitor everything everyone does, so that scanning the actual communication being sent becomes moot; they'll already know everything.

        You'd think they weren't already tapping all the telemetry sent to the OS mothership.

        Edit: someone disagrees with the captain's accurate summing up!

      2. ClockworkOwl
        Thumb Down

        Re: If they can do why do they not tell us how?

        Actually, she hasn't got a clue what any of it really means at all...

        They stopped trying to be rational when they kept getting the "this won't work" response, so now they just want to bully everybody into compliance without having to provide a solution... "It's the LAW!"

        Given the current debacle in parliament, how she has the cheek to talk about "moral imperative" I cannot fathom.

      3. Flocke Kroes Silver badge

        Re: Client side scanning

        OK, let's try this. First I will need to gather a collection of images including CSAM and have it tagged by cheap labour so I can train my AI. Next, to prove that I am forwarding only CSAM to Priti Patel, I have to publish my dataset.

        Is any part of that legal?

        1. Spazturtle Silver badge

          Re: Client side scanning

          Microsoft already maintain a database full of neural hashes of CSAM which is the one everyone uses.

          1. Captain Hogwash

            Re: Client side scanning

            Although whether or not this is actually the kind of material they will be looking for is uncertain. Even if it is, other targets may exist for the next administration, or the next, or the next, etc.

            1. Anonymous Coward
              Anonymous Coward

              Re: Client side scanning

              It's not just 'the next administration'... you've also got other repressive regimes, hostile foreign entities, right the way down to hackers and, erm, faecebook and the like

          2. Richard 12 Silver badge

            Re: Client side scanning

            And is therefore utterly useless, because nobody has any idea what is actually in it.

      4. gnasher729 Silver badge

        Re: If they can do why do they not tell us how?

        Client-side scanning, plus not sending or receiving messages that are deemed illegal without further action, and a way for the user to contest and resend something they believe has been flagged incorrectly. Like a picture of the Virgin Mary and Baby Jesus that could easily be mistaken for something else.

    2. Roland6 Silver badge

      Re: If they can do why do they not tell us how?

      >They believe this, but refuse to say what leads them to believe this.

      Box ticked; parents can sleep whilst the children surf the web.

      Which immediately identifies the flaw in the statement: the idea that the first part, i.e. end-to-end encryption, has any meaningful impact on children being safe online.

      End-to-end encryption won't stop what happened at Disney's Club Penguin.

      1. Alan Brown Silver badge

        Re: If they can do why do they not tell us how?

        "Boxed ticked, parents can sleep whilst the children surf the web."

        Problem #1: over 1/3 of detected sexual offenders are under the age of 18 and equally distributed between genders

        Yes, really

        Let's not forget James Bulger. For all the outcry, that type of case isn't _particularly_ unusual when you look at history; it has only become rarer more recently.

      2. Bartholomew Bronze badge

        Re: If they can do why do they not tell us how?

        > won't stop what happened at Disney's Club Penguin

        Had no idea what that was; had to look it up on the BBC news website: "Disney forces explicit Club Penguin clones offline". The original website was designed specifically to target children aged 6 to 14 - I wonder why The Walt Disney Company needs to keep on shutting these websites down *ponder*

        Club Penguin - online: 2005-10-24, offline: 2017-03-30, was replaced by Club Penguin Island

        Club Penguin Island - online: 2017-03-29, offline: 2018-12-20; created a vacuum (that was quickly filled by clones) when shut down.

    3. Anonymous Coward
      Anonymous Coward

      Re: If they can do why do they not tell us how?

      It is trivial: you encrypt one copy of the E2E message with your private key and the recipient's public key. Then, to comply with the law, you encrypt a second copy of the E2E message with your private key and a personal GCHQ/CSAM/government public key, and send them that copy. (They would then get computers to automatically scan it using neural networks trained on existing CSAM, and a human would only be allowed to access any message with an actual court order issued by a judge.)

      Of course this would only work if people had locked-down devices that could only execute the government-mandated E2E communication application(s) and had no ability to run any unsanctioned applications (no matter how trivial they may be to create - in case someone reading this post does exchange CSAM, I'm not going to explain how). The mentally damaged individuals who own and send CSAM to each other would obviously use the government-mandated E2E communication application(s), because they are severely mentally damaged individuals? Just like the people in the government who created the online safety bill.

      Maybe the solution is to start simple, implement the application for governments to test first for say 50 years. If anyone in the government is caught not using the application, they can serve some jail time. And every message sent by everyone in government is decrypted and made publicly available after say 20 years.

    4. Peter2 Silver badge

      Re: If they can do why do they not tell us how?

      Ok, i'll bite.

      There is already a child abuse image content list available which includes hashes of child porn images. To be compliant, all you'd have to do on the client end, when somebody attaches or receives an encrypted image, is check the image's hash against the list of known child porn hashes and, if a match is found, flag it up to the police.

      That would be totally compliant with this law, it could only inconvenience people attaching images on the child abuse image content list to encrypted messages, and it leaves end-to-end encryption intact.

      In fact the only potential for scope creep that I can see would be the police asking to keep a list of the hashes attached to messages, so that after they've raided a paedophile and got an extra few hundred/thousand images to add to the list, they could retrospectively check it to pick up anybody else sharing the same material. Even if this were done, a list of MD5 hashes presents quite a limited threat to privacy or freedom of expression.
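      The scheme described above is simple enough to sketch. A minimal, hedged Python illustration: the blocklist entry here is made up for the demo (real lists such as the IWF's are not public, and use perceptual rather than plain cryptographic hashes):

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known illegal images.
# The single entry below is the digest of b"test", used as a stand-in;
# real blocklists are distributed privately and are far larger.
BLOCKLIST = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def should_flag(attachment: bytes) -> bool:
    """Return True if the attachment's hash appears on the blocklist."""
    digest = hashlib.sha256(attachment).hexdigest()
    return digest in BLOCKLIST

# The check runs before encryption, so E2E encryption itself is untouched.
print(should_flag(b"test"))        # True: digest is on the (toy) list
print(should_flag(b"holiday.jpg"))  # False: not on the list
```

      Note the check happens client-side, before the message is encrypted; nothing about the transport encryption changes, which is exactly why replies below point out it is also trivial to bypass.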

      1. Steve Graham

        Re: Ok, i'll bite.

        Trivial to circumvent. Try again.

        1. genghis_uk

          Re: Ok, i'll bite.

          Any solution is trivial to circumvent by pre-encrypting the image before you send it over an E2E channel.

          The point is to be seen to obey the letter of the law to avoid fines.

          This is all performative nonsense by a bunch of politicians who don't understand mathematics or engineering, so I can't see anything other than a performative response.

          Australia banned encryption that cannot be backdoored a while ago (basically telling engineers to 'nerd harder' when they said the mathematics would not allow it) but I have not seen anything to say this has ever been enforced - maybe I missed it?

          1. heyrick Silver badge

            Re: Ok, i'll bite.

            Maybe somebody took a politician aside, smacked them across the head with a didgeridoo, and pointed out that not only does proper encryption not have back doors (kind of the point unless they want to try legislating new laws of mathematics), but actually enforcing their dumb law would essentially shut down online banking, purchasing, pretty much anything to do with money, and all the supposedly secure stuff on websites.

            In other words, get a clue galah.

            1. Someone Else Silver badge

              Re: Ok, i'll bite.

              [...] (kind of the point unless they want to try legislating new laws of mathematics), [...]

              Seems they tried that once in Indiana. It didn't end well....

              1. Michael Wojcik Silver badge

                Re: Ok, i'll bite.

                It ended just fine: the bill never got out of committee.

                And they didn't "try legislating new laws of mathematics". There was a bill to "recognize a contribution" to mathematics. The fact that said contribution (squaring the circle, of course) was rubbish is what ended up dooming the bill.

                Now, it might have made it out of committee and to the floor had a Purdue professor not happened by and been invited to review it. And it might even have passed. Legislatures pass all sorts of rubbish no-effect bills like that: recognizing some personage of minor import, establishing State Whatever Day or Official State Nonsense, and so forth. These might be "laws" in a notional sense but have no real-world effect; they're just posturing.

                (There are many discussions of foolish or odd laws. Unfortunately most of them are themselves rubbish, recounting anecdotes without any attempt at verifying them from primary sources. I recommend Underhill's The Emergency Sasquatch Ordinance as an exception to that unfortunate trend; he did the research and provides citations. Also he's a better writer than most of the others.)

        2. Roland6 Silver badge

          Re: Ok, i'll bite.

          >Trivial to circumvent.

          But good enough for this bunch of politicians to tick the box and move on.

        3. Peter2 Silver badge

          Re: Ok, i'll bite.

          The point is that if a law requires it to be done; that's a method of doing it.

          Ok, it's trivial to circumvent by editing the files so the MD5 hashes are different each time, or by a number of other methods. It still complies with the law. If you kept a list of the MD5 hashes, then when the police nick a paedophile and go through their stash of images they get a bunch of new MD5 hashes which could be compared against the file-sharing history, giving you a list of other paedophiles who'd shared those files.

          If I was a policeman I think I'd probably be happy with that.

          While you probably couldn't prevent anybody from circumventing the checks if they are done on the client side, you could probably detect that the child porn filter has been disabled by various methods; I can think of a few off the top of my head. One suspects that the National Crime Agency would be just as happy with occasional lists of people detected circumventing it, as that has to be reasonable cause for a search warrant.

          I don't think that either the police or politicians expect perfection, just some good faith efforts.

          1. Michael Wojcik Silver badge

            Re: Ok, i'll bite.

            The Microsoft CSAM hash database doesn't use MD5 or any other cryptographic hash. It uses PhotoDNA hashes, which are intended to produce the same result under a variety of transformations.

            That also reduces its precision and increases the false-positive rate, of course. You can't have it both ways. Nor is it proof against all transformations, and automating applying a series of transformations until you get a different PhotoDNA result is an obvious easy attack on the system.

            There are other issues with using a large PhotoDNA hash database for client-side scanning, such as the size of the database and the computational requirements.

            The whole idea is idiotic and typical political pandering.

      2. Captain Hogwash

        Re: If they can do why do they not tell us how?

        To be compliant, all you'd have to do on the client end when somebody attaches or receives an encrypted ~~image~~ file of any type is to check the ~~image~~ file-of-any-type hash against a list of known ~~child porn~~ files-we're-looking-for hashes, and if a match is found then flag it up to the police.

      3. heyrick Silver badge

        Re: If they can do why do they not tell us how?

        "available which includes hashes of child porn images"

        The problem with a hash is that it is a mathematical equivalence. Is this picture the same as that picture?

        Well, couldn't that essentially be broken by scaling the image, say, 5% either way? Or compressing it a little more? Or gently messing with the colours? It wouldn't take much ingenuity at all to batch convert a bunch of images from known matches to unknowns.

        Plus, with only a hash and no actual image to work with, how does one train a machine to recognise such a thing in the first place? It'll be like that judge who said he couldn't define pornography, but he'd know it when he saw it. Well, we would have to teach a machine to know, and given the hysterical responses a lot of people have (not to mention the malignant behaviour of the police these days) we would have to teach it to be accurate, with a low rate of false positives, yet still protect children by catching everything that is bad. In other words: waffle-waffle-magic-waffle-done. There, that was easy, in government land.

        Meanwhile, in reality...
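        The objection about exact hashes is easy to demonstrate: with a cryptographic hash such as MD5, changing even a single byte gives a completely different digest (the avalanche effect). A toy illustration; the byte strings are placeholders, not real image data:

```python
import hashlib

original = b"...pretend these are image bytes..."
# Re-saving, re-scaling, or recompressing an image changes at least one
# byte; here we simulate that by altering just the final byte.
tweaked = original[:-1] + b"!"

h1 = hashlib.md5(original).hexdigest()
h2 = hashlib.md5(tweaked).hexdigest()

# A one-byte change yields an entirely different digest, so an
# exact-match blocklist no longer recognises the file.
print(h1 == h2)  # False
```

        This is precisely why real scanning systems use perceptual hashes rather than cryptographic ones, as the reply below explains.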

        1. Michael Wojcik Silver badge

          Re: If they can do why do they not tell us how?

          The problem with a hash is that it is a mathematical equivalence

          Aside from the special case of perfect hashes, no, it isn't. Lossy hash schemes (i.e., almost all of them) will, by definition, tolerate some change in the input. The hash currently used for this nonsense, PhotoDNA, is meant to tolerate things like scaling, compression, and relatively minor changes to color, cropping, and so forth.

          How well it does so is one question, but there are far more interesting ones, of course. Like how generating PhotoDNA hashes and comparing them against a large database could be implemented efficiently on client devices, for example. (It can't.) Or what guarantees people flagged by false positives would have against excessive response. (None, that's what they'd have.) Or how we could trust client applications that have any mechanism for reporting anything to "authorities" somewhere. (We can't.) Or how much effect this would have on the problem. (Very little.)
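          As a toy stand-in for PhotoDNA (which is proprietary and not being reimplemented here), the simplest perceptual scheme, "average hash", shows how a lossy hash can tolerate a small change that would completely alter a cryptographic digest:

```python
# Toy "average hash" (aHash): threshold each pixel against the mean
# brightness to get one bit per pixel. This only illustrates the idea
# of a lossy, change-tolerant hash; it is NOT PhotoDNA.

def average_hash(pixels):
    """pixels: 8x8 grid of 0-255 grayscale values -> 64-bit int."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

# A synthetic 8x8 gradient "image"...
img = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
# ...and a copy with one pixel nudged, as mild recompression might do.
tweaked = [row[:] for row in img]
tweaked[0][0] += 3

d = hamming(average_hash(img), average_hash(tweaked))
print(d)  # 0: the tweaked image still matches exactly
```

          The flip side, as noted above, is exactly this tolerance: the looser the match, the higher the false-positive rate, and an attacker can still iterate transformations until the distance exceeds whatever threshold is used.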

    5. Persona Silver badge

      Re: If they can do why do they not tell us how?

      believe that it is possible to implement end-to-end encryption in a way that preserves users' right to privacy

      They are probably envisaging a "trusted third party" in the middle doing the scanning, with end-to-end encryption connecting both ends to the middle. That's fine so long as the "trusted third party" can be relied on to perform the required task, rigidly and securely, and no more. Unfortunately it can't be relied on to do anything of the sort. The level of security required to protect users' right to privacy would make the trusted third party resemble an opaque box. Mission creep would then secretly extend the monitoring criteria, turning it into an "untrustable third party", at which point you might as well rename it the CESG monitoring point.

      1. Anonymous Coward
        Anonymous Coward

        Re: If they can do why do they not tell us how?

        The obvious choice to do the monitoring would be the pron merchant that the UK government was going to use to verify age for access to <cough> 'adult' sites. (They probably wouldn't need to worry about using hashes.)

        Hmm, wonder what happened to that bit of legislation... sorry, 'box ticking'

      2. Michael Wojcik Silver badge

        Re: If they can do why do they not tell us how?

        That would make it not end-to-end encryption, so it fails to achieve their stated aim.

        Of course it would succeed at their actual aim, which is to outlaw end-to-end encryption.

  5. Anonymous Coward
    Anonymous Coward

    The way to attack -

    Come up with something so vile that nobody can question it, then destroy privacy in the name of stopping it. Once the capability is developed, it WILL be used to spy on any and all communications. While there is security to be had in anonymity, that only works until the powers that be decide to take an interest in you. All it takes to get someone interested is to cut the wrong person off in traffic.

  6. John70

    Testing

    I suggest that testing should be done on ministers' encrypted chats first.

    Seems to be plenty of perverts and sex pests in West Minister.

    1. Mishak Silver badge

      Re: Testing

      If we're lucky, that may lead them to conclude the "false-positive" rate is too high and scrap the whole idea.

  7. Neil Barnes Silver badge

    Way to go, Priti

    In the midst of a complete government melt-down, start to assume that you can legislate world-wide. Just keep believing six impossible things before breakfast...

    And don't let the door catch you on the arse on your way out.

    1. Wellyboot Silver badge

      Re: Way to go, Priti

      The name on the door might change, the attitudes within do not.

      When in power 'Think of the children', when in opposition 'Oppose Big Brother'.

      1. Flocke Kroes Silver badge

        Re: Way to go, Priti

        Mostly right, but opposition - no matter the colour - have consistently provided sufficient support for a surveillance state.

  8. Denarius Silver badge

    Another Clipper Chip episode

    Of course no dodgy user would ever avoid official implementations of encrypted chat. {S} So eventually open source coders come up with multiple client-side alternatives. Said coders live in a country that does not support general snooping, and remain anonymous for their own safety? And what legal position are existing chat coders in? Next, support for complete packet analysis looking for unapproved encrypted data streams? It's as if the TLAs have plans for big data retention and real-time analytics and bought the right pollies. Regardless, asking Big Tech to snoop is another fox-guarding-the-henhouse scenario.

  9. sitta_europea Silver badge

    I do rather like the fox guarding henhouse analogy.
