Client-side content scanning is an unworkable, insecure disaster for democracy

Fourteen of the world's leading computer security and cryptography experts have released a paper arguing against the use of client-side scanning because it creates security and privacy risks. Client-side scanning (CSS, not to be confused with Cascading Style Sheets) involves analyzing data on a mobile device or personal …

  1. Anonymous Coward
    Anonymous Coward

    This whole "CSS" fiasco is a disaster only the RIAA and MPAA could love... and you KNOW that is why Apple is doing it, not to "protect the children."

    1. Anonymous Coward
      Anonymous Coward

      Apple gets 3 birds with one stone

      Sell a device; the overpaying consumer is happy with the shiny toy, Apple is happy with the margins, and host governments are happy with the ability to conduct mass surveillance. Then Apple repeats the cycle.

      Client-side scanning - scan your stuff for content deemed 'inappropriate'?! How did we slide into this nonsense?

      "Give me six lines written by the most honorable of men, and I will find an excuse in them to hang him."

      Cardinal Richelieu, 400 years ago.

      1. elsergiovolador Silver badge

        Re: Apple gets 3 birds with one stone

        "Give me a man and I will find the crime" - Andrey Vyshinsky (a state prosecutor of Joseph Stalin's Moscow Trials and in the Nuremberg trials)

    2. trisul

      Apple has its own agenda

      I think you misunderstand Apple if you think they're just servants to other industries' agendas. No, they have their own. I believe their motivation was to create a mechanism that replicates what Google, Facebook and others are already doing, but keeping it on the device to be able to claim it is under user control.

      Amazingly, they were immediately attacked, and the critics seem more willing to accept Google and Facebook doing it on the server, with no user control at all, than Apple having it done on the user's device, where some control is at least possible.

      It was a PR fiasco for Apple, just like the battery issue, where they simply made it possible for the iPhone not to crash when the battery weakened, but never told anyone this was being done.

      I'm not defending the scheme, it has been proven to be faulty. Not as faulty as what Google and Facebook are already doing today, but nevertheless dangerous.

      1. doublelayer Silver badge

        Re: Apple has its own agenda

        I agree, and yet think you're wrong about the others. I would rather have Google and Facebook comb through my data on their servers than Apple comb through my data on their devices, because I go to lengths not to put any data on Facebook or Google's servers. For that matter, I also put very little data on Apple's servers. That's where I can exert my control, by not allowing things on other people's servers. If they run it on things I own and use, they have much more access to the place where my data really is, and I have less ability to know what is available to be analyzed and what will happen to it. It's not like they were going to offer a "Do you want all your stuff scanned" switch.

        1. Gordon 10

          Re: Apple has its own agenda

          @Doublelayer & @Hayrick.

          Unfortunately you are both starting from the wrong premise - that the average user cares about the difference between client-side and server-side scanning.

          They don't. Technical implementations aside - the average user has nearly always made the wrong choice when it comes to accepting intrusions vs convenience.

          Whilst your arguments appeal to the techies and the enlightened, I see no reason why they would appeal to the average user; thus you are shouting into the void.

      2. heyrick Silver badge

        Re: Apple has its own agenda

        Downvote because there's a big difference between something that a user chooses to share with the world (whether it be a supposedly private copy on a cloud server or spewed onto Facebook) and something that the user keeps private on their own personal device.

      3. Il'Geller

        Re: Apple has its own agenda

        They all have the same agenda: how to find the best answer in tons of texts, the same as at NIST TREC QA. Only one answer, and in its context, only one! Further, the answer should be sold, which is easy.

        Thus Apple, as well as Microsoft, IBM, OpenAI, GumGum and a few thousand more companies, are trying to gain each user's texts, distill patterns and sell them. Apple is not any different from the rest ...

  2. Anonymous Coward
    Anonymous Coward

    "Is it prudent to deploy extremely powerful surveillance technology that could easily be extended to undermine basic freedoms?"

    Of course it is, if that's your goal.

  3. Anonymous Coward
    Anonymous Coward

    Chip shortages

    Apple cut production targets for the new iPhone "due to chip shortages".

    Don't make unsubstantiated [blanket accusations of wrongdoing] against your [customers/voters/society], in order to justify [intrusive mass surveillance by the few against the many], lest you suffer [generic excuse of the day].

    This is true whether it's Apple, or Janet Yellen.

  4. Neil Barnes Silver badge

    Isn't it wonderful

    That our western technology leaders, encouraged perhaps by our politicians, are so keen and eager to leap headlong into the surveillance society which they claim to so abhor when practised by, say, the CCP?

    This bollocks would have made the Stasi proud to have invented it.

    1. elsergiovolador Silver badge

      Re: Isn't it wonderful

      Power is extremely addictive. Sometimes one gets hooked from the first decree and then just wants more and more. Then they dream about citizens conspiring against them, and they need to know who and when and how. It turns into an obsession. Eventually it ends up in mass graves.

      I think when a government looks at this type of surveillance, it's a sign they need to be sacked and sent to rehab (and never let near power again, like alcoholics near the booze).

      1. Anonymous Coward
        Anonymous Coward

        Re: Isn't it wonderful

        Trouble is, what if said alcoholic is the breadwinner, and lives depend on the drunk drinking, or he'll become an angry knurd and start beating on the wife and kids? Seems you lose either way. Similarly if the country in question is an industrial superpower supporting hundreds of millions of people. Sounds to me like a lot of people are going to die as a result.

    2. ThatOne Silver badge

      Re: Isn't it wonderful

      > they claim to so abhor when practised by, say, the CCP

      They are just jealous the CCP has so much power and they haven't. It's like when you sit in your old wheezy family sedan and see a sports car zoom past...

      That's why they all rush to copy everything the CCP does.

    3. This post has been deleted by its author

    4. Loyal Commenter Silver badge

      Re: Isn't it wonderful

      How else are they going to ban mining bots from Eve Online?

      Oh, you meant the other CCP.

      As you were...

    5. Ken Hagan Gold badge

      Re: Isn't it wonderful

      The best bit, surely, is that the victims pay for the infrastructure and think it's an essential part of their life.

      1. Falmari Silver badge

        Re: Isn't it wonderful

        @Ken Hagan "The best bit, surely, is that the victims pay for the infrastructure"

        Not just the infrastructure; they also pay the running costs. The scan runs on the victim's device, so it is using their resources. Also, the results from the scan will be sent with the image, using the victim's data allowance.

        1. Paul Hovnanian Silver badge

          Re: Isn't it wonderful

          Under the CCP, you pay for the bullet. So what's new?

    6. Cliffwilliams44 Bronze badge

      Re: Isn't it wonderful

      For years after 9/11 the Left has crowed and crowed about intrusive surveillance on private citizens, and rightly so!

      But now that they see that this surveillance can be used against their political enemies, well that is just fine and dandy!

      1. veti Silver badge

        Re: Isn't it wonderful

        Downvoted for treating "the Left" as a monolith. You're talking about scores of factions, each with different perspectives and priorities. Of course they don't all agree on everything; the wonder is if they can all be persuaded to agree on anything.

        (Side note, this is why trolling your enemies is tactically stupid. It unites them like nothing else could. Trump proved that.)

        1. Charles 9

          Re: Isn't it wonderful

          Thing is, it was still a very close thing, as trolling the enemy ALSO rallied the base to a greater degree than just four years ago. IOW, for the GOP, "pwning da Libs" is its own draw. A narrow shift here and there and Trump would've been re-elected. Now the GOP is weighing the scales so as not to leave things to chance next time, while still rallying the base. At some point, they're gonna stop caring about rallying the enemy... the enemy simply won't be able to vote.

          Basically, trolling your enemies is only tactically stupid if it rallies the enemy MORE than it rallies the base. If the latter is true, you come out ahead anyway.

  5. Adrian 4


    Listen to experts ?

    No. We listen to vendors now.

  6. Anonymous Coward
    Anonymous Coward

    More Misdirection???

    Last time I looked, the typical smartphone has huge amounts of both read-only and read/write memory -- plus huge amounts of CPU power in four or eight CPUs.


    How do I know that this debate is not pure misdirection -- and CSS is ALREADY embedded in my smartphone?


    Quote (William Burroughs): "The paranoid is a person who knows a little of what is going on"


    P.S. My mobile is a ten-year-old 2G feature phone.....

    P.P.S. I seem to recall rumours (!) that the NSA had snooping technology embedded in Cisco devices......just saying!

    1. Anonymous Coward
      Anonymous Coward

      Re: More Misdirection???

      On the new Android 11, the day after taking some pictures of an extremely orange CA sunset, I got a notice that the Android smart-something had "improved" one of my pictures, and "did I like it?".

      That's not CSS because I wasn't uploading anything - but yes the minders are already in the kitchen helping themselves to whatever is in the fridge. Fact of life, full stop.

      1. Denarius

        Re: More Misdirection???

        In the current SMS spam deluge on a near-new phone, I sometimes get Google alerts that an incoming message is spam. How does Google know that, if I have not set up SMS scanning, sharing or anything else with them and am logged out? If the telco was checking, I could understand that they might note unusual volumes coming from random numbers, but a remote entity? Any suggestions for an Android mail client that is not trying to send everything, like address books, to Google or otherwise snoop? Even willing to spend money.

        1. Dan 55 Silver badge

          Re: More Misdirection???

          K9 Mail or FairEmail on FDroid are probably what you're looking for.

          As for SMS, you would probably need to disable the read SMS permission on Google Play Services and use an alternative SMS client if the current one is Android Messages, e.g. Signal (which also includes an SMS client).

  7. Anonymous Coward
    Anonymous Coward

    Ideal for scammers... "We've scanned your device and found illegal pictures", alongside a link to Apple's posting of the initial idea as 'proof' that they had the ability.

  8. Omnipresent Bronze badge

    minority report

    Wait until the AI sends a droid after you.

    1. The Travelling Dangleberries

      Re: minority report

      Or a small drone carrying a payload of flying nanobots, each in turn carrying a tiny (but sufficient) payload of ricin, arriving at your home in the early hours of the morning.

      Sleeping with your bedroom window open might not turn out to be as good for you as you thought.

      1. Charles 9

        Re: minority report

        Oh, and if your house has a chimney in it, you're screwed, then?

  9. elsergiovolador Silver badge


    The problem is that this does not matter. If power hungry hacks want to scan your content they will.

    If expert advice does not align with the government / corporate desires, then experts are changed until the advice meets the government / corporate goal.

  10. Doctor Syntax Silver badge

    "Moreover, the issue is not just illegal content. In the UK, for example, the Draft Online Safety Bill contemplates a requirement to block legal speech that some authority finds objectionable."

    Well, why did anyone wonder why they wanted to take back control?

    1. Paul Hovnanian Silver badge
      Big Brother

      "a requirement to block legal speech that some authority finds objectionable"

      And just where to they think they get the authority to do suc~po_~{po ~poz~ppo\~{

      [NO CARRIER]

      1. Anonymous Coward
        Anonymous Coward

        Divine right?

  11. ThatOne Silver badge
    Big Brother

    OMG, is it too late already?

    > In the UK, for example, the Draft Online Safety Bill contemplates a requirement to block legal speech that some authority finds objectionable

    Gosh, so even supposedly not (too) repressive regimes are already moving to control stuff they deem "politically unacceptable" (i.e., dissenting or critical). It's not just some theoretical possibility, it's already happening on your very own doorstep, no need to live in N.Korea or whatever.

    Addressing this with pure technicalities ("only CSAM lists") won't help, of course, but I assume it wasn't meant to help, just to make a "we're doing everything we can" type of excuse possible.

    This has to be culled, immediately, before it becomes the norm. Before the terminally numb Facebookers start gibbering about not having anything to hide. We all have something to hide, considering that our very existence necessarily bothers somebody, somebody who might do something about it.

    1. Anonymous Coward
      Anonymous Coward

      Re: OMG, is it too late already?

      Regardless of whether I have anything to hide, I don't trust proprietary software, software which has behaviour that is deliberately nonreproducible, or software the conclusions of which I can't challenge quickly, effectively, and at no cost to myself. If software with any of those attributes is deployed for the purpose of determining whether something I have is something I ought to want to hide, I reject and refuse it absolutely and unconditionally. "I have nothing to hide" is not even wrong: it doesn't answer the charge.

      1. heyrick Silver badge

        Re: OMG, is it too late already?

        "or software the conclusions of which I can't challenge quickly, effectively, and at no cost to myself"

        Good luck getting a loan, or credit, or a new job. A scary amount of day-to-day stuff is processed by some black-box AI instead of actual people (because people need to be paid and a computer doesn't). Furthermore, it seems from various reports that these things are set to "reject by default", plus there's no knowing that a machine rejected you, no ability to know why, and ultimately no accountability. Computer says no, so piss off (or we'll accept you, but at a rate that will screw you).

    2. Doctor Syntax Silver badge

      Re: OMG, is it too late already?

      "We all have something to hide"

      In fact, we have stuff we're contractually bound to hide. Show me someone who does anything online and thinks they haven't and I'll show you someone who clicked through the T&Cs without reading them.

      1. heyrick Silver badge

        Re: OMG, is it too late already?

        The government has already mandated that it's a crime to not hand over passwords on demand.

  12. yetanotheraoc Silver badge

    Black box

    It's that episode of Star Trek where the ship's computer was subverted and Kirk, on trial, was able to cross-examine the computer. Except in Apple's setup, there is no way to audit the findings.

    1. ThatOne Silver badge

      Re: Black box

      That's because Star Trek is (idealized) fiction, while Apple is fruit reality.

      Reality is always unforgiving and arbitrary.

  13. Il'Geller

    It is insanely, astronomically expensive to scan texts externally for further use, such as obtaining ad patterns. Indeed, all words of the texts must be annotated, and logical connections between patterns and parts of texts must be established, which costs absolutely incredible money. It is much cheaper to process texts on users' computers and then receive the patterns, for example for advertising, directly from them.

    1. doublelayer Silver badge

      Ah, you're back. I thought you left.

      Standard problems with your repeated comments apply: it's not text under consideration here, it's images. It's also of no relevance, as we're not talking about advertising and the problem is privacy.

      1. Il'Geller

        Any text in the AI system has the significance of advertising: it is delivered only to whoever wants to read it. Any image in the AI system is annotated with text and delivered based on 1) this text and 2) the image's specific characteristics.

        On the one hand, the problem of privacy in AI does not exist: a text can be prepared on a personal computer, becoming a set of incomprehensible, unreadable patterns. Thus, absolute confidentiality.

        On the other hand, the AI means total control over information: texts can easily and immediately be censored. No confidentiality at all.

  14. Cliffwilliams44 Bronze badge

    If you would not trust the NSA with this technology, why would you trust Apple?

    Just sayin'

  15. martinusher Silver badge

    Of course they're scanning for 'content'

    A computer is a machine; it knows nothing of 'child sex' content. It can be given a filter that looks at content and can decide whether the content belongs in a particular category according to that filter. But it still knows nothing about child sex; that's just a label we humans give to content that is identified by a filter. This should be obvious to everyone at Apple, so it should not come as a surprise when the definition of unlawful content gets adjusted to scan for unlawful political content or undesirable thought.

    They might also ponder the question "Quis custodiet ipsos custodes?" ("Who watches the watchmen?")

    1. Charles 9

      Re: Of course they're scanning for 'content'

      To which one has to ponder what happens if the answer comes back, "Testudines omnes descensus." ("Turtles all the way down.")...

  16. Anonymous Coward
    Anonymous Coward

    It just a regime tool

    The abuse angle is just the implementation vehicle

    Trust Apple, et al?


  17. nagi

    Nothing shows better how prone to false positives even Apple themselves think this tech is than their allowing a non-zero amount of 'allowable' detected material before reporting.

  18. Dave 15


    This is just not acceptable. One of the reasons to avoid any cloud storage at all. In fact these days corporations must be wondering whether ANY connection is actually sensible.

    1. Crypto Monad Silver badge

      Re: 1984

      Except this isn't about cloud storage. It's about the device in your hand being controlled by third parties. The device you paid for and own.

      Going forward, to be permitted to have such a device, you consent to all your use of that device being scanned and analyzed and reported back to HQ if found to be "anomalous".

      1. Anonymous Coward
        Anonymous Coward

        Re: 1984

        In that case, we're screwed, because not even rolling your own software can save you. Radio chips are by necessity regulated by the government (because they operate on a government-owned medium) so can be mandated to phone home at the physical level, no exceptions (because pirate or roll-your-own chips would be by definition illegal--and good luck getting the technical skills needed to make your own cell-compliant radio hardware without drawing attention--it's a select-enough club to be easily Big Brother'ed).

  19. Dasreg

    These pages always make me wonder about electronic voting

  20. heyrick Silver badge

    What worries me

    Is the idea that an image with minor changes can result in the same hash. Well, what exactly constitutes a "minor" change? When it comes to hashing, either it's the same thing or it isn't (collisions aside). When it comes to images, there's really only the concept of "similarity". You might have noticed, when looking for an image on Google or Bing, that their suggestions for similar images are pretty good, except for the ones that are so different you wonder how the hell anything thought they were similar.

    Now, let's do a little thought exercise. If I have a picture of a naked twelve year old (think of the children, etc) and I change a couple of pixels, it ought to hash as the same image, right? So I take that and change a couple of different pixels. Same image, yes? How many iterations should I go through in order to have a picture of my cat enjoying snow for the first time have the same hash as a naked child?
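    The "same hash despite minor changes" property comes from perceptual hashing, not cryptographic hashing. A purely illustrative sketch of the idea (a toy "average hash", vastly simpler than whatever Apple's NeuralHash actually does; the data and names here are my own invention):

    ```python
    # Toy perceptual "average hash": each pixel of a tiny greyscale image
    # becomes one bit, set if that pixel is brighter than the image's mean.
    # Nearby images collide by design; cryptographic hashes never should.
    import hashlib

    def average_hash(pixels):
        """pixels: flat list of 0-255 greyscale values (e.g. an 8x8 thumbnail)."""
        mean = sum(pixels) / len(pixels)
        return ''.join('1' if p > mean else '0' for p in pixels)

    original = [10, 20, 200, 210, 15, 25, 220, 230]   # stand-in 8-pixel "image"
    tweaked  = [12, 18, 203, 210, 15, 25, 220, 228]   # a couple of pixels nudged

    # The perceptual hashes match despite the changed pixels...
    assert average_hash(original) == average_hash(tweaked)

    # ...while a cryptographic hash treats them as totally different inputs.
    assert hashlib.sha256(bytes(original)).digest() != \
           hashlib.sha256(bytes(tweaked)).digest()

    print(average_hash(original))  # '00110011'
    ```

    Which is exactly the worry in the thought exercise above: because "similar enough" is a fuzzy, engineered property rather than bit-exact equality, adversarially chosen changes can walk an innocuous image toward a target hash.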

    1. Charles 9

      Re: What worries me

      That's what a Generative Adversarial Network can do, actually. I have to wonder if someone's willing to put a GAN to use to turn innocuous images into "illegal" ones...

    2. DevOpsTimothyC

      Re: What worries me

      Imagine I have the monkey selfie picture and I whiten the teeth a little, or change the eye colour from yellow to be more orange or more brown. From an MD5/SHA-256 hash point of view it's a completely different image. If I simply cropped the image or added a border, it would also generate a different MD5/SHA-256 hash. However, I think everyone would agree that it's essentially the same image.

      If I added clothes and a hat to the image, I think we'd all agree that it's altered enough to be a different image.

      While I don't know how such content fingerprints are made (for images, video or audio), I think we'd all agree that, with sufficient effort, algorithms could identify the first set of changes above as the same image, but the second set as a different image. After all, we can do fingerprint, facial and iris recognition, and you're never going to get identical pictures for those either.
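      A quick sketch of the first half of that point, namely why exact cryptographic hashes are useless for "essentially the same image": the avalanche effect means a one-byte edit flips roughly half the output bits (the "image" bytes below are obviously just a stand-in, nothing to do with how real fingerprinting schemes work):

      ```python
      # SHA-256 avalanche effect: a single-bit tweak to the input produces a
      # digest that differs in roughly half of its 256 bits.
      import hashlib

      image = bytearray(b'monkey-selfie-raw-pixel-data')  # stand-in image bytes
      edited = bytearray(image)
      edited[0] ^= 0x01                                   # "whiten" one pixel

      h1 = hashlib.sha256(image).digest()
      h2 = hashlib.sha256(edited).digest()

      # Count how many of the 256 digest bits differ between the two versions.
      diff_bits = sum(bin(a ^ b).count('1') for a, b in zip(h1, h2))
      print(diff_bits)  # roughly half of the 256 bits, for any such tweak

      assert h1 != h2
      ```

      Perceptual fingerprints deliberately sacrifice this property, which is both what makes them useful for near-duplicate matching and what opens the door to engineered collisions.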

  21. Anonymous Coward
    Anonymous Coward

    They don't care

    Apple knows all this, but they don't really care. All they want is to curry favor with the U.S. government in order to gain political support in the event China clamps down on them. Its management, being gay, feels a heightened sense of responsibility since many homosexuals believe they need to "turn" young boys to their orientation by introducing them to the pleasures of the flesh early on.

    1. Jimmy2Cows Silver badge

      Citation desperately needed

      Was with you on the first two sentences.

      Then you went off at a bizarre homophobic tangent. If you have evidence of such allegations please reference them, or better yet send them to the FBI.

  22. Zippy´s Sausage Factory

    Of course it's too late

    The fact that this exists won't be lost on the many authoritarian regimes who, I'm sure, will want to draft laws not only mandating it on all client devices in their country, but also using a list of content that they deem people shouldn't have. For example, in China, I'm sure this will include a certain picture of a man standing in front of a tank, or pictures of the Disney version of A. A. Milne's beloved bear.

    1984? No, that'll be banned too, because it shows that this sort of population monitoring results in the overthrow of said government...

  23. Hakhenaten

    CSS Technology does not exist

    The "The Risks of Client-Side Scanning" paper talks about CSS (client-side scanning) 'technology' as if it exists (and then has a big hissy fit over it). But it does not exist: it would have to be an OS service (likely internal and private) that could be employed by an entitled process (originating from a capitulating vendor, or inserted by an OS exploiter). However, CSS as an approach certainly exists. Apple's CSAM image-db match detection uses the CSS approach, but it is custom code. To redeploy that code (use a different neural-hash db, change the detection thresholds, change the command and control) you would have to do quite a lot of hacking, but then you've lost bulk distribution. In no way is it a matter of flipping a switch to redeploy it for a different purpose.

    Apple's CSAM image detection arrives via OS distribution, integral with the OS. That they deployed a CSS approach changes nothing. They could already have included code as a result of capitulation in previous versions, and they could do so in future. That they are tackling their obligation not to host CSAM images does not tell us if/when they will/have capitulate(d) to other jurisdictional pressures around the world. And if/when they do, it will be the inclusion and distribution of another chunk of custom code, targeting a different need in a different optimal way. It won't be trying to make a helicopter out of a boat.

  24. dave 76

    I'm normally fairly happy to upgrade to Apple's new releases when they come out but I've decided to not upgrade to iOS15 - which also means that I won't be replacing my iDevices like I had planned.

    Yes I know that this scanning has been put on hold, but I bet that it is still in the software and just requires a simple switch to turn on. I'm not happy to give them the opportunity.

    I am just one person with a couple of devices, but if sufficient people jump off the upgrade train at this point, it may force Apple to consider how it assures people that this functionality will not be slipped in the back door.

    1. Anonymous Coward
      Anonymous Coward

      Perhaps, but what if their math tells them they have enough and shouldn't risk government wrath?
