Apple stalls CSAM auto-scan on devices after 'feedback' from everyone on Earth

Apple on Friday said it intends to delay the introduction of its plan to commandeer customers' own devices to scan their iCloud-bound photos for illegal child exploitation imagery, a concession to the broad backlash that followed from the initiative. "Previously we announced plans for features intended to help protect children …

  1. Anonymous Coward
    Anonymous Coward

    "Apple would refuse such demands"

    Don't piss on my cornflakes and tell me it's raining

    1. MrDamage Silver badge

      It's true.

They will refuse such demands. Right up until such a demand arrives in the form of a work order, with payment offered.

      1. doublelayer Silver badge

        Re: It's true.

        Or on the door of their local office, with the locks changed. There are very few organizations that stand in the way of autocratic orders and set up their systems to facilitate doing so. Apple isn't one of them. Their record is better than some others, but far from perfect.

  2. mark l 2 Silver badge

    "Could governments force Apple to add non-CSAM images to the hash list?" the company asked in its interview of itself, and then responded, "No. Apple would refuse such demands and our system has been designed to prevent that from happening."

But Apple aren't creating these hash lists; it's done by a third party, and the list will be constantly updated as new images are identified by law enforcement. So it would be trivial for government agencies to create a new hash, claim it's of some abuse image, and get it added to the list Apple uses.
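To see why that matters, here is a minimal sketch (hypothetical, nothing like Apple's actual code) of a scanner fed an opaque hash list: a hash is just a number, and the matcher has no way to tell what image any entry came from.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for the real perceptual hash; an exact digest keeps it simple.
    return hashlib.sha256(image_bytes).hexdigest()

# The list arrives from a third party. One entry is a political meme someone
# slipped in -- from the scanner's side it is indistinguishable from the rest.
blocklist = {
    image_hash(b"known-abuse-image-1"),
    image_hash(b"known-abuse-image-2"),
    image_hash(b"inconvenient-political-meme"),
}

def scan(photo: bytes) -> bool:
    return image_hash(photo) in blocklist

print(scan(b"inconvenient-political-meme"))  # True -- flagged all the same
```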

    1. martinusher Silver badge

There is no inherent property of information called "nasty child porn". It's a label we -- we humans -- define for a particular class of information. So it's easily changed to scan for anything from 'nascent terrorist' to 'fluffy bunnies'.

Child porn seems to be the 'go-to' any time a rationale is needed to justify creating some additional governmental (or even extra-governmental) powers. It is almost as if the entire child porn universe was created with the express intent of providing such justification. (I hadn't heard of it for my first 60 years or so, then suddenly it's everywhere... has to be dealt with... imperative... future of civilization is at stake...)

      1. John Brown (no body) Silver badge
        Childcatcher

Clearly this is an attempt by the New World Order of Satan Worshipping Paedophiles to gather as much child porn for themselves as they possibly can while at the same time backdooring billions of Apple devices so they can control the sheeple!

    2. elsergiovolador Silver badge

      "Could governments force Apple to add non-CSAM images to the hash list?" the company asked in its interview of itself, and then responded, "No. Apple would refuse such demands and our system has been designed to prevent that from happening."

Take note of the careful, dishonest and manipulative language. They don't actually answer the question. Of course a government can force them to do it once they have this capability, and any system they have to prevent it would be deemed an obstruction.

      1. David 132 Silver badge

        To loosely paraphrase Pratchett, you can’t really say to a government “oh yeah, you and whose army?” - because they only have to point a finger out the window towards the barracks…

      2. tubedogg

        "Of course government can force them to do it once they have this capability"

This is the weirdest part of this to me. Everybody is acting as though no government in the world had the power or the will to force Apple to do scanning, on-device or otherwise, of anything they wanted prior to this. Yes, the fact Apple has a system for it makes a demand simpler. (Not that this kind of a system is that hard to build if you aren't terribly worried about making it extraordinarily accurate.) But the difficulty of compliance never prevented such a demand. To claim otherwise is to ignore how Apple behaves in China, which is to say following local law ("demand") and giving up control. As I mentioned in another comment, they don't even control their own iCloud servers there. After threatening to leave the market, they acceded to Russia's demand to preinstall certain apps, which is not something that happens anywhere else (except maybe, again, China) to my knowledge.

        "They don't actually answer the question."

        Just because you think that in the end, they wouldn't be successful in refusing, doesn't mean they didn't answer the question. The question was "Could governments do this?" and the response was "No." They answered the question, even if you don't believe the answer.

        1. doublelayer Silver badge

          They lied. That's not the correct answer and they know it. You've specified why we know it's a lie already--they've repeatedly agreed to use their tech for repressive countries before, so we know they will do it again.

          Which is why the on-device scanning is a problem. If Russia came to them and told them to write an on-device scanner, they'd grumble for a bit, maybe question it, then do it. But it would at least be public. Now, anybody can just slightly adjust the code they've already written. It's much easier to abuse and their use of it at all proves that their likelihood to grumble is quickly eroding.

      3. katrinab Silver badge
        Megaphone

        Or if you want an actual working example, the English courts forced ISPs to use their child porn blocking filters to also block access to the likes of PirateBay.

    3. Anonymous Coward
      Anonymous Coward

Bullshit. It isn't law enforcement that's creating the hashes from the images; it's a private, non-profit organization.

      1. tubedogg

        NCMEC is a quasi-governmental organization in the US. You are correct they are technically a private nonprofit organization, but they are, for example, the only group allowed to legally possess CSAM in the US—meaning they have a specific carveout in federal law. They are very closely linked with the US federal government.

That was the source of the initial claims that there were Fourth Amendment implications to the scanning. To be clear, my understanding is that there is absolutely no validity to those claims, because of the way this would be structured. And if there were, the entire system of scanning and reporting CSAM that already exists would be unconstitutional, and it has survived for ~25 years already.

        1. Anonymous Coward
          Anonymous Coward

          August just ended, and Apple is doing an about face, so I assume they saw the August sales figures and didn't like what they saw.

          This is what happens when you don't trust your customers, but do trust random quasi-governmental organisations with secret lists and NDA protected algos.

          Customers don't buy your kit.

It's all too late now; one of the creepy surveillance states (UK or Australia most likely) will pass laws requiring Apple to run that code with *their* secret list. Everyone thinks of China and middle eastern countries, but the initial enabler for privacy invasions is usually some fat miserable cow in a loveless marriage in a 5 Eyes country.

And Apple's computers too will suffer, because an iPhone is just a different type of computer using the same damn services.

WTF were they thinking? "Here, pay $1,000 for our device, you damn pedos."

    4. Trigonoceps occipitalis

I'm, thankfully, not inclined that way. If I was, surely this would make me try to produce my own porn and not just download existing pictures. Result: more children abused?

Of course, most users seem to be inadequate individuals, and it is criminals trying to make a profit. Special offer: pictures not sold elsewhere (not lying).

      1. Anonymous Coward
        Anonymous Coward

        Artificial intelligence cures all ills, haven't you heard?

  3. gnasher729 Silver badge

Two things that would be totally acceptable: 1. Apple has the right to keep illegal materials off its servers. 2. Most people don’t want to look at such illegal materials at all, and would be happy if illegal materials couldn’t get onto their phone.

That’s what they should have done. 1. When the phone is about to upload a matching image to Apple’s servers, it just refuses. (And you have the choice to send the photo for manual review.)

2. When my browser downloads illegal images, it refuses and returns a status 403 instead, if I opt in. Nobody learns about it, Apple’s lawyers make sure that, legally, I haven’t downloaded such an image, and opting in shows that I’m not attempting to download anything illegal. So I have strong legal protection, and again nobody learns what’s been filtered out.
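A minimal sketch of that opt-in, client-side refusal (the blocklist and the fetch() stand-in for the browser's network layer are hypothetical; this illustrates the proposal above, not anything Apple shipped):

```python
import hashlib

BLOCKED = {hashlib.sha256(b"known-illegal-image").hexdigest()}
opted_in = True  # the user chose the filter

def fetch(url: str, body: bytes):
    """Stand-in for the network layer: returns (status, body) to the renderer."""
    if opted_in and hashlib.sha256(body).hexdigest() in BLOCKED:
        return 403, b""  # refused locally; nobody outside the device learns of it
    return 200, body

print(fetch("https://example.test/cat.jpg", b"harmless cat photo")[0])   # 200
print(fetch("https://example.test/bad.jpg", b"known-illegal-image")[0])  # 403
```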

    1. tubedogg

      #1 would almost certainly breach their reporting obligations under federal law for CSAM. Telling the phone to just not upload it means they have knowledge of its existence. (Not directly, but by building a system that knows what to look for and then flagging it to just not upload, they're intentionally preventing receiving material that would have to be reported, which is probably not kosher.)

      #2 would, too, because if there's a match to the point where something is preventing the download, there is awareness of a URL ostensibly containing it.

      1. Anonymous Coward
        Anonymous Coward

        "because if there's a match to the point where something is preventing the download, there is awareness of a URL ostensibly containing it."

        And here you are doing the slippery slope in real time.

        Now the *URL* itself is the thing you must never see?! OnlyF**s and Po*nh*b both got hit with such broad, overreaching, false, child porn claims. I've blanked out the names, lest iPhone users read them, visit those sites, and get inspired to diddle kids.

        "#1 would almost certainly breach their reporting obligations under federal law"

        *This* exactly: they might say "Apple'll review the reports to eliminate false positives", but this will be the first thing thrown out the window when challenged.

        Man this slope is slippy.

        The real problem with saying "this image is child porn and cannot be shown" in the browser, is people would be able to review the false positives.

    2. Demmers

      "Apple has the right to keep illegal materials off its servers", "When my browser downloads illegal images" - but how is it known to be illegal? That's the sticky point that's easier said than done. YOU know what's illegal, but to a computer, it's just random bits of, well, bits, hence why anyone can upload anything.

      This is why the CSAM database exists, but to fit your preference, it would have to exist on your phone scanning your browser, which brings us back round to square one.

      1. Anonymous Coward
        Anonymous Coward

        Not quite.

On your first point, stopping 100% of illegal material with no false positives is impossible, but stopping most known illegal CSAM material is a feature that is long overdue. 1) We don't always know what's illegal, and 2) by your statement, "we" would already be well afoul of US law if we looked at it and decided it WAS illegal. 3) My computer IS able to identify things on my behalf without me looking at them, and potentially intercede. Hashes are wonderful things, and there are other tools that can be brought to bear; several of them are outlined in Apple's proposal. With some JavaScript or a few browser tweaks you could abort the image/media download before enough arrived to be legally actionable.

On some level your second point seems to acknowledge this, as putting some of these features on the device is one of the better options (space and resources permitting), but it isn't the only one. You could also push a bunch of that logic off the device if it has an internet connection.

One other handy part of the defensive blocking is that people can also fire off reports of where the offensive content was coming from, which outside of CSAM would be handy for harassment, and for women receiving a landslide of unwanted D-pics.

Still, Apple's proposal was too flawed to stand, both in technical operation and transparency. This is a much harder problem than they gave it credit for, and their hubris runs strong at the best of times. If they don't switch this to open development and public review, they will keep getting their faces burned off. You can't black-box this kind of thing.

        1. Anonymous Coward
          Anonymous Coward

          Re: Not quite.

          How is it hard? Scan the public web and close the pedo sites! What has this to do with scanning iPhone users *private* photos or even their iPhone?

You call these hashes, but the image-matching algos they're using are super fuzzy. That Microsoft one they use reduces images down to a 26x26 mini-icon grid, FFS. Imagine comparing two icons and deciding if they're the same!

I assume Apple didn't want to expend hundreds of megabytes on MD5 hash values (which would require a hash for each known variant), hence they went with a bit of AI. Sure it's fuzzier, sure it reports more false positives, but they can just hire a few teleworkers to review their customers' private photos, and apparently cannot see any issue with that.
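The "hash per known variant" point is easy to demonstrate: an exact digest like MD5 changes completely when a single bit of the file changes, so a one-pixel edit or a re-save defeats it. A minimal illustration, with obviously fake image bytes:

```python
import hashlib

original = bytearray(b"pretend-this-is-a-jpeg" * 100)
variant = bytearray(original)
variant[0] ^= 1  # flip one bit: think one-pixel tweak, crop, or re-save

print(hashlib.md5(original).hexdigest())
print(hashlib.md5(variant).hexdigest())
# The two digests share nothing, which is why exact hashing needs an entry
# for every known variant, and why Apple reached for fuzzy matching instead.
```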

          1. katrinab Silver badge
            Meh

            Re: Not quite.

            I'm pretty sure they already do that. I'm also pretty sure that paedos don't use websites to distribute their photos.

    3. Falmari Silver badge

      People have rights too!

      @gnasher729 “1. Apple has the right to keep illegal materials off its servers.”

They do, but that right is not a pass to do anything they want. People have rights too. I think the right not to have spyware installed on your phone to scan images, using the phone owner’s resources, trumps Apple’s right.

  4. Ian Mason

    Hey Apple!

    You took your reputation as "more on the side of your customer's privacy than any other vendor out there", you doused it in petrol, and set light to it.

    "pause"? - Don't make me laugh cynically in your face. You think that "pausing" the rollout will help mend your reputation? Nope, you're toast. If you ever run another "privacy focused" ad campaign like you did recently in the UK you'll just remind customers that you're a bunch of hypocrites. The only thing that might save you is an about face, adding security mechanisms to your products that make it impossible for you to ever try adding anything like this again, and *proving* it to the public.

    You've already, to my personal knowledge, cost yourself sales from this - i.e. people have said to me "I'm not buying any more Apple kit because of this." and one person I know is in the middle of eradicating Apple products from his house (quite a few) as a direct consequence of this.

    1. tubedogg

      Re: Hey Apple!

So just to be clear… Apple thus far has not enabled CSAM scanning on the server side (how they've gotten away with that is unclear, because it's a requirement under US federal law and probably elsewhere).

      So you'd rather they go along with what every other cloud company already does and scan it on the server side without explicitly making users aware that's what's happening?

      I get the implications of having an on-device scanner. It is absolutely not the privacy nightmare people are claiming, because there's no breach of privacy in scanning material you were uploading anyway. I do understand the possibility of a government requiring it to be adjusted to scan other things. But…

      I also understand that if people are uploading all their photos to iCloud anyway (which is what's going on here), there's nothing stopping a government from requiring server-side scanning from any company, Apple included, for all these various nefarious purposes that people keep mentioning. Apple's not even allowed to run their own servers in China; neither is any other foreign company, to my knowledge. Beyond trying to promote local competition, why do you think that is?

      Further, I understand that there was never anything stopping any government from handing Apple on-device scanning code and forcing them to adopt it for sales to continue in that area—or even just passing a law requiring that device makers do it themselves. The idea that literally the only thing preventing this has been Apple not developing on-device scanning is absolutely absurd. It's not like they invented the idea of it, nor are they even going about it in a particularly novel way as far as the matching goes.

      There are problems with lots of things Apple does. The fact that people have seized upon this one (on-device scanning, not discussing iMessage AI recognition as that's a separate thing) as the end of them is mind-boggling to me.

      1. MrDamage Silver badge

        Re: Hey Apple!

        > "There are problems with lots of things Apple does. The fact that people have seized upon this one (on-device scanning, not discussing iMessage AI recognition as that's a separate thing) as the end of them is mind-boggling to me."

It's not that mind-boggling at all. Pretty much since the beginning, Apple's catch-cry has been "more private and secure than Android. You can trust us with your data." Now they have shown that not only can you not trust Apple, they consider that the phone you own really belongs to Apple.

        > "So you'd rather they go along with what every other cloud company already does and scan it on the server side without explicitly making users aware that's what's happening?"

        I'd rather use a non-US provider, like Proton, so I can easily encrypt my cloud-stored files.

        1. tubedogg

          Re: Hey Apple!

          "Pretty much since the beginning, Apples catch cry has been 'more private and secure than Android. You can trust us with your data.'"

So let's compare apples to apples (no pun intended). Google Photos' cloud storage scans uploaded photos for CSAM. Is that invasive of privacy? Apple's plan is to scan photos that are headed for iCloud Photo Library, the iOS equivalent of Google Photos' cloud storage, but to do it in such a way that they don't see the hash results unless they match known CSAM. Is that invasive of privacy? More or less so than Google Photos' scanning?

          "Now they have shown that not only can you not trust Apple, they consider the phone that you own, really belongs to Apple."

          No, the server that the photo is being uploaded to belongs to Apple.

          Let's use an analogy. If you were using a third-party photo upload app on iPhone and it had CSAM scanning built-in, would you be upset and claiming you couldn't trust the photo upload app? Would you be claiming that the photo upload app maker is seizing control of the phone that you own?

          Apple has a photo upload app called Photos. They intend to scan for CSAM. How is that different?

          "I'd rather use a non-US provider, like Proton, so I can easily encrypt my cloud-stored files."

There are plenty of US-based cloud storage providers through which you can store encrypted files, and plenty of non-US-based cloud storage providers through which you can store unencrypted files. None of which is terribly relevant to the current topic, because you can't use iCloud Photo Library with a provider other than Apple, you never have been able to, and that isn't in any way changed by scanning for CSAM, whether on-device or otherwise.

          1. tip pc Silver badge

            Re: Hey Apple!

As far as I know, other than this Apple-made thing there are no on-device CSAM scanning systems on any handheld or laptop.

            This is a huge change that could see you incarcerated over a false positive that you have no control over.

An example could be: you live in a country not subject to US laws, your phone flags a bunch of stuff but you have no clue, you travel to the US and get arrested on landing.

            You have no idea what’s going on as they start barking acronyms at you.

            They won’t show you the photos and you won’t know any details like dates, locations etc.

            Years later, on release there will not be any evidence and you will never prove your innocence against the judgement of the US based on data they will never let you see.

You’re entirely reliant on the US justice system not stitching you up to meet some target.

What Apple have done has more problems related to freedom and oppression, and will just move CSAM to other platforms. CSAM won’t cease to exist just because Apple will scan on-device.

            The money would have been better spent on social programs creating safe environments and fostering the view that csam is abhorrent.

          2. doublelayer Silver badge

            Re: Hey Apple!

            "So let's compare apples to apples (no pun intended). Google Photos' cloud storage scans uploaded photos for CSAM. Is that invasive of privacy?"

            It's something you know when starting to use the service, so a lot less. More reasons later.

            "Apple's plan is to scan photos that are headed for iCloud Photo Library, the iOS equivalent of Google Photos' cloud storage, but to do it in such a way that they don't have the results of hashes unless it matches known CSAM."

            They are going to have the images. So they can recalculate the hashes any time they like. Your phrasing is incredibly misleading because it sounds as if they're keeping data off their servers when they're doing nothing of the kind.

            "Is that invasive of privacy? More or less so than Google Photos' scanning?"

Yes, and more. It is more invasive because it scans on a device which you own, not the server that they own. It gives them a probe into the device they don't own which can analyze information to which they have no right, so it's really easy to overstep the boundaries they currently claim to adhere to. Google Photos only scans what comes to them, and they tell you they're going to do so. Definitely more invasive.

      2. doublelayer Silver badge

        Re: Hey Apple!

        "So you'd rather they go along with what every other cloud company already does and scan it on the server side without explicitly making users aware that's what's happening?"

        Over on-device, yes. Because then I can decide not to upload things and they don't run their scanner. Although an end-to-end encrypted backup would be even nicer.

        "Further, I understand that there was never anything stopping any government from handing Apple on-device scanning code and forcing them to adopt it for sales to continue in that area—or even just passing a law requiring that device makers do it themselves. The idea that literally the only thing preventing this has been Apple not developing on-device scanning is absolutely absurd."

        No, it's not. There is either code running on the device or not. If it's not, and the government mandates it, we can do what we want to about this. We could try to block the law. Or change the law. Or find a legal reason such a law is not permitted, which works better in countries with constitutional privacy systems but could also be enforced under human rights treaties. Or not buy any new equipment. If it's Apple doing it without any law, we have no control over it at all and moreover, no knowledge of how they're using it.

        1. Anonymous Coward
          Anonymous Coward

          Re: Hey Apple!

If you don't like the choices offered (Android-style, where the servers can view and scan everything, or Apple's device-side solution) then push for the ability to make other choices.

          It seems most of your issues with Apples approach amount to either not trusting what the black box code is doing, or that new and worse versions will be slipped onto your phone at a later date.

A better solution to that would be to force Apple to submit to some transparency, and empower the owners of the devices to decide what does and does not run on their property. Win that fight, and Apple can design or claim whatever it wants. They may choose to block your access to some of their online services like iCloud storage as a result, but if that is transparent, and the choice you make, so be it.

          Instead of fighting Apple on the big picture issues, we are bickering with each other in forums and nitpicking the wrong details. We should be leaning on Apple and congress to retake control of our own property instead.

If we are going to have to fight, pick the right fight: the one where we actually win, not the one where even if you are left standing you just lost less, or where we all fight each other and get nowhere.

        2. tubedogg

          Re: Hey Apple!

          "Because then I can decide not to upload things and they don't run their scanner."

          If you have iCloud Photo Library (iCPL) turned on, photos get uploaded. If Apple did server-side scanning, you wouldn't have any more choice than you would have with on-device scanning, because iCPL uploads all photos, and always has.

          Therefore, your choice is to use iCPL or not. Claiming that if Apple did server-side scanning you could choose to not upload things is simply not how iCPL has ever worked. (And conversely, if there are photos in something like WhatsApp that aren't stored in the system photo library, they wouldn't be subject to scanning [by Apple; they are absolutely scanned by Facebook/WhatsApp, and for far more nefarious purposes] in any event, so again the iCPL scanning being on-device has zero effect on the control you have over uploading something or not.)

          "Or not buy any new equipment. If it's Apple doing it without any law, we have no control over it at all and moreover, no knowledge of how they're using it."

          So if the government mandates it, you could choose to not buy any more equipment, but if Apple does this, you still are required to buy an iPhone? Your control is not buying the product. You said it yourself, but then go on to claim that somehow because it's Apple you can't do anything about it.

          And do you honestly think if it was a government-mandated thing it wouldn't be completely classified as to how it works and what it does, even in countries with constitutional privacy systems? It would be far more of a mystery than it is right now, absolutely guaranteed.

          1. YetAnotherJoeBlow

            Re: Hey Apple!

            "And do you honestly think if it was a government-mandated thing it wouldn't be completely classified as to how it works and what it does"

Either you honestly do not know, or you are trolling. Which is it?

          2. doublelayer Silver badge

            Re: Hey Apple!

            "If you have iCloud Photo Library (iCPL) turned on, photos get uploaded. If Apple did server-side scanning, you wouldn't have any more choice than you would have with on-device scanning, because iCPL uploads all photos, and always has."

Would it surprise you to hear that, on my iPhone, iCPL is already turned off? Because it is. For other reasons, but I didn't want that on. As for other things scanning, I make decisions on whether to install apps based on the scanning they're going to do. WhatsApp is not to be found on my devices.

            The difference is that, if they're scanning my content on their servers, then they need to take lots of steps to start scanning data I never put on their servers. If they're scanning on my device, it's a two-line change in their code to scan all of the photo library and a few more lines to start scanning everything else. The traffic would already be expected, the software couldn't be disabled, and therefore the risks are much higher. And as I said at the beginning, an encrypted backup where they can't scan either is still my ideal solution.

            "So if the government mandates it, you could choose to not buy any more equipment, but if Apple does this, you still are required to buy an iPhone? Your control is not buying the product. You said it yourself, but then go on to claim that somehow because it's Apple you can't do anything about it."

            Yeah, that was poorly phrased. I'll try again. If the government mandates it, then there is information which can permit it to be avoided. It is not forced on my equipment until the software update which includes it, which I can refuse. Whether a device does it or not can be determined when purchasing a device, and therefore it can be avoided. If Apple does it without any mandates, then it's sneaked into devices (like they've already done) and control is much less.

            Another point is relevant in this part of the discussion, which is that no government has mandated this yet. I would prefer to deal with a government-mandated version than an Apple without-legitimacy version, but at the moment, we could deal with neither because the government hasn't passed any such law. You're making me choose between a -95% option and a -85% option when a 0% option is also available.

            "And do you honestly think if it was a government-mandated thing it wouldn't be completely classified as to how it works and what it does,"

            No, I don't. The law which inserts it has to be public. The system used doesn't have to be, but to mandate the installation of code requires a law which can be investigated and challenged. Some countries might try to push through something without telling anyone, except, oh right, they haven't.

    2. tubedogg

      Re: Hey Apple!

      "The only thing that might save you is an about face, adding security mechanisms to your products that make it impossible for you to ever try adding anything like this again, and *proving* it to the public."

      How exactly do you propose that Apple create an OS that prevents them, the OS developer, through a mechanism they cannot break through, from making changes to it? Or more specifically, not even making changes, but from implementing a very specific feature?

      I missed this bit the first time around and I have no idea how, because even setting the current debate aside, what you propose is impossible short of a device literally never contacting its maker…which seems like quite a bit of a problem for getting security updates, not to mention new feature updates.

      "i.e. people have said to me 'I'm not buying any more Apple kit because of this.'"

      As much as this whole spectacle surrounding this is absolutely absurd, this is actually good, because it proves my point. If you don't like what they're doing, move on. There is zero reason to make all these ridiculous claims along the way.

      1. YetAnotherJoeBlow

        Re: Hey Apple!

        "There is zero reason to make all these ridiculous claims along the way."

        Yes. I agree; like:

        The failure rate is only 1 in a trillion.

        1. Anonymous Coward
          Anonymous Coward

          Re: Hey Apple!

          "3 in 100 million per 1 image" according to Apple's claim for its scanner.

So at 10,000 images per customer, potentially 3 in every 10,000 Apple customers are falsely accused.
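The back-of-envelope arithmetic behind that figure, for anyone who wants to check it (it treats a single match as an "accusation", which Apple's actual design avoided by requiring a threshold of matches per account):

```python
# Per-image false positive rate, as quoted above: 3 in 100 million.
p_per_image = 3 / 100_000_000
photos_per_customer = 10_000  # the commenter's assumed library size

# Chance a given customer hits at least one false match across the library.
p_customer = 1 - (1 - p_per_image) ** photos_per_customer
print(f"{p_customer:.6f}")  # ~0.000300, i.e. roughly 3 customers in 10,000
```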

      2. Ian Mason

        Re: Hey Apple!

        > How exactly do you propose that Apple create an OS that prevents them, the OS developer, through a mechanism they cannot break through, from making changes to it?

How? Very simple: require a trusted third party to review their code and certify that it isn't sliding any back doors in. Add the usual code-signing mechanisms so that an iDevice won't run anything at system level that's not signed by "The committee of security experts for keeping Apple honest", just as Apple do for third-party developers. The irony here is that there's enough hardware security on current iDevices to make this practicable.
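A rough sketch of the mechanism being proposed, using Python's cryptography package (names hypothetical; a real secure-boot chain is considerably more involved than a single signature check):

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The committee's keypair; the private half never leaves the committee.
committee_key = Ed25519PrivateKey.generate()
committee_pubkey = committee_key.public_key()  # baked into device hardware

system_image = b"...system image bytes..."
signature = committee_key.sign(system_image)  # issued only after code review

def device_will_boot(image: bytes, sig: bytes) -> bool:
    """The device refuses any system image the committee hasn't signed."""
    try:
        committee_pubkey.verify(sig, image)
        return True
    except InvalidSignature:
        return False

print(device_will_boot(system_image, signature))                # True
print(device_will_boot(system_image + b"backdoor", signature))  # False
```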

  5. MachDiamond Silver badge

    It may not be as specific as you think

Rob Braxman has done a good job of presenting how what Apple is proposing actually works. Look him up on Odysee (or YouTube if you must). It looks a lot like the age-old excuse of something being "for the children" when it's yet another trojan to get the army through the gates.

Rob shows that it isn't just hash lookups. For a hash to be identical, the images would have to be identical. There are already programs that can analyze photos and suggest the content with probabilities. He shows examples where the analysis reports the image is of a female in swimwear/underwear, posing suggestively, age, race etc. The image was of a woman in a bathing suit, on a beach, in what I'll call a yoga pose. There are a couple more examples and they are so close it's scary. With the processing power going up and up on mobiles, doing this sort of analysis on a phone is easier and easier. Apple's outline shows that the phone would do the analysis, create a hash of the description, build a locked thumbnail and upload to iCloud. If the program thinks you have too many suspects, it will rat you out and somebody could then review the thumbnails. The problem is they'll never have the human resources to do that, so some bot will be making the decision about whether you are to be 'swatted' or not.

    It doesn't have to be CSAM. The program can be instructed to look for A/V files from political rallies/protests. It might look for landmarks to track your location even when you are out of range or in airplane mode. Perhaps they'll trigger everybody's phone to be on the lookout for certain people when the phone is detected to be in a certain place at a certain time. The technologies are a spy master's dream come true.

My new phone is going to be deGoogled as soon as it arrives. I don't think I have anything to hide, but I am not interested in letting The Man forage through my private life and wrestle me to the ground if there is anything about me they object to. The less they know, the better off I am.
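On the "too many suspects" step described above: Apple's published design only unlocks the vouchers for human review once an account crosses a match threshold (reportedly around 30, enforced cryptographically with threshold secret sharing). A toy sketch of just the counting logic, with everything else waved away:

```python
# Toy model of threshold-gated review: vouchers accumulate silently and
# nothing is reviewable until the account crosses the limit. Apple's real
# scheme enforces this with threshold secret sharing, not a plain counter.
THRESHOLD = 30  # Apple's reported initial match threshold

class Account:
    def __init__(self):
        self.vouchers = []  # one encrypted "safety voucher" per matched photo

    def upload(self, matches_known_hash: bool, voucher: bytes) -> None:
        if matches_known_hash:
            self.vouchers.append(voucher)

    def reviewable(self) -> bool:
        return len(self.vouchers) >= THRESHOLD

acct = Account()
for i in range(29):
    acct.upload(True, f"voucher-{i}".encode())
print(acct.reviewable())  # False: below threshold, nothing can be decrypted
acct.upload(True, b"voucher-29")
print(acct.reviewable())  # True: the vouchers now become reviewable
```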

    1. yetanotheraoc Silver badge

      Re: It may not be as specific as you think

Absolutely a trojan horse; the think-of-the-children gambit makes that blindingly obvious.

      "... we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

Collect input? Don't make me laugh. It's just a cooling-off period before they proceed; their hope is that the criticism will no longer be boiling over, but even if it is they will go ahead. I'm not sure what their actual business endgame is. It's not child safety, but that's how they plan to get the iVictims to accept Apple commandeering the devices for Apple's benefit.

      In 2021 I have three Apple devices. By sometime in 2022 I will have zero, and in the meantime I will be very cautious about OS updates.

    2. tubedogg

      Re: It may not be as specific as you think

      "Rob shows that it isn't just hash lookups."

      Apple has said from literally the beginning it isn't just hash lookups, because yes, obviously, if you hash two different things with the same algorithm, you're going to get different hashes. It's essentially hashing using AI for fuzziness—fuzzy meaning that a photo that has been cropped, turned grayscale, or had its resolution changed generates a hash that matches very closely or spot-on with the hash from another copy of the same photo that hasn't been altered.
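For the curious, here is a toy "difference hash", one of the simplest fuzzy image hashes. It is emphatically not Apple's NeuralHash, but it shows the property described above: a small edit (here, a uniform brightness lift) leaves the hash untouched, where a cryptographic digest would change completely.

```python
def dhash(pixels):
    """Toy difference hash over a 2D list of grayscale values (any size)."""
    h, w = len(pixels), len(pixels[0])
    # Shrink to a 9x8 grid by sampling (a real implementation would average).
    grid = [[pixels[y * h // 8][x * w // 9] for x in range(9)] for y in range(8)]
    # 64 bits: is each cell brighter than its right-hand neighbour?
    return [int(grid[y][x] > grid[y][x + 1]) for y in range(8) for x in range(8)]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

# A fake 64x64 "photo" with some structure, and a brightened copy of it.
photo = [[(3 * x + 5 * y) % 200 for x in range(64)] for y in range(64)]
brighter = [[p + 40 for p in row] for row in photo]

print(hamming(dhash(photo), dhash(brighter)))  # 0 -- still a perfect match
```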

      No idea why you think this is some revelation this "Rob" guy is proclaiming to the world.

    3. Anonymous Coward
      Anonymous Coward

      For a hash to be identical, the images would have to be identical.

      Bullshit.

      1. NetBlackOps

        Re: For a hash to be identical, the images would have to be identical.

        Yep. Hash collisions are inherent to the mathematics of hashing algorithms.
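Easy enough to demonstrate: squeeze any hash into a small enough output space and the pigeonhole principle guarantees collisions quickly; a fuzzy perceptual hash collides far more readily still, by design. A quick illustration using a deliberately truncated SHA-256:

```python
import hashlib

def tiny_hash(data: bytes) -> str:
    # Keep just 16 bits of SHA-256: a deliberately cramped output space.
    return hashlib.sha256(data).hexdigest()[:4]

seen = {}
for i in range(100_000):
    data = f"image-{i}".encode()
    digest = tiny_hash(data)
    if digest in seen:
        print(f"collision: {seen[digest]!r} and {data!r} -> {digest}")
        break
    seen[digest] = data
```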

  6. elsergiovolador Silver badge

    Pear shaped Apple

If Apple cared about privacy, they would have offered a hosting account with their device, some sort of a VPS with attached storage, or a dedicated server for professional users. They could have preinstalled an operating system with the user's keys. With the right operating system and software, a user wouldn't notice a difference, but the "someone else's" computer would actually be theirs (at least for the duration of the rental).

The fact that they do the cloud like everyone else is just cheaping out on security, and then fundamentally breaching users' privacy by making them all suspects of a crime. You know, in the US, when a prisoner is released they may be required to install scanning software on all their devices. Apple essentially treats every customer like a prisoner on parole.

  7. Anonymous Coward
    Anonymous Coward

    Howard Beale

    "I'm mad as hell, and I'm not going to take it anymore."

    *

    ....and I'm going to do something about it, including avoiding certain behaviours:

    1. No Apple devices in my life.

    2. As few "cloud-based" services as possible. Perhaps only email. Cautious use of Google. Definitely not for backups.

    3. Private encryption of sensitive messaging BEFORE the message enters any channel (even so called "end-to-end encrypted" channels).

    4. Use of multiple "throw away" email addresses.

    5. At least two mobiles: the usual phone plus a seldom used anonymous "burner". (Bluetooth always switched off.)

    6. Paying attention to devices which "phone home" -- notably cars. I quite like my Morris 1000 Traveller!

    7. Pseudonymous and infrequent use of Facebook and similar "social media" (see items #4 and #5).

    8. Physical backups in a secure off site location. (Backups tested regularly to see that they restore!)

    9. Frequent use of locations not associated with personal accounts (such as internet cafes)

    I guess that's enough for now........and it probably only HINDERS the snoops. I suppose if snoops want to invade my life then they can do that.....but I don't have any obligation to make life easy for them!!!! Howard Beale did have a point!!!

    1. jgard

      Re: Howard Beale

      I'm fully against any client side scanning or intrusion into a person's data or device. This sort of scope creep on our privacy and freedoms is dangerous, worrying and anathema to a civil, free society.

However, I can't help thinking you're going over the top a little bit here. Who are you? James Bond? Director of Mossad? Head of a Mexican drug cartel? As you note, those measures would only make it more difficult for the spooks; they could still get what they wanted if need be. But they're not gonna take even a cursory look anyway, unless you're strongly suspected.

Your Facebook comments will be somewhat low on their priorities list (approx line 423,896,186), unless of course the spooks view you as a VIP / target. And as they won't, your countermeasures are crackers and totally disproportionate. Internet cafes and cars that don't phone home? That's real tin-foil-hat stuff, dude. The biggest issue is in maintaining your digital charade; your life must be utterly exhausting.

  8. tip pc Silver badge
    Big Brother

Studies show that CSAM is linked to other, not obviously related, material too

Once they’ve scanned your photos for CSAM, they’ll look for other indicators in case you’re engaging with CSAM but not using your “smart” device to do so.

Studies will show that abusers also look for certain not obviously related materials; let’s say they are anti climate change, or support Trump, drink moonshine, engage in other illegal activities, don’t go to church or are Roman Catholic.

    Anyone matching those attributes could be brought under further scrutiny.

    "Could governments force Apple to add non-CSAM images to the hash list?" the company asked in its interview of itself, and then responded, "No. Apple would refuse such demands and our system has been designed to prevent that from happening."

Some double talk going on here, but Apple stated they don’t control the list or generate the hashes, so could not be forced to add anything. They’ve not explained what controls the agencies maintaining the lists have, nor what safeguards will be in place. Apple will, however, have the CSAM manually verified by a human before passing it on to those reporting agencies.

What other information are Apple misleading us about?

  9. Anonymous Coward
    Anonymous Coward

    Implementing Mandatory On-Device Client Side scanning of User's Data

    is Apple's Long Farewell Note to their (former) Customers and Short Suicide Note to their Shareholders.

    1. TimMaher Silver badge
      Windows

      Re: Implementing Mandatory On-Device Client Side scanning of User's Data

      Yup.

As a long-time fanboi I am now going to let the old stuff just re-record, not fade away.

      There will probably not be any new stuff.

      Plus side? I don’t use ‘cloud’ infrastructure.

  10. Andre Carneiro

    Tim Cook is a clever man

    I'm just astounded that nobody at Apple actually thought this was A Bad Idea.

    I'm even more astounded that Tim Cook actually decided to go for something so utterly, mind-bogglingly stupid.

    Maybe I'm just naive.

Also, I'm about 95% sure they're just waiting for the media attention to die down and then they'll quietly implement it anyway.

  11. Eclectic Man Silver badge
    Pint

    "Apple – rather than actually engaging with the security community and the public"

    Nonsense, I'm certain that when Tim Cook read the Register's article (https://www.theregister.com/2021/08/17/corellium_apple_bounty/) and the comments on the associated forum he IMMEDIATELY had second thoughts.

    Chalk this one up as a victory for the Register and have a beer :o) -->

  12. Wade Burchette

    Waiting until the heat is off

Just like Audacity with telemetry, a desire delayed does not mean a desire denied. Keep a watchful eye out for each and every new terms-of-service from Apple. I am certain that Apple will try this unacceptable nonsense again, except later, when people have moved on, and quietly, so they won't know.

  13. Anonymous Coward
    Anonymous Coward

    Perhaps people need to be more aggressive and start stating things like:

    "Apple is doing this because they know most of their customers are pedophiles and market towards them"

    Of course not true (Apple customers are certainly mugs but not pedophiles) but if we can start cutting into their brand identity, they might think twice.

  14. Lord Elpuss Silver badge

    "Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

    IOW we'll wait until the hoo-hah has died down then quietly introduce it via the back door.

  15. Alf Garnett

This is yet another reason to not own an Apple device, or to get rid of the ones you have and switch to Android. Also, anyone who wants to keep their data private should never store it unencrypted on a system they don't own and control. There's no telling when some government will demand the big corporation hand over the data. The big corporation will comply, because the people who make the decision don't want to go to prison or have their wealth confiscated.

No, not everyone who wants to keep their data secret has anything illegal to hide. Someone may have a bid for some multimillion-dollar deal on their device. That goes unencrypted to the big tech company's server as part of an online backup of this person's device. A competitor hacks or bribes their way into the system and steals the backup. It may not be anything that big you want to keep secret. Maybe the device is owned by a woman with a psycho ex who is hunting for her so he can kill her. He hacks her backup, finds out where she will be tomorrow night, and is waiting for her with a gun.

    Maybe you have evidence of the government you live under doing something dishonest, or criminal. Let's say it's a video of cops beating to death someone who is a member of the wrong ethnic group or political party. Your iThing is told to scan for and finds this video. The government is alerted, cops or soldiers come to your door and your body is found floating in a river or if you're in Russia, you're found dead from polonium or some chemical weapon that attacks the nervous system.
