Apple responds to critics of CSAM scan plan with FAQs, says it'd block governments subverting its system

Apple's announcement last week that it will soon be scanning photos on iPhones and iPads that sync to iCloud for child sexual abuse material (CSAM) prompted pushback from thousands of security and privacy professionals and a response from the company that attempts to mollify its critics. The iDevice biz revealed two child …

  1. razorfishsl

    It has nothing to do with kiddie porn.....

    They just want to be able to run their classifier over every picture & video in a users private piece of kit.

    It is designed to:

    1. set a legal precedent

    2. use existing material to train their A.I on other none related material.

    3. allow their staff to access private content to validate results.....

    Think they overlooked one small matter....

    for their staff to validate the results.... it requires them to load the "kiddie porn" onto a viewing device controlled by apple, to be viewed by staff employed by apple...

    or are they going to use a 3rd party?

    1. AnoNymousGerbil

      Staff will probably be in India on minimum wage, not caring what happens, just clicking "YES!" like the people Google employs to manually verify disputed copyright claims.

    2. Lord Elpuss Silver badge

      I think you're flat out wrong that it has nothing to do with kiddie porn. It does. And I'm certain that Apple has altruistic motives right here and now, and none of the scenarios you outlined are in their thinking at all.

      HOWEVER.

      I'm equally certain that today's motives aren't necessarily tomorrow's motives. And for that very reason, I would strongly vote against any initiative such as this one. My device, my data. I also believe that this is a violation of the fourth Amendment; unreasonable search and seizure.

      1. Irony Deficient

        I also believe that this is a violation of the fourth Amendment; unreasonable search and seizure.

        The Fourth Amendment only constrains the government; it does not constrain non-governmental searches by individuals or organizations.

        1. Paul Crawford Silver badge

          Re: I also believe that this is a violation of the fourth Amendment

          How long until it is done on the government's behalf?

          1. Cuddles

            Re: I also believe that this is a violation of the fourth Amendment

            Doesn't matter. There's already plenty of precedent allowing the police to buy data they would not be allowed to collect themselves. If Apple hoovers up everything they want and then the police ask for it later, everything would continue to be nice and constitutional.

        2. Lord Elpuss Silver badge

          Re: I also believe....

          Apple and Child Safety

          ...To help address this, new technology in iOS and iPadOS* will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC). NCMEC acts as a comprehensive reporting center for CSAM and works in collaboration with law enforcement agencies across the United States.

          NCMEC is a Government entity. 4A applies.

          1. Citizen of Nowhere

            Re: I also believe....

            >NCMEC is a Government entity. 4A applies.

            Not according to the NCMEC: "The National Center for Missing & Exploited Children is a private, non-profit 501(c)(3) corporation" (https://www.missingkids.org/footer/about)

            1. Lord Elpuss Silver badge

              Re: I also believe....

              NCMEC acts as a comprehensive reporting center for CSAM and works in collaboration with law enforcement agencies across the United States.

              Come on. If you're saying that 4A doesn't apply here, you're sadly mistaken.

              1. Citizen of Nowhere

                Re: I also believe....

                No, I'm saying that contrary to your statement otherwise, it is not a government agency. I'm not a lawyer, let alone a US constitutional lawyer, so I doubt my opinion on whether that amendment applies is of much worth. I'm fairly confident that working with government agencies doesn't make a private corporation a government agency, on the other hand.

        3. Anonymous Coward
          Anonymous Coward

          The Fourth Amendment constrains the government...

          The Fourth Amendment constrains the government, but today only the government has a right to search your property (with a warrant, of course). Other organizations and individuals do not have that right, and if they search anyway, they usually violate existing laws. Even some of the searches US companies carry out on their employees are forbidden by law, because private guards have no such powers.

          Or do you mean any individual or organization can search you and your properties at will?

          I understand many believe an Amendment Zero exists, stating that "Enterprises are above the law, and everything that makes one rich is legal even when it trashes other people's rights".

          1. Irony Deficient

            Re: The Fourth Amendment constrains the government …

            … but only the government today has a right to search your properties (with a warrant, of course).

            The government does not always need a warrant — if consent is given to a search, then no warrant is needed. A search in an open field does not require a warrant, but searching a house’s curtilage would require a warrant. With the exception of the contents of an arrestee’s cell phone, warrants are not needed for a search that is incident to an arrest. Warrants are not needed for searches at a port of entry, with the possible exception of electronic devices — some Federal circuits require reasonable suspicion for such a search, and some don’t, and the Supreme Court has not yet granted review of a circuit case to settle on one rule for all ports of entry.

            Or do you mean any individual or organization can search you and your properties at will?

            My meaning can be found in my linked comment above, and I’ll repeat it here:

            A legal defense against non-governmental searches by individuals or organizations would need to be based on other areas of the law, e.g. on trespass or online privacy legislation.

            Because the Fourth Amendment only constrains the government, it is irrelevant with regards to non-governmental searches by individuals or organizations.

            1. Anonymous Coward
              Anonymous Coward

              Re: The Fourth Amendment constrains the government …

              You should look at it the other way round. The 4th Amendment implies that only the Government has such powers, to be exercised with the limitations stated in that amendment - and under that Amendment, other laws state which searches are rightful and who can perform them.

              It doesn't mean that everybody has a right to search everybody else unless the law says otherwise.

              When it comes to basic Rights, any law can only state the exceptions to them, not vice versa. Only authoritarian countries restrict and list the approved ways you can exercise your rights.

              Of course there are laws stating what you get if you violate someone else's right. They are there more to set up the framework to inflict sentences and fines to those violating them than to assert the very basic rights behind them.

              Or do you mean that even the right to Life is not protected but by plain law that can be changed at will by politicians?

              1. Irony Deficient

                Re: The Fourth Amendment constrains the government …

                under that Amendment other laws state what are rightful searches and who can perform them.

                It doesn't mean that everybody has a right to search everybody else unless the law says otherwise.

                The majority decision in United States v. Jacobsen suggests otherwise:

                This Court has also consistently construed this protection [from the first clause of the Fourth Amendment] as proscribing only governmental action; it is wholly inapplicable

                “to a search or seizure, even an unreasonable one, effected by a private individual not acting as an agent of the Government or with the participation or knowledge of any governmental official.”

                Since that protection is wholly inapplicable to a non-governmental search or seizure, even an unreasonable one, effected by a private individual, any restraint on searches or seizures by non-governmental persons must have its basis in some other source. Just because that basis is not the first clause of the Fourth Amendment does not mean that every non-governmental person has a right to search or seize anyone else.

                Or do you mean that even the right to Life is not protected but by plain law that can be changed at will by politicians?

                I have only been discussing the Fourth Amendment’s applicability, or lack thereof, to the Apple CSAM scan plan. Would you prefer to discuss the right to life instead? If so, do you mean in terms of abolishing capital punishment? Do you mean in terms of prohibiting abortions?

      2. Mark 65

        Believe what you want but “think of the children” is the thin end of a privacy invading wedge that will be driven home with a sledgehammer.

        Whilst they’re on a crusade to eliminate the sharing of this material they seem to be missing the real crime is in the physical not electronic world where the children are subjected to such abuse. I’m sure their woke little TOTC department may feel all warm and fuzzy if someone with such images on their device were prosecuted, however it did fuck all to prevent that crime happening in the first place and, given the fanfare of the announcement, will just push these vermin onto other platforms if they ever used Apple in the first place. Meanwhile your average Joe is getting their material searched for no real reason other than the oft touted “nothing to hide, nothing to fear”. Tell that to a dissident.

        I also believe the false positive rate they tout will be utter bollocks.

    3. RegGuy1 Silver badge

      Upvote -- you can spell 'none'.

      (Now there's a rarity.)

  2. Anonymous Coward
    Anonymous Coward

    Governments tell Apple they'll restrict Apple's markets unless it gives in, or that there will be leaked pictures of someone at Apple with Epstein or something, and they'll give in faster than anything...

    There's no trusted way of verifying where that hash data comes from or what its source images are.

    Besides that all one has to next wait is some neat crypto UNlocker (or just hoax of that) for appleOS that tells "gib uz bitmonies or else we UNcrypt some nastyimages to your photo library" and see how that goes among the users...

    There are so many ways one could exploit this system that it isn't even funny, and they can't be so blind that they don't see them.

    I do not trust Apple on this kind of thing. Just remember the FaceTime issue a few years back, where one could spy on others without them being aware. Things like that are easier to prevent, but if there's basically a system-wide "backdoor" like this, how long do they think it will take to be exploited...

    I was just about to buy an M1 MacBook Air; I'd been looking at it last week but hadn't pressed BUY yet (the deal ends on the 15th, so I was waiting to see if anything better came along). There is no way I will do that now. I'll stick with my Mac mini until it's done and slowly start to jump ship to other platforms. The phone will hurt most, because I have been using iOS since the beginning...

    1. Anonymous Coward
      Anonymous Coward

      I have an M1 Air but I will now never install Monterey. Hoping for great things from Asahi Linux.

      1. DS999 Silver badge

        Why not just refuse to enable iCloud? If iCloud is disabled, so is this scanning thing which only takes effect for photos immediately before they are uploaded to iCloud.

        1. Anonymous Coward
          Thumb Down

          For now. Don't tell me they won't find an excuse to scan device storage for non-iCloud users.

          1. Anonymous Coward
            Anonymous Coward

            For now. Don't tell me they won't find an excuse to scan device storage for non-iCloud users.

            That will require a change to their T&Cs at which point you can choose to stop using the iThing.

            1. Anonymous Coward
              Anonymous Coward

              Or don't do that, just change the code and later say "oops, it was just a bug, honest guv!"

              1. MrDamage Silver badge

                They'll hire the same two "rogue engineers" that led Google Streetview cars to scan all WiFi networks, and VW to falsify emissions tests. Those two crafty buggers seem capable of bypassing the most strenuous QA testing before shit gets rolled out.

            2. Anonymous Coward
              Anonymous Coward

              Nothing to stop them putting that change in the metaphorical bottom of a locked filing cabinet stuck in a disused lavatory with a sign on the door saying ‘Beware of the Leopard’.

      2. Anonymous Coward
        Anonymous Coward

        Looked into that Linux, but I think I'll just skip the whole thing and get something else instead.

        It's a pity, because I was really interested in getting one as a portable dev system with long battery life now that the covid stuff is relaxing a bit. The price was right too: 999€, with an extra 100€ off on top, so 899€.

        Oh well, the bigger issue will be what I'll get to replace the iPad and iPhone at some point, and what devices I'll suggest for those I deal with on a daily basis. So far Apple products have been rather easy for several reasons. I guess I'll have to look some more into Linux systems for an "idiot proof" setup for users who are not the sharpest tools in the box.

  3. Snake Silver badge

    You asked for it, you got it

    You wanted Apple to be your nanny state for as long as you believed it was suiting your purposes (walled garden, Safari proxy, blocked apps, etc etc etc).

    You gave them approval, so now they are simply stepping up the level of nannism.

    Think of the children. You were OK with all that as long as you believed you had "nothing to hide".

    Enjoy Big Brother. You've earned it.

    1. Lord Elpuss Silver badge

      Re: You asked for it, you got it

      And you enjoyed the "freedom" that Android gave you - lax restrictions on the Play store meaning nearly 25% of the apps on there are malware or junk, a "we actively monetise everything you do so you're paying us for the privilege of being the product" policy, and a woeful handset support policy meaning many handsets will not be supported beyond a year from purchase, and some even shorter than that.

      Yeah, you go 'enjoy' your freedom while we iOS users are still enjoying a safe, secure and private experience on our 6 year old handsets.

      1. Snake Silver badge

        Re: freedom

        That's right, FREEDOM.

        Which always comes with personal responsibilities, responsibilities that Apple users are both too lazy and cowardly to do themselves as they act like children and expect someone else to do all the hard work.

        Like utter FOOLS, they believe that some big corporation has their ultimate interests at heart. And wantonly swallow everything that is presented to them.

        Android, meanwhile, gives you the freedom to actually take that personal responsibility in hand and select your own choice of privacy levels within the app control panel. Or root your phone and place an entirely different OS package on it.

        My boss is an Apple user. Like you. And, LIKE YOU when I told him last week that Apple would be looking at his photos...he came up with an excuse that since everyone is looking at spying on your photos anyway, Apple doing this is nothing special.

        Apple users are sheep. You deserve every loss of personal choice, for profit, that you've voted on with your wallet. Ben Franklin is laughing his ass off at you.

        1. Lord Elpuss Silver badge

          Re: freedom

          What other people do is none of your business, yet you seem to believe you have the right to dictate what's 'right' and what's 'wrong' for them and to insult them for their choice - based on nothing more than your own biased and unsupported opinion. That is arrogant and entitled, and makes you a thoroughly nasty piece of work.

          Take your poisoned view of the world and go 'enjoy' it in peace somewhere else - the grownups are talking here.

        2. Anonymous Coward
          Anonymous Coward

          Re: freedom

          While Android has the freedom to actually take that personal responsibility in hand and select your own choice of privacy levels within the App control panel.

          You trust google when it comes to privacy? LOL!

          Google admits it tracked user location data even when the setting was turned off

          https://www.theverge.com/2017/11/21/16684818/google-location-tracking-cell-tower-data-android-os-firebase-privacy

      2. DiViDeD

        Re: You asked for it, you got it

        Freedom comes with responsibilities. Google tracking can be circumvented, installing stuff from the Play Store requires a little discernment.

        Very different from the Apple approach of "It's fine, really - just open your gob and let us pour this stuff in. We'll tell you when you're full, trust us".

        If you abrogate your responsibility and hand over the job of keeping your own device secure to someone else, whose fault is it when the custodian takes advantage of that abrogation?

    2. Anonymous Coward
      Anonymous Coward

      Re: You asked for it, you got it

      You're exceptionally naive if you think google will not duplicate this CSAM system in Android, if it hasn't already done so.

      If there's one company that uses 1984 as a modus operandi, it's google.

      1. revilo

        Re: You asked for it, you got it

        Having been an Apple fan until August 5th, I will now change to Android and ditch all Apple stuff. It will hurt, but there is no choice. This is a serious step by Apple, and trust in this company is gone. Yes, there is a danger that Google will implement something similar. But at least there are alternative (Google-independent) versions which can be installed, at least on phones which are not cutting edge.

        1. Lord Elpuss Silver badge

          Re: You asked for it, you got it

          Sure you will.

        2. Matthew Elvey

          Re: You asked for it, you got it

          Ouch. Snake's comment hurt. Because it hit home. It's not like there weren't warning signs.

          revilo is right.

          Except that as far as I can tell, it's impossible to use just about any smart device remotely close to normally in the common sense of the word and be confident it hasn't been pwned.

          Open civil disobedience is another option. "They" can't jail all of "us". WFM. No one has come for me. As long as there are more with me than with evil, I'm good. It's things designed to make us fearful - like this announcement, like 2/3 of the news - that push people into accepting/supporting evil, but at the same time they push people to wake up and act for good. The mad-as-hell-and-not-going-to-take-it-anymore crowd.

  4. pro-logic

    Who creates the hash?

    In the end doesn't this come down to trust in who creates the hash?

    I assume (and also hope) that Apple isn't being given a whole bunch of pictures to create the list of hashes from, and that, like all other hashes, you can't work out the contents from the hash.

    If Apple gets a list of hashes from a 3rd party, say the government, what's to say there isn't a bunch of hashes of Winnie The Pooh pictures in there? The 3rd party can assure Apple the list is only CSAM, and Apple can feign ignorance as A. A. Milne fans end up in the slammer.

    As an aside, what's the size of one of these hashes? I would assume the hash database is in the order of tens of GB.

    1. Flip

      Re: Who creates the hash?

      "Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices."

      https://www.apple.com/child-safety/

      1. pro-logic

        Re: Who creates the hash?

        So Apple gets a list of hashes.

        Can't see anybody, anywhere globally, adding hashes of interest into a regional database and then giving that to Apple.

      2. Anonymous Coward
        Anonymous Coward

        Re: Who creates the hash?

        The NCMEC tipline will soon be flooded with all kinds of tips, as people try to get images into the database and hashed. Same for the other places Apple will be pulling their hashes from, and it'll be a glorious mess :D

      3. elsergiovolador Silver badge

        Re: Who creates the hash?

        What stops Winnie the Pooh from pouring some communist money into one of those organisations and getting write access to hash database?

        1. Dan 55 Silver badge

          Re: Who creates the hash?

          Winnie the Pooh will have his own hash database that Apple will have to use.

      4. Neil Barnes Silver badge

        Re: Who creates the hash?

        For curiosity: how does the hash work? Under the normal definition of a hash function, it should give a different result if the two objects being compared are one bit different... which suggests that an app to automatically add a neutral pixel at some location in all images will rapidly appear.

        Or is 'hash' being used as a shorthand for a much more complex process (e.g. softening images, resizing, and perhaps reflecting them) before applying further processing to provide a more robust match?
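        To make the one-bit concern concrete, here's a quick sketch (using SHA-256 purely as a stand-in for an ordinary cryptographic hash - not Apple's actual scheme): flipping a single bit of the input produces a completely different digest, which is exactly why a "neutral pixel" tweak defeats naive exact matching.

```python
# Flipping one bit of the input yields a completely different digest,
# so a "neutral pixel" tweak would defeat naive exact-hash matching.
import hashlib

image_bytes = bytes(range(256))        # stand-in for image data
tweaked = bytearray(image_bytes)
tweaked[0] ^= 0x01                     # flip one bit -- the "neutral pixel"

h1 = hashlib.sha256(image_bytes).hexdigest()
h2 = hashlib.sha256(bytes(tweaked)).hexdigest()

print(h1 == h2)   # False: the two digests differ entirely
```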

        1. pro-logic

          Re: Who creates the hash?

          Being pedantic, a hash is simply a function that maps an arbitrary length of data onto a fixed length of data. The definition does not have any requirements around changing the hash when the data changes. A hash function could simply map all data to the value "1" and it's still technically a hash, just a terrible one.

          It just so happens that most hash functions we think about in IT are designed to change when a single bit changes.

          As for this specific hash function: Microsoft Research's PhotoDNA is the one that's used here. It is apparently resistant to colour changes, resizing etc. How it works in practice is way over my head: https://en.wikipedia.org/wiki/PhotoDNA
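          A toy sketch of that definition (both functions below are invented for illustration; each maps arbitrary-length data to a fixed-size output, so each is "technically a hash"):

```python
# Two functions that both satisfy the bare definition of a hash:
# arbitrary-length input, fixed-size output. Only one is useful.

def terrible_hash(data: bytes) -> int:
    """Maps every input to 1 -- still technically a hash, just a terrible one."""
    return 1

def simple_hash(data: bytes) -> int:
    """A weak 16-bit rolling hash that at least reacts to single-byte changes."""
    h = 0
    for b in data:
        h = (h * 31 + b) & 0xFFFF   # keep the output in a fixed 16-bit range
    return h

print(terrible_hash(b"abc"), terrible_hash(b"something else"))  # 1 1
print(simple_hash(b"abc") == simple_hash(b"abd"))               # False
```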

          1. Peter Gathercole Silver badge

            Re: Who creates the hash?

            The hash has been called a 'perceptual hash' by Apple, implying that it is not an exact checksum analogue for a file.

            Louis Rossmann on one of his recent videos (https://www.youtube.com/watch?v=9ZZ5erGSKgs) was talking to one of his regular commentators who appeared to know the technology being used, and it appears that it somehow measures various features in a photograph such that cropping, altering colours, inserting steganography etc. would not hide the pictures.

            I can imagine such a system, certainly running in the cloud with all the AI smarts that exist there, but I'm not sure that I accept that that will actually run on an i-device, at least not without consuming a lot of resource and power. I know Apple have been embedding their neural engine AI processor on their SoCs. Maybe it will only run on devices that have this available.

            1. Jimmy2Cows Silver badge

              Re: Who creates the hash?

              It'll be run on the iCloud side, when pics are uploaded. Not quite the same as "scanning your device", but, you know, click-baity headlines...

              If they do want to scan on-device pics, it's an easy step to a background service on your device constantly uploading your pics to the checking service, flagging any whose resulting hashes match the no-no list.

            2. Matthew Elvey

              Re: Who creates the hash?

              By the way, did anyone else notice that Google Image Search, which used to work quite well, now works extremely poorly? I typically put in small images I'm looking for a larger copy of. It used to be great at this. Usually found one if I thought it likely there was one on the inter webs. Now it's rare.

          2. Boothy

            Re: Who creates the hash?

            Seems the process at a high level is:

            1. Convert the image to black & white (their words, although I'd assume grey scale due to step 4).

            2. Resize to a fixed size.

            3. Split up into a grid.

            4. Each grid has a histogram of intensity gradients or edges found.

            5. The final 'DNA' (i.e. hash) is then generated from this histogram data.

            Seems the 'DNA', as they call it, is basically a collection of metadata, per grid cell of the original image.

            It doesn't matter what the size of the picture to check is, or if the colours have been altered, the resulting hash should be the same each time.

            They also talk about comparing 'similar PhotoDNA', so seems the hashes can be compared for similarities, not just exact matches. Just a guess, but this implies that perhaps cropped images, or composites with partial matches can also be compared.

            For video, they basically run the same process against a subset of frames from a known video. The comparison then seems to be to run the process against all frames in the new video, as they might have edited it, so the subset of frames originally hashed, may not be in the same place in the new video that needs to be checked.

            Edit: Forgot to mention, seems the resulting PhotoDNA data is very small, so you can have very large data sets, and search them very quickly.
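            The five steps above can be sketched roughly like this (a toy reconstruction, not the real PhotoDNA; the 16x16 resize, 4x4 grid and 4 gradient buckets are invented parameters):

```python
# Toy sketch of the PhotoDNA-style pipeline described above:
# greyscale -> fixed resize -> grid -> per-cell gradient histogram -> "DNA".

def resize(img, size=16):
    """Nearest-neighbour resize of a greyscale image (list of rows) to size x size."""
    h, w = len(img), len(img[0])
    return [[img[r * h // size][c * w // size] for c in range(size)]
            for r in range(size)]

def cell_histogram(img, r0, c0, cell, buckets=4):
    """Histogram of horizontal intensity gradients within one grid cell."""
    hist = [0] * buckets
    for r in range(r0, r0 + cell):
        for c in range(c0, c0 + cell - 1):
            grad = abs(img[r][c + 1] - img[r][c])          # edge strength
            hist[min(grad * buckets // 256, buckets - 1)] += 1
    return hist

def photo_dna(img, size=16, grid=4):
    """Concatenate per-cell gradient histograms into a compact feature vector."""
    small = resize(img, size)
    cell = size // grid
    dna = []
    for r0 in range(0, size, cell):
        for c0 in range(0, size, cell):
            dna.extend(cell_histogram(small, r0, c0, cell))
    return tuple(dna)

# The same synthetic picture at two different resolutions hashes identically:
img = [[(r * 7 + c * 13) % 256 for c in range(64)] for r in range(64)]
big = [[img[r // 2][c // 2] for c in range(128)] for r in range(128)]
print(photo_dna(img) == photo_dna(big))   # True
```

            Note how the fixed resize in step 2 is what buys the resistance to resizing: both versions collapse to the same 16x16 grid before anything is measured.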

            1. Peter Gathercole Silver badge

              Re: Who creates the hash?

              The smaller the PhotoDNA, the more likely there is to be a clash between pictures that are really different but which generate a similar or same PhotoDNA. This is always a problem when it comes to the size of the hashspace.
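              A rough birthday-bound calculation illustrates the point (a back-of-envelope sketch; the billion-photo figure is just an assumed order of magnitude for a large photo service):

```python
# Approximate probability that n random items collide somewhere
# in a hashspace of size 2**bits (standard birthday approximation).
import math

def collision_probability(n: int, bits: int) -> float:
    # P(no collision) ~= exp(-n*(n-1) / (2 * 2**bits))
    return 1 - math.exp(-n * (n - 1) / (2 * 2.0 ** bits))

photos = 10 ** 9   # assumed order of magnitude for a big photo library service
for bits in (32, 64, 128):
    print(bits, collision_probability(photos, bits))
```

              With a billion photos, a 32-bit hash collides essentially with certainty, while a 128-bit one effectively never does; the size of the hashspace dominates everything.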

          3. Snake Silver badge

            Re: the hash?

            But that means fundamentally that the hash is USELESS.

            Take your CSAM image and add layers above. Add images on to, and above, the CSAM. Leave the base CSAM layer, set some layers to modify, other layers to completely overwrite. Save as layered TIFF.

            Sure, you'll both need a TIFF image viewer and the ability to turn off the layers to view the image that you are seeking. But the hash will be fundamentally different and the image will get a pass through the search filters.

            1. Michael Wojcik Silver badge

              Re: the hash?

              It's not useless, because of user behavior. Some users might go through the effort of transforming images until the output has a PhotoDNA vector sufficiently far from the original. Some might even create software to do that automatically. Hell, the really smart ones would train GANs to generate similar images based on an existing corpus – that's a lot cheaper [1] than producing real images with real children.

              But the vast majority will almost certainly continue to share images verbatim through various off-the-app-store-shelf messaging apps, and for them the PhotoDNA vectors will remain stable.

              If the match rate ever fell below whatever target Apple have [2], they can switch to more-sophisticated algorithms. Run each photo through a CNN stack to extract high-level features, compress those, and measure distance in a high-dimensional space, for example. We have lots of classifier architectures that let you split the process at arbitrary points so you're not transmitting the original data.

              [1] Is it less immoral and/or exploitative? There's a fun question for your ethics class.

              [2] And that's probably very, very low. I have no evidence to speculate on how much of this is Apple trying to placate governments and NGOs, and how much is virtue-signalling, and how much is just meant to be the thin edge of the wedge, and how much might even be some sort of misguided altruism. But Apple can't reasonably be expecting to have many true positives, and probably is hoping the positive rate – true and false – is very, very low.
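              "Measure distance in a high-dimensional space" might look like the following sketch (the feature vectors and the threshold are invented for the demo; a real system would tune the threshold against a validation set):

```python
# Match by distance, not equality: near-duplicates of a reference
# still match even though their feature vectors aren't identical.
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_match(query, reference, threshold=1.0):
    return euclidean(query, reference) <= threshold

reference = [0.2, 0.8, 0.5, 0.1]        # stored vector for a known image
near_copy = [0.25, 0.78, 0.5, 0.12]     # slightly re-encoded version
unrelated = [0.9, 0.1, 0.0, 0.7]        # a different picture entirely

print(is_match(near_copy, reference))   # True: within the threshold
print(is_match(unrelated, reference))   # False: too far away
```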

          4. Manx Cat

            Re: Who creates the hash?

            How would CrApple react to a 'Pink Lady' image?

      5. Graham Cobb Silver badge

        Re: Who creates the hash?

        I assume there are test images for this (developers, testers and trainers certainly do not want real child abuse images hanging around on their test systems) and their hashes are included in the database.

        How long before people start sending those around? Will a real person check the image is really illegal before Apple terminate the user account and the police are called?

    2. katrinab Silver badge
      Megaphone

      Re: Who creates the hash?

      And of course there is that picture of Pooh and Tigger alongside a photo of President Xi Jinping and President Obama which upset the Chinese.

    3. AndrueC Silver badge
      Boffin

      Re: Who creates the hash?

      Several organisations have created them. I used them at a previous employer over 15 years ago when they were all MD5 (the issuers were just in the process of rolling out SHA-based versions). Even back in the early 00s the list was perfectly manageable.

      Our forensic software used it to highlight various categories of files including:

      * Known applications.

      * Known operating systems.

      * Known installers.

      (All the above it treated as chaff and by default hid it to avoid wasting the investigator's time)

      * Known 'bad stuff' which it highlighted in red.

      I was quite surprised just how performant the lookups were (obviously generating the hashes locally was a CPU hit, but they were generated during the initial prep stage and the result stored in metadata).

      But such hashes are easily defeated by changing a single bit and Apple's seems to be something more powerful. I'm guessing the DBs that we used (mailed to us on CDs, lol) have likewise moved on to something better.
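      The category lookup described above can be sketched like so (file contents and hash lists are invented for the demo; MD5 because that's what those early databases shipped):

```python
# Hash each file once, then bucket it by membership in pre-built
# known-hash sets: chaff is hidden, known-bad is flagged in red.
import hashlib

KNOWN_OS  = {hashlib.md5(b"kernel32.dll contents").hexdigest()}
KNOWN_BAD = {hashlib.md5(b"known bad file contents").hexdigest()}

def categorise(file_bytes: bytes) -> str:
    digest = hashlib.md5(file_bytes).hexdigest()   # computed once, cached as metadata
    if digest in KNOWN_BAD:
        return "red-flag"      # highlighted for the investigator
    if digest in KNOWN_OS:
        return "chaff"         # hidden by default to save the investigator's time
    return "unknown"

print(categorise(b"kernel32.dll contents"))     # chaff
print(categorise(b"known bad file contents"))   # red-flag
print(categorise(b"holiday_photo.jpg bytes"))   # unknown
```

      Set membership is O(1), which is why the lookups stay fast even against millions of known hashes.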

    4. Anonymous Coward
      Anonymous Coward

      Re: Who creates the hash?

      From what I've seen, we're not talking about cryptographic hashes, but perceptual hashes, which are the outputs of neural networks trained to look for characteristics in images. Unlike crypto hashes, where it is possible to grind through the algorithm to see how it produced a value, perceptual hashes come out of the black box of a NN - you just have to trust that the perceptual hash is unique to a particular image (and slightly altered variations of it).

      Apple has refused to share its algorithm and training sets so experts can mark its homework. Indeed, since it is illegal for researchers to possess child abuse images in the UK (it is a strict liability offence), they CAN'T be verified.

      So we have to assume that Apple has done its homework and hasn't produced another faulty image analysis algorithm like those that have given us previous privacy screwups. Which, when we are talking about people's lives, is a HUGE ask on their part.

  5. Doctor Syntax Silver badge

    Trust. Hard to gain. Easy to lose. Even harder to regain.

  6. elsergiovolador Silver badge

    Copyright

    They use the CP as an excuse to limit the discussion about the invasion of privacy.

    This is the same tactic as police playing copyrighted music while being filmed, so that people uploading footage of their abuses to social media will have their content taken down.

    Social media algorithms limit the reach of such content, preventing more people from being aware of what's going on.

    I was actually going to buy a new Apple laptop, but after this I am done with this company.

    1. TimMaher Silver badge
      Coat

      Re: New laptop

      Yup.

      From now on I shall not be updating the OS on any reasonably new kit and will be allowing the other stuff (out of support) to just fade and die.

      Mine’s the one with the key to the walled garden gate in the pocket.

  7. Anonymous Coward
    Childcatcher

    When someone uses "think of the children"

    you know they're up to no good.

    It's the first resort of those seeking unreasonable amounts of power.

    1. Alumoi Silver badge

      Re: When someone uses "think of the children"

      Erm, no. It's the last resort when everything else fails (convenience, terrorists, cyber criminals).

  8. Anonymous Coward
    Anonymous Coward

    It is time to monkeywrench the system.

    1. Get a copy of this hash database. This may be difficult, but it's certainly not impossible

    2. Widely distribute the hash database. It contains no actual CP, so it's perfectly legal to do so.

    3. Encourage greyhats to develop a massive set of completely innocent images that will hash to match the database.

    4. Widely distribute those images, so that essentially every device contains innocent images that match.

    5. Database becomes completely useless and worthless.

    1. Clausewitz 4.0
      Devil

      Quite a good idea, especially if the DB does not contain the size of the file.

      Let's wait for the rollout to figure out which algorithm is being used for the hashing, and how much processing power one will need to make some collisions.

      1. katrinab Silver badge

        Images tend to get re-compressed when imported to mobile devices. How would that work?

        1. Boothy

          Someone mentioned above that they are using PhotoDNA for the hashing.

          This basically converts the image to grey scale, resizes it, splits it into a grid, and produces a 'hash' from each grid square.

          So it's basically resistant to things like resizing and colour changes.
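
          A toy version of that idea in Python - this is an average-hash sketch, not PhotoDNA itself (the real algorithm is more involved), but it shows why a uniform brightness change leaves the hash untouched:

          ```python
          def grid_hash(pixels, grid=8):
              """Toy perceptual hash: average-pool a greyscale image into a
              grid, then emit one bit per cell (1 = brighter than the mean)."""
              h, w = len(pixels), len(pixels[0])
              cells = []
              for gy in range(grid):
                  for gx in range(grid):
                      block = [pixels[y][x]
                               for y in range(gy * h // grid, (gy + 1) * h // grid)
                               for x in range(gx * w // grid, (gx + 1) * w // grid)]
                      cells.append(sum(block) / len(block))
              mean = sum(cells) / len(cells)
              return "".join("1" if c > mean else "0" for c in cells)

          # A made-up 16x16 greyscale "image" and a uniformly brightened copy.
          img = [[(x * y) % 256 for x in range(16)] for y in range(16)]
          brighter = [[p + 40 for p in row] for row in img]  # ignoring clipping

          print(grid_hash(img) == grid_hash(brighter))  # True: hash survives the shift
          ```

          Every cell average and the overall mean shift by the same constant, so the brighter-than-mean bits come out identical.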

    2. DS999 Silver badge

      One big flaw with your plan

      With modern hashing algorithms, deliberately creating a hash clash is computationally infeasible. The reason we long ago abandoned older algorithms like SHA-1 is that they either had weaknesses or increases in computing power rendered them obsolete.

      1. FF22

        Re: One big flaw with your plan

        If the system is really using SHA or similar hashes on exact byte streams, then you're right about the practical impossibility of generating colliding hashes. But then the system is also utterly useless for what it's designed to do, because a simple recoding of the pictures into another format, slight changes to gamma or aspect ratio, or even saving with a different compression setting will all change the bytes of the image and thus evade detection.

        However, it's more likely that when Apple refers to "hashes" they actually mean some kind of fingerprinting technology that works not on explicit byte streams but analyses the contents and composition of the image (Apple is only using the term "hashes" because "fingerprinting" would cause confusion and associations Apple doesn't want to foster). In that case recoding or distorting the picture will not hinder detection, so the system is generally fit for its supposed purpose, but the hash-collision considerations no longer apply either, and it will be a lot easier to find, or even generate, images that the system falsely identifies as matches.

        The latter also means that no defence lawyer or court will allow anyone to be prosecuted just because a matching "hash" was found on their iPhone, and cops will have to somehow retrieve the actual images in order to prosecute someone. That, however, will clash with the precedents set by Apple prior to this and its refusal to unlock the devices of suspected criminals.

        Either way, it looks like this whole child protection initiative from Apple is either a cover operation for something more sinister, designed to enable scans/searches way beyond just child abuse images (which would make a lot of sense), or Apple has again proved incompetent at addressing a technical problem and shot itself in the foot by failing to assess the backlash this would actually generate.

        1. Chris G

          Re: One big flaw with your plan

          @FF22

          I think you are correct, Apple are just using CP as the thin end of a potentially very large wedge.

          Perhaps they have been leaned on heavily by TPTB to find a less obvious backdoor for them and this is a toe in the water.

          Personally, I think it may turn out to be a big mistake depending on how the media handle this.

    3. hoola Silver badge

      Eventually. In the meantime, thousands of innocent people will be fighting Apple's closure of their accounts and their being reported to the authorities. Those authorities tend to come down with a sledgehammer, assuming that everyone involved is automatically a child-porn miscreant. Only after months or years do they eventually admit that there is no case.

      Despite "innocent until proven guilty", these sorts of things have a habit of becoming public, revealing identities, so that those affected have their lives destroyed.

      The entire thing is so open to error, abuse and incompetence it is a disaster waiting to happen.

      Remember that the individual is powerless to do anything to mitigate against this. Just because Apple are first does not mean that other providers will not do the same under the "think of the children" banner.

    4. Anonymous Coward
      Anonymous Coward

      Hang on. I certainly do not want Apple, Google, AWS, FBI, Scotland Yard, Flensburg und das BKA* to go through my files on my device by any means. As someone else said in this thread, "my device, my data".

      But why on earth would you want to break a system that allows law enforcement to filter out and find potential CSAM without having to manually go through every single image on a device? Naturally no-one is going to be convicted** on a hash alone.

      The way to solve this is by privacy regulation and vocal objections from customers and experts; not by breaking the tools available to prevent child exploitation.

      *) Go listen to some old Kraftwerk tunes. You haven't done that in a while. You've earned it.

      **) At least not in the legal system; I can't vouch for the court of public opinion.

      1. Richard 12 Silver badge

        An accusation will destroy you

        It doesn't matter that you're totally innocent, your DBS record will include it so you can't work in your chosen profession again ("hearsay" is explicitly included), the media will report it and gossip will convict you.

        A great many people have been driven to suicide by false allegations, and even more have had their entire career destroyed forever, losing their home and all their assets in the process.

  9. Anonymous Coward
    Big Brother

    Backdoors

    Governments have been after Apple to create a government controlled backdoor to combat the flavor of the month, currently child porn.

    This is Apple's response - they will create a backdoor but it will be controlled by Apple and the government will get all the results.

    Of course different governments have different priorities and, given Apple's actions in China, they seem to be willing to bow to government pressure.

  10. Anonymous Coward
    Anonymous Coward

    Room 101 awaits

    But remember, Big Brother loves you

  11. Anonymous Coward
    Anonymous Coward

    "In my opinion, there are no easy answers here," wrote Stamos

    Although, to be truthful, he started off with one. An answer so easy, it's cliche.

    "Apple will refuse any such demands". Sure they will, until the CxO level offices start having the Monday morning coffee meeting in the prison cafeteria. Attendance will be mandatory, and the newest prison ink every week wins a door prize.

    This whole mess is sooo reminiscent of the storyline in Terminator 2 to destroy the Cyberdyne chip so Skynet can never be built. It almost feels like a similar tipping-point right now, just without the cool special effects.

    1. DS999 Silver badge

      They refused FBI pressure before

      Why do you think they'd cave now? If you say "well if they have this thing in there they could expand it to do more stuff" sure that's possible. But they (as well as Microsoft, Google, Facebook, Twitter, Oracle, etc. etc.) could have added that "more stuff" to their software years ago and simply not told anyone.

      Why does announcing the (soon to be) existence of this make it more likely they will surreptitiously add greater capability, when the option of surreptitiously adding that greater capability always existed?

      1. Paul Crawford Silver badge

        Re: They refused FBI pressure before

        when the option of surreptitiously adding that greater capability always existed?

        Sure, but this is a custom system just for it and now nobody will question the processes and network traffic arising from it.

        But ask yourself the simple question: "why on the phone?" If it is to protect the cloud service and is ONLY ever done on syncing photos, why not scan there and avoid the whole privacy blow-up? It is widely known that iCloud is not end-to-end encrypted and has already been handed over on demand (as most cloud services will do). So why the extra network traffic for sending hash/fingerprint info, and all of the battery life implications that go with client-side services?

        Either there has been a monumental cock-up in Apple's thinking, or they have some other motivation for doing it on your phone.

        1. albaleo

          Re: They refused FBI pressure before

          "If it is to protect the cloud service and ONLY ever done on syncing photos, why not scan it there and avoid the whole privacy blow-up as it is widely know the iCloud is not encrypted and has already been handed over on demand"

          That's not my understanding. I read repeatedly that everything stored on iCloud is encrypted. My understanding is that's why they are to hash pictures before they leave the device - so they can continue to keep encrypted content on iCloud. Can you point me to something that says I'm wrong about that?

          1. albaleo

            Re: They refused FBI pressure before

            It seems I am wrong about this. I've just read that while iCloud files are encrypted, Apple has a key to unlock them.

            1. TimMaher Silver badge
              Devil

              Re: Apple has a key.

              Not if you encrypt them yourself before transmission.

              1. Paul Crawford Silver badge

                Re: Apple has a key.

                But in that case it is not Apple's problem? Any request for decryption comes back to the phone's owner so at least they know their stuff is being looked at.

      2. Chet Mannly

        Re: They refused FBI pressure before

        "They refused FBI pressure before"

        Hard not to think this was a quid pro quo for that - we won't give you access to the device, but we'll scan it all for you and let you know if there's anything bad/relevant on it. After all if they find CP material they will notify the FBI straight away.

  12. The Central Scrutinizer

    Apple says they will never cave in to governments for surveillance.

    That's OK then.

    What could possibly go wrong?

    1. StrangerHereMyself Silver badge

      They already have, so it's a pointless remark.

  13. YetAnotherJoeBlow

    So...

    "Alex Stamos, director of the Stanford Internet Observatory and former CSO of Facebook"

    Alex who?

  14. YetAnotherJoeBlow

    If I sell, or force on people, a secure system that I can break into or abuse in any way, then I am a conman, simple as that.

  15. lglethal Silver badge
    Go

    If Apple were really interested in catching paedophiles and users of child porn, it's truly super that they've announced this well in advance, with information about exactly how it works, so that all of those people who keep child porn on their Apple devices now know to go out and buy non-Apple devices for all of their child porn antics.

    Yep, very effective at catching miscreants when you announce in advance when and how you're going to try to catch them. I don't doubt they will catch one or two small fish - there are always idiots out there. But mostly they won't catch a damn thing.

    But then maybe that is the whole point - maybe there's been some power play behind the scenes where governments have threatened to take over iCloud (in the name of "protecting the children", of course), and Apple is just pre-empting that by driving away all the paedophiles before the governments could make their move?

    1. werdsmith Silver badge

      On the face of it, forcing all the pervs and pedos onto rivals is no bad thing for Apple. If you have products that offer strong security, they might be attractive to criminals, leading to your products becoming known as the ones favoured by them - which would make them less attractive to the majority innocent market. This move by Apple pushes the stigma onto rivals.

  16. tip pc Silver badge

    Match existing hashes or try and understand what's depicted in the photo?

    From the initial press release I got the impression that every photo was evaluated to see if it depicted a scene of csam.

    From the faq, it reads like it’ll check photos to see if it matches existing csam.

    iMessage checks photos to see if it contains nudity which is different.

    Anyone have a link to analysis which definitively confirms if it only matches against existing csam or if it tries to understand if your photo is csam that has never been observed or evaluated before?

    1. Anonymous Coward
      Anonymous Coward

      Re: Match existing hashes or try and understand what's depicted in the photo?

      From the initial press release I got the impression that every photo was evaluated to see if it depicted a scene of csam.

      Correct. Every photo (that is going to be uploaded to iCloud) has its hash calculated on the phone. This hash is compared, still on the phone, with the hashes in the CSAM database that has been downloaded onto the phone.

      From the faq, it reads like it’ll check photos to see if it matches existing csam.

      Again correct - but by comparing hashes not actual images files.

      iMessage checks photos to see if it contains nudity which is different.

      That's a different capability that's also being rolled out by Apple. This checks 'selfies' for nudity and pops up a warning when they are shared (for phones that are on a family account). The warning (but not the picture) is also presented to the lead phone on the family account. The intention is to reduce the incidence of children sharing nude pics - either deliberately or because they are being coerced.

      Anyone have a link to analysis which definitively confirms if it only matches against existing csam or if it tries to understand if your photo is csam that has never been observed or evaluated before?

      https://www.apple.com/child-safety/

      The official Apple pages are clear enough. The phone can only compare hashes and Apple are not in charge of creating hashes - they come from NCMEC who generate them on their own computer infrastructure. So, no, the phone is not trying to determine if a picture is new CSAM, only whether it matches existing CSAM from the NCMEC archive.

      That archive is continually being updated of course and the updated hashes will be regularly downloaded to iPhones.

      Additionally - and this is the basis of many people's concerns - there is no way of knowing whether a second database of hashes has been sent down to the phone where those hashes are provided by $agency and the matches on the second set are sent to $agency.
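
      Stripped of the cryptography (Apple's published design wraps this in private set intersection and threshold secret sharing, so the server can't read individual results), the client-side flow described above amounts to something like this sketch - the hash values and the threshold are invented for illustration:

      ```python
      # Hypothetical sketch only: a real deployment uses private set intersection,
      # not a plain set lookup, and the threshold below is made up.
      REPORT_THRESHOLD = 30

      known_hashes = {"a1b2", "c3d4", "e5f6"}  # downloaded hash DB (toy values)

      def make_voucher(photo_hash: str) -> dict:
          """Attach a 'safety voucher' to every upload, matching or not."""
          return {"hash": photo_hash, "matched": photo_hash in known_hashes}

      uploads = ["zzzz", "c3d4", "9999"]            # hashes of photos being synced
      vouchers = [make_voucher(h) for h in uploads]
      matches = sum(v["matched"] for v in vouchers)

      # Human review would only kick in once the count crosses the threshold.
      print(matches, matches >= REPORT_THRESHOLD)  # 1 False
      ```

      Nothing in a flow like this stops a second hash list being slipped into known_hashes, which is exactly the concern.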

      1. confused and dazed

        Re: Match existing hashes or try and understand what's depicted in the photo?

        Why does the compare have to happen on the phone then ?

      2. tip pc Silver badge

        Re: Match existing hashes or try and understand what's depicted in the photo?

        From the initial press release I got the impression that every photo was evaluated to see if it depicted a scene of csam.

        Correct. Every photo (that is going to be uploaded to iCloud) has its hash calculated on the phone. This hash is compared, still on the phone, with the hashes in the CSAM database that has been downloaded onto the phone.

        The official Apple pages are clear enough. The phone can only compare hashes and Apple are not in charge of creating hashes - they come from NCMEC who generate them on their own computer infrastructure. So, no, the phone is not trying to determine if a picture is new CSAM, only whether it matches existing CSAM from the NCMEC archive.

        So I missed the word "known". I had assumed it would evaluate every photo and try to figure out whether it was CSAM (new or existing) or not.

        The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.

        That says every photo uploaded has this voucher added. You will never know what the voucher says. You will never know if your photo was one of the one-in-a-trillion false positives and some stranger checked it.

        I don't like it.

        I never want children to be subjected to CSAM. I've still not been able to finish watching Save Me on Sky; I found it too upsetting.

        Not sure this is the answer though; the scum who do this will just use a different set of devices and online services.

        By the time a photo is taken, it's too late.

        1. Anonymous Coward
          Anonymous Coward

          Re: Match existing hashes or try and understand what's depicted in the photo?

          This "system" opens a whole new line of scams. We've had cryptolockers and such; next there will be "cryptounlockers" that threaten to deposit nasty material into people's iCloud if they don't pay. Just wait and see, it will happen. It will be interesting when it does, and those people sue Apple for ruining their lives with this kind of system.

  17. mihares
    Mushroom

    Think about it the next time you vote

    Governments, even though it has very recently been proven not to be necessary, are pushing for mass scanning of user devices, because they see that it's easily done - think of how many mischief-makers they'd catch if they were allowed to do so!

    They don’t give a damn about privacy if busting it is in theirs and not in Facebook’s or Apple’s interest.

    This mechanism will be rolled out in the EU to implement the famed copyright filter, which has already been mandated by the Union and is awaiting uptake in the various member states.

    Apple was sick and tired of denying (or obliging) requests along the lines of "give us access or we'll tell everyone you defend child molesters". And this is the result.

    The "good thing" is that the machinery is based on a feature extractor that happens to sit on a device owned by its adversary, so it's just a matter of time, I hope, before some hack finds a way to break it or, more probably, DoS it with a class of adversarial examples.

    Nonetheless, this was an answer to a request that is done all over the western world by the governments and that is: easy, gratuitous snooping on everyone’s device.

    Contrary to China, most of us can and do elect our representatives: watch what you vote for next time, because this shit was asked of Apple by them.

    ~~~

    It comes down to trusting Apple: yes, of course. But it was already that way, since the iSoftware is very much proprietary and very closed. This is a different but related problem: there are fewer mobile OSes than there were car brands in the DDR, so the choice now is, effectively, having your pictures scanned by Apple or being keylogged by Google.

    Or send them all to the deuce and have a dumb mobile phone, a tablet PC running GNU+Linux and a desktop workstation doing the same.

    1. Doctor Syntax Silver badge

      Re: Think about it the next time you vote

      Or a phone running LineageOS?

      I don't know about Google key-logging, but I'm not happy about the way a recent update has resulted in all the F-Droid-sourced apps being moved off the favourites page every time it restarts - which, with my frequency of failing to recharge, is fairly often.

      1. Chet Mannly

        Re: Think about it the next time you vote

        "Or a phone running LineageOS?"

        Yep, it will be rooted Androids running open-source/modified builds as the only way to get around that. Wonder how long before governments then ban unlockable bootloaders...

    2. confused and dazed

      Re: Think about it the next time you vote

      The problem with our versions of democracy (or at least the UK version) is that we select parties. This means we get to choose between set of policies A or set of policies B every 5 years... that's it. Even if the parties are different on the issue that you care about...

  18. mark l 2 Silver badge

    Apple's PR seems to contradict itself. They claim it's matching hashes yet can detect when an image has been edited or changed, and these two things are not compatible: if just a few bytes of an image are changed it will have a completely different hash, and unless Apple has run each image in the database through every possible pixel change, resize, etc. and created a new hash for each edit, matching hashes will never work.

    So it must be some sort of AI image-recognition technique they are using, in which case I dismiss their claim that there's a 1-in-a-trillion chance of false matches. I wonder how long it will be before someone who's over 18 gets their nude images falsely flagged as illegal and passed over to the FBI? Even if on examination the police see it's a false match, it still means some officer is going to be viewing someone's private photos without their consent to verify them.

  19. TRT Silver badge

    Did I see on another news story...

    that it was for images that are SHARED using iCloud? There is a difference. One story said "uploaded" to iCloud, the other said "shared using iCloud".

    Now I can see why Apple wouldn't want to be DISSEMINATORS (publishers) of CSAM, or indeed ANY illegal visual material... protecting their own backsides in a legitimate and ethical manner is an expected activity for a business to engage in. Not so sure extending the "Think of the children!" defence to stomping all over any and all remaining vestiges of the expectation of privacy is.

  20. Rich 2 Silver badge
    Big Brother

    Tank man

    So, what's the hash for that Tiananmen Square photo then?

  21. Paul Smith
    FAIL

    It is *not* your device

    A lot of the outrage here seems to be based on the mistaken assumption that you own your phone. At best, you have a licence to use the physical manifestation of it. The OS, the apps, the infrastructure, and all the things that make the lump of plastic and metal useful belong to somebody else, and yes, they can do what they like with it. If this angers you, don't take it out on me; just go back and read the terms and conditions that you accepted.

  22. DrBobK

    No one mentions US legal requirements to identify CSAM on cloud servers.

    To quote Forbes: "By law, American companies have to report child abuse and exploitation imagery on their servers to NCMEC, which then works with law enforcement on an investigation. Other tech giants do the same when emails or messages are sent over their platforms. That includes Google, Microsoft and Facebook."

    1. Anonymous Coward
      Anonymous Coward

      Re: No one mentions US legal requirements to identify CSAM on cloud servers.

      > To quote Forbes: "By law, American companies have to report child abuse and exploitation imagery on their servers to NCMEC

      That's true. So Apple must already be complying. So why is a new approach required?

      So many questions, so few answers - it's almost as if Apple and the relevant law enforcement authorities aren't being completely open and honest with us. That's okay - the authorities know best.

  23. esque

    Let's see:

    How many apps on the Apple app store are junk? Did you check? The answer might surprise you.

    There's a lot more malware on the Apple app store than you might think. Apple is just trying very hard to play it down.

    Nobody is denying that Google is trying to gobble up all the data. But this is known and accepted, and it allows people to make informed decisions and to do something about it.

    The false narrative that Apple is all about privacy, when in fact they're not, is lulling Apple users into a false sense of safety when they should be cautious.

    Also, on Android I have the freedom to not use any Google services. Try to get rid of Apple on an iOS device.

    And finally: If an Android device doesn't get any support from the manufacturer we are still able to install aftermarket systems like LineageOS, Sailfish or others. Try doing the same on an iOS device that Apple decides not to support anymore.

  24. Wade Burchette

    My concern

    This kiddie porn angle is the wedge needed to break open the dam. Start with something that almost everyone agrees is horrible, and then slowly expand that. Start with the child porn, and it will end with Apple blocking your device for wrong thinking. "You shared 'misinformation' about the vaccine. We are blocking access to your phone because obviously you want to kill grandma." Either you stop it now, or it will be expanded to blocking stuff that just Apple thinks is horrible.

    There is a precedent. Facebook, Twitter, and Google are already blocking "misinformation" with religious fervor. Whether you think it is misinformation or not is not the point. The point is that wrong thinking is censored instead of being exposed with verifiable information that it is incorrect. You won't convince the original person who posted the "misinformation", but you may convince those who read it.

    1. anonymous1726

      Re: My concern

      And it allows foreign intelligence agencies to easily frame *anybody* they want. How do we know North Korea or Russia hasn't planted kiddie porn on UK citizens' computers? If you're high profile and very outspoken about a foreign government, be afraid, be very afraid. This is especially frightening after the NSO Group revelations.

      1. cyberdemon Silver badge
        Devil

        Re: And it allows foreign intelligence agencies to easily frame *anybody* they want

        This.

        And what most people seem to be missing, is that this system could 'unexpectedly' throw up hundreds of millions of "violators/infringers" among the billions of IDevice lusers. Especially children/teenagers, who may have taken pictures of themselves in the mirror and not sent them to *anyone*.

        And especially anyone who has insulted The Powers That Be in their local jurisdiction, be it the FBI/CIA/RIAA/MPAA in 'murica and the west, or the CCP, or Mossad, the Belarusian regime or the Revolutionary Guard.

        There could be very few people who are not personally or by family/friend extension, affected by this.

        Obviously Apple could not possibly take action against every single one of these people - it would hurt their bottom line.

        So who decides who gets priority consignment to the Gulag? Apple?

        What's the price to put someone on the suspect list?

        Easy to arrange for an app to dump dodgy material on local storage too. Even if they don't have many apps, just send them a WhatsApp message with something nasty embedded in an unviewable portion of an innocuous cat video, perhaps.

  25. Ashto5

    Time to get rid of smart phones

    When is a smart phone no longer a smart phone, when it works against you!!

    I have been on the receiving end of the police with a speeding image and the prosecution that follows: you're guilty, end of story.

    That was until the day before, when they sent me a copy of the image and I pointed out that my reg has a B where the image has an 8. No apology, nothing - they just hung up.

    A very narrow escape, but that's what happens: AI/ML is not infallible, and innocent people will be prosecuted under the threat of "admit it or the sentence will be loads worse".

    See UK post office scandal if you don’t think that happens.

    Time for a basic phone methinks

    1. Anonymous Coward
      Anonymous Coward

      Re: Time to get rid of smart phones

      You make a good point. Considering my iPhone 12pro cost me £1000 and I can get a Nokia 3310 for £30 .... it is tempting. I guess that leaves me £970 to buy a decent camera !

  26. Anonymous Coward
    Anonymous Coward

    That's me done

    I've been using Macs for 20+ years and iPhones since launch, but this high-handed virtue-signalling BS is it for me. How dare they say they can use my phone against me. It's mine; it's not on loan for me to have a share of when it's not too busy spying and reporting back.

    I expected this sort of thing from Android, which was why I was willing to pay the Apple tax in the first place .....

    I have nothing to hide in the real or virtual world, but hands up if you'd be happy with a company kicking your door down in the middle of the night to search for evidence to the contrary ....

    1. Alumoi Silver badge

      Re: That's me done

      I have nothing to hide in the real or virtual world...

      That's what you think. A careful examination of your life (real or virtual) will stumble on something that might not be politically correct.

  27. IGotOut Silver badge

    Am I missing the obvious (along with everyone else)

    If it's only scanning images on the phone before they get uploaded to iCloud... why not just scan them as they ARRIVE on iCloud?

    That is Apple's service, and they can easily say "we don't want this on our servers".

  28. Anonymous Coward
    Anonymous Coward

    I still think this is all wrong

    And nobody should be prosecuted for viewing or possession of any form of information, unless they are distributing it or committed hands on abuse. Anything else is a hallmark of a police state, with potentially disastrous consequences for our liberty.

    1. Throatwarbler Mangrove Silver badge
      Holmes

      Re: I still think this is all wrong

      Whether you agree with it or not, the goal is to identify the consumers of child pornography and thus both reduce the consumption of it and potentially identify the originators, reducing the sexual abuse of children, which we hopefully all agree is a good thing. On the one hand, I agree with the principle that simply being in possession of child pornography is not prima facie evidence that you yourself are a pedophile. On the other hand, there are not a lot of other compelling, legitimate reasons to be in possession of it, so I can see why the authorities would take interest in someone who has it.

      1. Anonymous Coward
        Anonymous Coward

        Re: I still think this is all wrong

        The goal is good, but the results will be something they never expected.

        Let's see how you like the system after some nice cryptolocker drops a payload of nasty pictures into your iCloud. This system is a disaster waiting to happen.

      2. cyberdemon Silver badge
        Devil

        Re: I still think this is all wrong

        and the Road to Hell is paved with good intentions.

        Even if Apple's intentions are good (Ha!) this has a MASSIVE potential for future misuse by dictatorships, Copyright/IP trolls, etc. It is the epitome of the "thin end of the wedge".

        But to anyone with an ounce of insight into how the real world works*, Apple is using this as an impossible-to-argue-against (for you must be a paedoterrorist!) excuse to make their product more acceptable in places like China: Look, we can let you crack down on whatever you like.

        And to appease copyright trolls. "Here is a list of people with non-DRM copies of the album you just bought the rights to. Would you like their names and addresses so you can sue them all?"

        * aka Cynicism. Something which Positive Thinkers mistake for paranoia.

  29. anonymous1726

    Our phones should be working for us, not against us

    Apple could implement an opt-in system to locally scan all photos for CSAM. Instead of reporting them, it would tell you and give you the opportunity to delete them. That would be a much better compromise, although it's still a slippery slope.

    1. Throatwarbler Mangrove Silver badge
      Facepalm

      Re: Our phones should be working for us, not against us

      Strangely, opt-in programs for crime detection and prevention only catch the honest.

  30. MacroRodent
    Black Helicopters

    No upside for Apple, so... draw your conclusions

    As expected, Apple comes out looking bad in this, and I am pretty sure they knew it ahead of time. This means there must be mighty arm-twisting going on behind the scenes. My speculation: Apple has been told by the U.S. government in no uncertain terms to do something about child abuse pics, with the threat of legislation to force mandatory backdoors if they do not comply.

    1. confused and dazed

      Re: No upside for Apple, so... draw your conclusions

      You may be correct, but then why not say so?

      They look foolish, hypocritical and downright untrustworthy right now. Need I remind anyone that just a few short weeks ago they were shouting "what happens on your phone, stays on your phone".

      1. Throatwarbler Mangrove Silver badge
        Facepalm

        Re: No upside for Apple, so... draw your conclusions

        What does the phrase "gag order" mean to you?

      2. Tessier-Ashpool

        Re: No upside for Apple, so... draw your conclusions

        I can't speak for America, but in the UK, if a phone manufacturer is compelled to introduce technical means to get at private information, said manufacturer would be committing an offence if they disclosed that order.

        Thanks, Theresa May!

  31. xyz123 Silver badge

    Apple's wording is VERY precise.

    It says it won't bow to "governments", but if a Non-Governmental Organization asks, they can happily give them FULL 100% access to your iPhone, and the NGO can just stream the data to whoever they want.

    Think Cambridge Analytica on steroids.

  32. spoofles

    Apple has likely already done this.

    Hiding behind "But it's for the children..." is just low and unfortunately typical, since they believe this mollifies criticism.

    Citizenfour already showed us that Apple was among the usual suspects in Big Tech taking $$$ to grant access to customer data.

    “I'm not upset that you lied to me, I'm upset that from now on I can't believe you.”

    - Friedrich Nietzsche

  33. Boris the Cockroach Silver badge
    Big Brother

    Kiddie porn? how noble

    Give it 2 years after introduction (we'll not worry about "kiddies on the beach" pics getting you on the sex offenders register .. and your house burnt down.. with said kids inside) and China will come along with "Give us access to the app on Chinese iPhones or you can kiss goodbye to your Chinese profits"

    And instead of kiddie porn, the pictures searched for will be anything along the lines of "free Hong Kong", with the resulting offender taken away for re-programming in the benefits of communist party rule...

    Still, it's nice to know my Nikon D330 photos are safe from scanni.. oh, I'm using Windows to store them

    1. MacroRodent

      Re: Kiddie porn? how noble

      >Still, it's nice to know my Nikon D330 photos are safe from scanni.. oh, I'm using Windows to store them

      Better use a film camera to be safe... most of my pictures this summer were shot with a classic Asahi Pentax SLR.

  34. Anonymous Coward
    Anonymous Coward

    "a one in one trillion chance per year of incorrectly flagging a given account."

    Am I the only one who tends to think, when they see numbers like this, that someone has just found some random numbers, multiplied them together, and come up with a number they like and published it?

    They don't know whether any of the numbers are relevant, and they have no idea whether multiplying those numbers is the right function, but hey, it comes up with a number they like, so job done.

    1. Craig 2

      Re: "a one in one trillion chance per year of incorrectly flagging a given account."

      Also, how do the odds hold up when you multiply the number of devices by the number of photos taken by time? Sooner or later those odds are going to start getting to numbers you wouldn't want to bet your life on....
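      The worry above is easy to check with a back-of-the-envelope calculation. A minimal sketch, assuming (purely for illustration) that Apple's quoted 1-in-a-trillion per-account-per-year rate holds, that accounts are independent, and using made-up round numbers for fleet size and time span:

```python
from math import expm1, log1p

def p_any_false_flag(accounts: int, years: float, rate: float = 1e-12) -> float:
    """P(at least one falsely flagged account) = 1 - (1 - rate)^(accounts * years).

    Computed via log1p/expm1 so the result stays numerically stable
    even though `rate` is tiny.
    """
    trials = accounts * years
    return -expm1(trials * log1p(-rate))

# Hypothetical: a billion iCloud accounts observed for ten years at the quoted rate.
print(p_any_false_flag(1_000_000_000, 10))  # ~0.00995, i.e. roughly a 1% chance
```

      Even taking the quoted rate at face value, at fleet scale the chance of at least one wrongly flagged account stops being negligible; and that is before questioning the rate itself, as the comment does.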

  35. tip pc Silver badge

    the concept of csam is sickening

    By the time a photo is taken, it's too late.

    Societal norms need to be such that the concept of CSAM is abhorrent and it just does not happen. Most of us would never consider such a thing; systems need to be in place to make sure potential perpetrators find it abhorrent too.

    No idea how that happens, it just needs to happen.

    I'm not sure that the dominant privacy-focussed IT company telling its customers that every photo taken will be evaluated against CSAM is the way to do it, though; it'll just move the no-gooders onto other platforms and likely make them harder to detect.

    There were some current and recent Labour MPs who wanted this sickening behaviour to be permitted. I'm sure there is a wiki page detailing it.

    I'm so glad they are not in power.

    1. Anonymous Coward
      Stop

      Re: the concept of csam is sickening

      Citation or downvote.

  36. Anonymous Coward
    Anonymous Coward

    At it again

    Bringing up the subject of kiddie porn to stop people objecting. The idea will be "If you object to this, then you must be a pedo". This ignores the fact that you could grab any of the images they deem illegal and email it to that "fuck I've always hated". It gets automatically downloaded, they're too busy to look or notice, at which point it's now synced to their iCloud, and shortly after the FBI are knocking on their door.

    Not to worry, they're innocent so they'll get let off eventually. That's not the point; they shouldn't have been arrested in the first place. Now they have to live with the aftermath. "Did you hear old Tony got arrested for having kiddie porn on his phone? He got released with no charge. I bet he's still guilty though" and that, that is one of the massive reasons this is a mistake. People don't forget. Not to downgrade this argument, but the group of people that comes to mind are comedians. No matter how innocent you are, if you're in the limelight they'll never let you off; they'll still point the finger, much as they do to Cliff Richard. "Just banter innit". No it fucking ain't.

  37. Bartholomew
    Big Brother

    Apple's 1984 ad

    I remember at the time wondering which is Apple: the woman throwing the hammer at Big Brother, or Big Brother ...

    I guess with enough time they can be both.

  38. Marty McFly Silver badge
    Mushroom

    Isn't it obvious???

    iCloud is overpriced bravo-sierra storage. Don't put your stuff there.

    And if you do put your stuff there, don't be surprised if it is searched - EULA gives them permission and all.

    For the pedophiles out there... You deserve to get caught if you use an iPhone and store images in iCloud. Not like this is a surprise and no advance notice was given. You are an idiot for not being aware of what is coming and continuing to use the platform.

    Any pedophile worth their wanker already has their content stored off-line and inaccessible. This solves nothing. But, oh, the doors it opens for abuse by those who seek power & control...this is too good to pass up. And such a good 'excuse' this is.

  39. Anonymous Coward
    Anonymous Coward

    So….

    Just disable iCloud then?

    1. Trigun

      Re: So….

      Many (but certainly not all) people who use Apple devices tend to do so because they don't have to worry about the nitty-gritty of going through the settings. So they won't know about, and won't look for, the 'off' switch for iCloud.

  40. Kevin McMurtrie Silver badge
    Big Brother

    False statement of authority and accuracy

    It's all fine until you start claiming utter BS like "a one in one trillion chance" for false positives. If believed, that's the kind of statement that can put a lot of innocent people in jail. I dare Apple to prove that an iPhone or iPad can perform the hash with a one in one trillion rate of computational errors. I dare Apple to publish their algorithm for peer review. Can NCMEC claim they can process images with a one in one trillion chance for a mistake?

    Anti-virus software uses really big hashes, yet corporations are regularly idled when their computers stop working. There's more than one place to screw up.

  41. StrangerHereMyself Silver badge

    Hard way

    Apple will find out the hard way that people are much less interested in the sexual safety of children than their privacy.

    I also believe Apple will eventually go down over this management error, since they've painted themselves into a corner. Backtracking on their decision would imply their profits are less important than the safety of children. So they'll keep at it even when their sales are going south.

  42. Trigun

    Thin end of the wedge

    What's on my device is my business and private unless the police have "reasonable suspicion" and get a search warrant. This action by Apple, although possibly well intentioned, is the thin end of the wedge. No one has a problem with CSAM being detected, but the scanning *must* be on the cloud end *only*. Also, as others have pointed out, this can easily be extended to non-CSAM images & videos. Maybe audio? Looking at you, Amazon.

    Privacy is not dead, but it soon will be if this kind of thing is allowed to come to "fruit"ion. I fear what Microsoft and Google will try to crowbar into their operating systems and on-system services once an example is set.

  43. Anonymous Coward
    Anonymous Coward

    Data Science

    The NCMEC gets all these hashes of innocent images from innocent people, along with hashes of CSAM from other people - so they can find people who have some images in common with those non-CSAM images from the "other" people, and thus put the innocent people on a "suspect" list ... ?

    I'm sure there's a lot of interesting "data science" you could do with collections of hashes from people...
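    The kind of cross-referencing the comment worries about is trivial once anyone holds plain hash sets. A minimal sketch (all image bytes and user names here are hypothetical, and ordinary SHA-256 stands in for whatever perceptual hash the real system uses):

```python
import hashlib

def hash_set(images: list[bytes]) -> set[str]:
    """Reduce a user's photo library to a set of content hashes."""
    return {hashlib.sha256(img).hexdigest() for img in images}

# Two users' (hypothetical) libraries, reduced to hashes.
alice = hash_set([b"sunset-photo-bytes", b"shared-meme-bytes", b"family-photo-bytes"])
bob = hash_set([b"shared-meme-bytes", b"cat-photo-bytes"])

# Whoever holds both hash sets can see exactly which images the users
# have in common, without ever seeing the images themselves.
print(len(alice & bob))  # 1
```

    Simple set intersection over hash collections is enough to map who shares what with whom, which is exactly the "interesting data science" being alluded to.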

  44. Chatter

    Jeff

    I am in favor of a corporation in the US, Europe, Canada, or Australia conducting searches of information they have access to, so long as those searches are limited in scale; are ordered through request and authorized by a state supreme court; are monitored by their government as an oversight; are not specifically requested by the government, with the government not an active participant in the search; and positive results are fed directly to the national police. The national police must cooperate with all other nations' national police to advise the holder of the offending information, in writing, of what was collected; no immediate arrests, nor can the national police use the information in a court of law. The results are there and can be reviewed by all publicly elected politicians. They cannot be made public to any news agency of any kind. Violations of any of the above would mean a prison term of 6 to 8 years without parole in the state's maximum security prison.

    When server and network security is like a safe with four dial locks, and politicians have sufficient limitations that being a politician is not a career, we can discuss changing the rules.

    It is way too easy to get things on your computer that you did not put there, on all platforms, to say that an individual is totally responsible for the data on all of the storage devices they manage. That is if you can indeed say an average, reasonable citizen actively manages their data at all. On top of all of that, we have no useful computing devices that are not also connected to the internet.
