Apple says its CSAM scan code can be verified by researchers. Corellium starts throwing out dollar bills

Last week, Apple essentially invited security researchers to probe its forthcoming technology that's supposed to help thwart the spread of known child sexual abuse material (CSAM). In an attempt to clear up what it characterized as misunderstandings about its controversial plan to analyze iCloud-bound photos for this awful …

  1. elsergiovolador Silver badge

    Pear

    Wow the level of contempt from Apple is astounding.

    "Look it was reviewed by researchers! They say it's safe! What else do you want you stupid customer?"

    1. Lord Elpuss Silver badge

      Re: Pear

      Your characterisation is an insult to the value of a good vendor/security researcher partnership.

      1. Anonymous Coward
        Anonymous Coward

        Re: Pear

        At no time has anyone complained that the scanning code might be insecure. This is a deflection. It might well be insecure, but security is the least of the problems with what they're doing here.

        Obviously, Apple could simply do the scan on the device and put up a dialog telling the user what it's doing. There's no need to hide it, if the user and Apple have such confidence in Apple's system.

        "Mandatory child porn scan in progress. Please note, you consented to this search in the EULA when you installed the latest upgrade."

        "these 3 images have been detected as child porn and will be inspected by our staff, if in their opinion the images are illegal in nature, then they will be forwarded to authorities in your country. [X] add your explanation here:... [X] add other images you wish to be taken into consideration. "

        There, done.

        If they have confidence in their AI-based scanning code, then it should not be an issue to tell the customer what their software is doing behind their back. With or without the proxy consent of Corellium (as if any Apple user agreed that Corellium could consent on their behalf anyway!).

        What is the problem here? Apple keeps asserting it's all good and works perfectly, so there's no need for this weird deflection, pretending the problem is 'security', as if you're marketing against hostile users too stupid to see through that claim.

        1. Lord Elpuss Silver badge

          Re: Pear

          You’re creating strawman arguments. There’s no deflection, and Apple making the code/process open to researchers isn’t just about security; it’s about promoting understanding of how the process works, in an attempt to reassure both users and research professionals that nothing untoward is going on ‘under the covers’.

          Personally I have no doubt the process will be secure and will work as Apple intended. My PROBLEM is that even when working as described and intended, I find the process highly intrusive. It gets a solid ‘No’ from me, even without any concerns regarding misuse.

          1. Henry Wertz 1 Gold badge

            Re: Pear

            Yeah, there is deflection. The concern is about invasion of privacy, and that the neural network could misidentify things. (Edit: I see it's using hashes rather than a neural-network-type setup.) They have deflected to "here, check out our source code and look for security flaws".

            1. Lord Elpuss Silver badge

              Re: Pear

              Nope. It's not deflection, it's broad-scope analysis beyond simple security. It's about understanding the technical framework, evaluating against current objectives and validating the design principles against misuse. Code security is just one component of this; go back and read the PDF again.

    2. Anonymous Coward
      Anonymous Coward

      Re: Pear

      I propose that El Reg ask Apple for permission to audit its code.

  2. sqlrob

    Look, Squirrel!

    The client can be 100% secure and do everything it says on the box. Unless this also includes auditing how hashes get into the system AND keeping that audit 100% up to date, it's really kind of pointless and doesn't prove much.

    1. DS999 Silver badge

      Re: Look, Squirrel!

      If Apple or someone else was able to surreptitiously add hashes that would identify other images, what problem do you see?

      Is there some other class of known images that Apple could want to check for on the sly that would be detrimental? It isn't like this system can identify new images, all it can do is match to existing ones, so it is really hard to see how this can be abused. So Apple adds a hash of a known image of the Taj Mahal (not ANY picture of the Taj Mahal, only one that's near identical to a specific preexisting one that the hash will match) and finds out who is passing around that particular picture? For what end?

      The other possibility I guess you could worry about is that some bad actor third party could add hashes - though if they can do that they probably can modify iOS itself in which case you're fucked no matter what! But let's say all they can do is add hashes. So they could add hashes for common internet memes and create a huge number of false positives, overwhelming Apple's ability to manually verify that they are really CSAM. That would bring down their ability to identify CSAM, or in other words put us exactly where things stand now.

      Am I missing something? Can you come up with a scenario where what you suggest would be harmful?

      1. doublelayer Silver badge

        Re: Look, Squirrel!

        "Can you come up with a scenario where what you suggest would be harmful?"

        A repressive country, the Democratic Republic of Tyranny, has a protest. People take pictures during the protest and share them with those in other areas. People in those other areas see that they are not alone in their displeasure with the government, and the government feels that protests are likely to occur there. The DRT government tasks a group with collecting those images wherever they have been shared. It tries to block those images in their censorship system, but at least it can't track down those who have it. Enter Apple's system. The DRT government sends the hashes of those images to Apple and gets a report including the identities of all people whose devices contain that image. That would include the person who originally took it (was at protest, definitely guilty of high treason), the people who sent it to others (promulgated information contrary to the government, also high treason), and anyone who received a copy and retained it by choice or chance (just normal treason).

        The DRT would have several ways to add this to Apple's system. The easiest would be to call them up and tell them they had to put in the image. If they called the wrong number and got someone who would complain or, it's imaginable, refuse, they threaten to confiscate Apple's assets and cut off its business; Apple quickly caves. However, there is a subtler method. The country likely has some police system which investigates child abuse, or at least a police organization which can pretend to investigate it. They submit the hashes saying that they are abuse material. If Apple includes them, the DRT gets what it wants. If Apple doesn't include them, the country can go out in public and accuse Apple of being biased and of failing to protect children when given information to track; Apple quickly caves.

        1. David 132 Silver badge

          Re: Look, Squirrel!

          Excellent example, but I have to point out that ever since the Glorious Proletariat Revolution of '77, the country's official name is the Socialist People's Democratic Republic of Tyranny. People who question the validity of that name are given complimentary re-education and training at the People's Happy Learning camps in the interior of the country.

        2. Anonymous Coward
          Anonymous Coward

          Re: Look, Squirrel!

          Democratic Republic of Tyranny

          Funny, we're still called U.S.A.

        3. DS999 Silver badge

          Tank Man

          OK, so I could definitely see China wanting to block circulation of that famous image of the man standing up to a tank in Tiananmen Square. Because there's basically just the one iconic image in existence, it would be easy to match with a hash-based system.

          Imagine if a similar protest happened today. Thousands of people in the crowd, every single one of them carrying a camera. There will not be just one iconic picture; there will be thousands of pictures and videos of the event. It would be whack-a-mole trying to block them all, because there will always be another person coming out of the woodwork who took a picture that hasn't been shared widely yet. The hash-based system totally fails here, because they are all different photos - and that doesn't even get into whether such a hash-matching scheme could work for video.

          1. doublelayer Silver badge

            Re: Tank Man

            No problem. If the group finds a thousand images which have been widely shared, that gives them thousands of targets who took the pictures or stored them. Let's say they only succeed in finding a hundred of them. That's enough people to achieve several goals:

            1. At least a hundred people who took pictures and shared them is a hundred dissidents who can be removed.

            2. Those hundred can be questioned to find more. Some will comply with questioning.

            3. A hundred is large enough that people will notice that the government was able to track them down. That's a good advertisement that protesting can end badly for you.

            Even if there are more pictures, that gives them quite a large head start. If there are, they can add them to the filter later when they are found.

            1. DS999 Silver badge

              Re: Tank Man

              How are you assuming they can find out who took and shared the image? They can only match against images that have already been hashed, so an image first has to be identified as "here's an image we don't like". So if you have a copy of such an image, and send it to me, they can know I have it. But they don't know who I got it from and certainly don't know who originally took it.

              Being able to trace back to the source is certainly possible but this hashing scheme doesn't help that process at all. It only helps stop the spread once they have found the image in the first place.

              1. Falmari Silver badge

                Re: Tank Man

                @DS999 "Being able to trace back to the source is certainly possible but this hashing scheme doesn't help that process at all. It only helps stop the spread once they have found the image in the first place.Being able to trace back to the source is certainly possible but this hashing scheme doesn't help that process at all. It only helps stop the spread once they have found the image in the first place."

                Of course it helps the process; it is the very start of the process. The hash identifies an image which may have been posted anonymously, but now they know the account details, and therefore the identity, of those who have sent it to Apple's cloud. From those who have been identified they can backtrack and maybe find other images to hash.

              2. doublelayer Silver badge

                Re: Tank Man

                It gives them a list of people who have the image. That likely includes the person who took it (sort by date uploaded, pick the first). However, even if it doesn't, they'll be happy to target those who received it as well, who could, under questioning, disclose the person who sent it to them. If the source of the image is their primary target, it's just traversing a tree. Since those who received the image are probably also targets, it's traversing a graph. Even if the source evades discovery, there are lots of others who won't.

    2. Anonymous Coward
      Anonymous Coward

      Re: Look, Squirrel!

      Like "Certificate Transparency" on the internet?? Auditing is not transparency.

      They swapped Certificate Pinning, which forced the browser to reject a cert being swapped in for the actual *real* cert, with a system where your browser reports every site it visits to Google and Cloudflare and Digicert.

      And they don't fooking check the certs, they just say "yeh, the cert authority sent us a copy of that cert it's sending you". So they fingerprint your browser, log your IP, and from the hash of the cert they determine which site you visited and log it for their records.

      A security measure turns into a privacy attack, while you were not looking.

      80 million-plus certs this year alone - are there 80 million new websites? No. Do they check those 80 million certs? No! It's impossible to know whether the cert authority issued them correctly; they only see the cert, not the data used to get it issued.

      Did those certs intercept TLS connections? Are they being used as an attack on encrypted traffic by governmental agencies abusing some mass-collection warrant? They have no idea, because the content was never checked.

      Auditing is not a solution here.

      Tell the users if your algo flags their images. The only auditing that counts is the users'.

  3. Doctor Syntax Silver badge

    How do you audit a precedent?

    1. Lil Endian Silver badge

      Yes, it's a diversionary tactic to bypass scrutiny of the precedent being set.

  4. cornetman Silver badge

    They don't really address how they are going to handle governments making them use the technology, once it is up and running, to bend it to their own ends.

    Like scanning for distributed pictures of Winnie the Pooh or whatever is the demon du jour in the Western world.

    1. elsergiovolador Silver badge

      Rug

      This is the equivalent of a rug company sending a Roomba equipped with sensors to sweep your house looking for traces of drugs and then reporting you, and we are now at the point of discussing whether the Roomba will start collecting DNA samples from the rug, instead of rejecting the whole idea altogether.

      1. cornetman Silver badge

        Re: Rug

        Don't get me wrong. This is an awful idea and Apple are going to regret going down this path.

        I would be interested to know where this came from originally. I cannot believe they are so stupid that they didn't realise how this tech would be bent to the ends of the likes of China, at the very least. It will be a case of do it, or you don't sell in China.

        1. Anonymous Coward
          Anonymous Coward

          Re: Rug

          @cornetman

          I would suggest they have copied it from Google, Facebook and Microsoft (see link). The only difference being, Apple is open about it despite the shitstorm they must have known it would bring down on them.

          Can I point out that despite using Mac computers, I have never had an iPhone. Never had a smartphone of any brand. I have a perfectly good original Nokia 3310 and when I am out with my mates, I sure as hell don't need the internet with all its bollocks of interruptions.

          Repost https://www.bbc.co.uk/news/technology-58206543

        2. tip pc Silver badge

          Re: Rug

          “ I would be interested to know where this came from originally”

          For a business that has continually got things right, certainly over the last 10 years (see share price, dividends and profits), it does seem strange that they’ve screwed the pooch sooooooo badly (what a terrible phrase).

          If it was obvious to me it was obvious to Apple that it was a terrible idea.

      2. Anonymous Coward
        Big Brother

        Re: Rug

        Drugs are passé.

        The vector will be spousal and child abuse, with the FBI 'requesting' that the Roomba search for and report on any blood traces it finds.

      3. Anonymous Coward
        Anonymous Coward

        Re: Rug: We've been here before.

        https://en.wikipedia.org/wiki/Stasi.

      4. Alumoi Silver badge

        Re: Rug

        Damn it, man! You were not supposed to tell them!

    2. DS999 Silver badge

      This system can't scan for "Winnie the Pooh"

      Only specific images of Winnie the Pooh. If there's a meme image circulating they could identify that, but if there are hundreds of different memes it would only match the ones they have hashes for.

      1. mark l 2 Silver badge

        Re: This system can't scan for "Winnie the Pooh"

        Not according to Apple's PR dept, who claim their magical technology only uses hashes of photos, yet can detect similar-looking photos or ones that have been edited. I say those two claims are not compatible, and it must be analysing the photos using AI to pattern-match them, rather than just comparing hashes to see if they match known abuse images.

        Plus, why does this need to be done on device? If it's only for photos uploaded to iCloud, why not just scan for the photos when they hit Apple's servers and leave the privacy in place on the device?

        1. Anonymous Coward
          Big Brother

          Re: This system can't scan for "Winnie the Pooh"

          Doing it on device means that images can be encrypted in the cloud: only on the device must they be decrypted. If Apple can decrypt them in the cloud then that is a backdoor: the sort of thing governments would like.

          1. FILE_ID.DIZ
            Facepalm

            Re: This system can't scan for "Winnie the Pooh"

            The iCloud data is encrypted at rest, but with Apple's encryption key.

            Source - https://www.apple.com/legal/privacy/law-enforcement-guidelines-us.pdf#page=11

            1. Graham 32

              Re: This system can't scan for "Winnie the Pooh"

              "can be" being the important phrase. Some journos have suggested this csam move is a precursor to full iCloud encryption with user-owned keys.

        2. Irongut

          Re: This system can't scan for "Winnie the Pooh"

          > why does this need to be done on device, if its only for photos uploaded to the icloud, why not just scan for the photos when they hit Apples servers and leave the privacy in place on the device?

          Your suggestion is allowing the security to gallop out of the open stable door. Plus it would mean Apple handling CSAM and having it on their servers; by scanning on device they can prevent the images getting to any Apple-owned equipment and prove the images were in your possession. Something that may become important when it comes to a court case.

        3. DS999 Silver badge

          Re: This system can't scan for "Winnie the Pooh"

          Not according to Apples PR dept, who claim their magical technology only uses hashes of photos, yet can detect similar looking photos or where its been edited

          You aren't understanding what they are saying, or what the technology is capable of (see https://www.apple.com/child-safety/ - there is a lot of information available).

          It can match the same photo if it has been modified (cropped, resized, quality changed etc.) but not a different photo of the same thing. If you and I are standing next to each other and take a picture of the same thing, they are subtly different just from the angle alone, let alone if we have different phones or different settings on the same phone that result in markedly different output. They will not be matched.

          This is designed for taking a bunch of KNOWN existing child abuse photos and matching them, even despite cropping or changes to the size/quality level, which is done all the time on the internet. It simply won't work for similar photos not based on the same original; the hashing depends on details of how the image compression was done, which end up very different between two originals of the same subject.
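
          As a very rough illustration of how a scheme like this can match re-encoded copies of the same original while missing a different photo of the same subject, here is a minimal toy sketch in Python (numpy only). It is a simple block-average perceptual hash over synthetic image arrays - nothing like Apple's actual NeuralHash, and the images, sizes and thresholds here are invented purely for illustration:

              import numpy as np

              def average_hash(img, hash_size=8):
                  # Toy perceptual hash: block-average down to hash_size x hash_size,
                  # then record which blocks are brighter than the overall mean.
                  h, w = img.shape
                  img = img[: h - h % hash_size, : w - w % hash_size]
                  bh, bw = img.shape[0] // hash_size, img.shape[1] // hash_size
                  blocks = img.reshape(hash_size, bh, hash_size, bw).mean(axis=(1, 3))
                  return (blocks > blocks.mean()).ravel()

              def hamming(a, b):
                  return int(np.count_nonzero(a != b))

              # Synthetic stand-ins for photos (smooth 256x256 greyscale patterns).
              x = np.linspace(0.0, 1.0, 256)
              xx, yy = np.meshgrid(x, x)
              original = 255.0 * (0.5 + 0.5 * np.sin(6 * xx + 4 * yy ** 2))
              rng = np.random.default_rng(0)
              rescaled = original[::2, ::2] + rng.normal(0.0, 2.0, (128, 128))   # same original, resized + noisy
              different = 255.0 * (0.5 + 0.5 * np.cos(3 * xx ** 2 + 7 * yy))     # a different "photo"

              print(hamming(average_hash(original), average_hash(rescaled)))     # small distance: same original
              print(hamming(average_hash(original), average_hash(different)))    # large distance: no match

          Real systems compare such hashes against a database using a Hamming-distance threshold; the hash function and numbers above are illustrative only.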

          1. tip pc Silver badge

            Re: This system can't scan for "Winnie the Pooh"

            If you and I are standing next to each other and take a picture of the same thing, they are subtly different just from the angle alone, let alone if we have different phones or different settings on the same phone that result in markedly different output. They will not be matched.

            The whole point of Apple's technology is that it detects those two photos as the same. The clues are in what they've said: the tech is designed to resist defeat by edits. No doubt, if there wasn't such a stink, we'd be hearing by now how their tech can use the example hashes to detect new cases.

            I'd rather there was absolutely zero CSAM; it's beyond deplorable. By the time a photo is taken it's too late. They need to be stopped before that point.

            1. DS999 Silver badge

              Re: This system can't scan for "Winnie the Pooh"

              No, it is not. Those are two different photos. It won't detect them as the same. Maybe read Apple's documentation I linked above; you don't understand how it works at all.

  5. revilo

    who audits the hashes?

    Do they really believe we have an IQ of 50? Of course one can audit the software which produces the hashes or compares them; I trust that they can program this correctly. But nobody can audit the smut which actually feeds the hashes. Or does anybody believe that the database of smut pictures is passed around to security researchers? Craig Federighi is an intelligent person who knows that he is misleading the press. The system is, by design, not auditable.

    The basic fact remains that every user is subjected to police software, treated like a criminal, gets a set of hashes of kiddy porn pictures loaded onto their machine, and is completely dependent on the goodwill of the folks feeding the offensive database (which is not Apple). In the future, and in some countries, this will certainly also include politically offensive documents. Apple is misleading us also because it would technically be no problem to compare even encrypted files on iCloud with an offensive database; nobody would object to such checks. That the police software has to run on every user's machine is completely new and unacceptable.

    1. DS999 Silver badge

      Re: who audits the hashes?

      So how is this different from the photo scanning that Google, Microsoft, Amazon, etc. clouds are already doing? You don't know what they are looking for, so they could already be doing all the terrible stuff you imagine Apple will be doing.

      Other than not using the cloud at all with any product, there is no way to avoid this if you believe it will be used for terrible ends. Check that: other than not using any sort of computing device at all, there is no way to avoid this, because given what you believe, you would also believe that Apple will check photos even if they aren't uploaded to the cloud, and that Android and Windows will do the same. I suppose you could use Linux, but you'd better compile it yourself from source - after checking the source, and checking the source of the compiler you used, and reading "On Trusting Trust" and realizing that even checking the source code isn't good enough if your level of paranoia is permanently set at 11.

      1. Anonymous Coward
        Anonymous Coward

        Re: who audits the hashes?

        @DS999

        Assumption alert: Your paranoia does you credit......but you are assuming that the material being scanned is in a widely accepted format. Bad guys probably use private encryption before anything enters any public channel!! So......good luck to the snoops, any of them......NSA, GCHQ, Apple, Google.....

      2. Anonymous Coward
        Anonymous Coward

        Re: who audits the hashes?

        Yes, it's the Apple customers' fault for being paranoid.

        If they're not happy with Apple running its AI pattern-matching software on their private photos, and then sending fuzzy matches up to its staff and contractors and teleworkers for review, then they should not be using Apple products.

        If they're concerned that Apple has removed their privacy rights with this suspicionless search - and apparently this is legal in the US - then they should imagine what else other US companies are doing behind their backs.

        All those US backdoors in Google's cloud, Amazon's cloud, Microsoft's cloud, etc.

        All that slurping of private data, for anyone in a three-letter agency to have a read through if they're bored, or if you said something to upset them.

        Today it's iPhones, but tomorrow it will be scanning iMacs' SSDs, and if you don't like it, don't use Google, Amazon, Microsoft or Apple kit.

        It's good that you try to drag all US cloud tech down with Apple, DS999.

      3. Kabukiwookie

        Re: who audits the hashes?

        Other than not using the cloud at all with any product

        BINGO.

    2. Kabukiwookie

      Re: who audits the hashes?

      Do they really believe we have an IQ of 50?

      Of course they don't think that everyone has an IQ of 50. They're only targeting their existing and future customer base.

      It is a shame that this probably means that, following Apple's glowing example, other manufacturers will be pressured by politicians to set up similar schemes.

  6. Lil Endian Silver badge
    Thumb Up

    Not Just iPhones

    "I also think it's interesting that they're offering research grants towards doing research for any mobile devices and not just iPhones."

    Well, naturally, Corellium wouldn't have a singular focus on Apple for any reason, would they?

    *cough*

    So, now it's the third party researcher that chooses the target. Corellium covered. GJ Corellium :)

    Edit: Being less cynical, it is a Good Thing (tm)

  7. Anonymous Coward
    Anonymous Coward

    Encryption? Apple AI? So what would this be?

    Is it some banned material? Is it even a photograph? Or is it just the output of a random number generator....designed to confuse?

    *

    To get to the point.......the bad guys can also do this sort of private processing BEFORE material enters a public channel! Maybe Apple AI can figure it out!!

    *

    b4fRrc9IIRWWKz1i4v0AKkvuF5cZQbEmRDRuQb2uKdL9NzPVoR+yRzpeVhDkCgV5d5uUeLR5Apoq

    KjYxv1oQQFMww0EU8ocZSqRz5679PAMoZrYgrJqtxRfjd6fPyvfEZy/GQ3QJUsSYt4YsOb0RJT+Y

    j7aH9JzGynPqc1gbZApWHBYtHK8S2Jzo/RXT4/h804YE6tGtPImpIdCYB2EMSGdxdOdhn7wxYk9Z

    ojmUwVxMe/KrYOiP6pa8PJqzfYXlwJc9mh4sPxYoZrYgrJqtxXZbctND9BJ9h0yVI0RmkQyYt4Ys

    Ob0RJT+Yj7aH9JzG23cZ1ToAVNhFZm5r5cw4Ibut2ofEb/0y/oEMELwOqb5+roWbGvlE6LemzRhR

    whsg26Mql2LiNxiV7HAdM9sMWAvl49PioSINe49vENgB+XrUsy8BqzzOd+JrM2K8OX0E1jhPFOfI

    HBBL3gpmfE5mxnCspVAIeZbcksqE/V9CZM55eg+uhCx3iEhQnQjoKXspwbsvS7dfc2lNQEC9Q/p8

    XYE848NWOFL3S1LS7bErr51TjoSmQ+tHv05c/gLQM04FcC1RRC0hBuAoI8uGroI8kQjSS1WtfvHl

    ed2pSJUeHBEkoEcdAXldc+zylIqePZtZQvC/mIYsB9VcyXDRpIym0yV1y/KqttwsY1wrBrqipDBH

    ArObIOE1XixU9XEaglYvgOIqR/UP+ATuYcHN6Oal4Bifd2YJW4ZV8fPYwmx0JHbN6nnEUfVsdnHV

    012ezfWzlevFC2yHTuN8oAwFr4hrztVB8WGu3suCRiaxjzj0+AyPKNHTdxscp3htQGfy0zauhHsY

    EVGVTot1AZyH+0VFPVg0NAmgkA5yHwLb46p83Y5e20WqGtcAGAQIzo25Dsfo6Wg0JKbBOCIxN7tC

    blfLJk2uUuP4XUnuGxYltSImA6xshodIjQGfU8XPhSSzyawpSGknvbS4wHKVP9x/tfqo+BSRaUOP

    RAfAG6LN/dk4U6Vd39VJaGmy0weOBXtcz5aObhGmd9BfDAojRMdVr9iiMh/Kcr0C+d9nvPjDisZW

    EuKR764BpcA+am2xG0vtotx+rw0O2b4jE1k8OJiQ3QM+bhcvcAuCkYrW3YuJweve3+X+4DNzadQp

    8ChunAMPgb7t4QtjWo/xAlxuMSA2DKO4ZLHz1MwRayv0HhdIQswSXiLIa10wgHHpR1YrR6bf85SN

    DhiZoGeud/GoDhu46QgkGrnXntETNwrL1H2UriPT9q+a1NIHunrux+XT90VUo7te4VTX9eO1rqVS

    Do06UUblBkHqT8kvRT9f72+zQIhU2Ez6kyDELfNXEUUx2Rj+fp8Epc3JL6Fg/9tuQTkP1Jd2/Fgi

    Q6ZhJYn4sZ6YqEDj1XSpLCTDeoI33w9uLSztIgzjNqjOQb7vX9rVc+S7ZQREFiCgYpn7d0mRiwUf

    61uIKnAIVOzAAmFB9CU+BNV9XxsMkSnlg3eJp9YNBY/o6+vPwgpdNljJU6Q83QTMOBrCx+8MKzSQ

    QfROUPkJsYiJqPprHXYfIomvd0swRPh5y+UQDavjkgOxFxGx1FwgFsQfV+loV4WwpKo5JgS+awB5

    azYg4LUKh2dEJxcG4xUOTwIiQJzjoXYhkhB+636AayHBM1EIZ5oybpvsiDJgOLkThBwCKls2zUng

    ZqAfAuogRJVCUG9fLBgOGKVz5gkC3gtT/L7WOG+/HWyn8K6AT6sGbkiS6bh/pR9bs2I8ZtC+pw/1

    jSSKZYysMzruc7iCMBCxlF0AJtfS1Po/Mr7B96liCdSEp1RO0GvXIlX3lmCM00GVsm0IPp67YRbb

    xKlT5RtP2uAbes/CJxqbGMdd7oDCcQwsZFronspMJLYSs4QL0tAi1BggEGMmMSGm51jRaD5/2aBR

    6d0/UEi4nbYRFMTvQ6C4ls8gazRjUKwYRdqNuRr6czJb+z4SB+scGqjY2THTPmkM4b8vJ73cIzOT

    hyk9ja7f0bNP0RyqwpkE39IbfFWJrh5cvEcfj4rh7ci3UPG6sNqbt34LFxfHR1zkztqgMjrgXKWV

    InOBZc6VTNm0OAz0CRDnoN/h3EoB5UYSbq94BYxJFDv3dXPt94qUf8B2Gz1+jbVhMjd6HCH3W/19

    8HzIBkSNRPfW9UFB1lLlBv0SjOL9j2Q7uRFp06cBuzgapPDq9pZnPKD3pgXgPuikM1rNagQaEwcZ

    y+ldKFuXiCTqM9rXRHCqXLmnuTdss+mrd92cqH/hYyKxfeb/Ocb5CGEq/FlsD5ESt81P3ZuCxpGn

    kK5S8q56IFpz4eVDW2FZrY03s2M6pewzj68K0RAWoUHLzaS9DDV1Y2jo1DsxTPMeMmxcjWhrEocO

    kEZqySbEFZ/FHE+vTHLcdZp6TLDbJq6g6V6qq996tV4h5igZhwAwOf8ijmxL89RZCoP3mI37HKv9

    OnRvgkU5SsgN0SV13Du0+G0IPp67YRbbLPC3PmEKPIvSnG1g66AgdTuO83hJqt38JCzkNLwWeE+u

    dJUUq4shFxs48J0SfOCyV6Z8zgOOa/GD2sIwslt0F72gnfQ0ursM8E82Twd0aakIlqXMeeNpVwpD

    bsiJAYvopDusl47CyMlN8BXvBDHl/oOy8QhbDtCdtFnVPtacVXWFfbegqc4VcuCJIlgt9afCKu4j

    r4qi/w5dDCkTDtafAAD7EOYbW+qy0eTP1Mm9IZNbr4Gqcpnl3aGy5MHRrjXemPjW8GU6RHQHcNCd

    8HZ/w3t2hpIFyoQ0VtvPjrgD3pJ0tzP9O4dtS62ehGNm43SyNynIHQ72ZPvVJnQVV1FzNZy+dshS

    /3XltmFnZcdDnc1GP+4ZtRcWCncKGBJMvtUrNcBNzwEL3wv2vxL89v7mQgoVZAPc16cuZbkvzWwX

    2gA8GZG2tU767eT/MXtgwY34bLCAA6QKfhzuYXTCpKyKgs82Kw0EmFW52vEZKKlWpUQybQFJXctB

    1mEo/2R2jM19qvc2uSn642EFm9QcbLVKTCgeEvSa2rE380/zfoHLOWdE+DQAUKUPlE42OcORpQ5U

    +sUj6hEB6CMlS1UCwZASQ6GXuAppn9JCJRNFcChqBvorVjPtAM7Tf9v2O6Wwd8StWvSTr23H/OQs

    l8dTXEmi8EfkOiDtMOmRowiq538qOnL3c8pPhLxwepIbDzpx5omACqP5CvotKn5pHBrIIkPlUjSm

    bdPwExGBNIfrHYfcevZe0i8sl/Yh7csbkvQdFjXkJCSktU69+762OYECRFukbTMOUE/pdzsNzNnu

    SF147lTndw4pTnw64aKT2wxE80voDxqJEs0aFWsUwhAVtZazG/rDT2Rz7vhoeG1AZ/LTNq60Ts+R

    dcbQmjBUiVjJgEwgk/fHWjxEFNw3HRTx9fia2GUMt2vQBCDCI44xwplzog/Ov8SwKLxOfYWZQ3bf

    /mjDTfEFJmYCgr5d3YRMz9A7RoFXXRh6PQlCxRhiQCZeGCIjfy9oXAFlsHlT0iRk/ClRClmNYK7c

    jfNfyzEuup/n5CgW5AxFyGA9P64CRlupxsBTVAWNwqZawVwDf4VXP3ExpoP9omQ81Ppg945dTwl7

    xfaVvt6S31VEnYZ0dLMPmOqxN7rI3AVl+S2izBV8hYbJbmooq2VMHsiuZ2aubh/lsmnQ1p1PZI4f

    stCKxGjkKAwygWHr67JrQOUQ9V03HcGx4lGZw7j6fHrULK9vHbCDEogHK4/SnO9g1ziks26H+WfD

    qAeAYdwjwBYG3XEG0DUuOuTXSiifItXgO/s5u5d1zXhtQGfy0zauFGC2ON3WXULuT3tUXQlmw+DQ

    BRJkDX5nOXptQMLGsmu1cIdrk7w+MLi7Zc85b506389OnxImKhk1yl7gwmT3/ICN8d+CFyqfk5zB

    Q88DEoDz+nst0KjxEtyClSFddmPq/0siZByzY5AKBFLCAk7EhFrPUaFNpLRO/cTkDrsZX2P2SQ5S

    CUMrz1WT48r4JSsz2PaoD+k8pfceuHSV2R0cQLoRzwitcY33wh0A9gK62gNBMs6CbWDbRRcsnpDQ

    wMaEYQdpRb6kQ0yzqAj9DI5hRD7/s1SMM1DrDI4ICKDlGC9CVgFOwZaqfA+6GZmeb3HowT8npcF+

    Tni8g2986JJjUbcKnOmVG+CBlb6j9ww4JarLe0SAklL21WiZOigwXIdpMs6Hr8xCi41ZHbidybTA

    tZMFX5tHgLmZDj+KDD6t91918Qqefh5Hgx3ZiSlAEBVXvvtao68ZuXqkqjgKtnPgCCwHuYJlNkRL

    k64Kxck0jWHtKbUuMk2LqU1LH6q9hNaIh1p4Y6XMj3k6HgBMy3K2zD7goL8Rf99TpJonlJBbPDIf

    2omJWyYHmp9p/xGYporaDeIJ+oun6NN4X3LRFpqNr3gdjAMocNHNuJOi7hoc0Jlszfc7wz9+Jlk5

    r6vTWSV58i6gctqsB1MsQ06Ggh8qImWg3y96w8LUfJjGDm+9JisBQ5WcPvzkXDBE65DYtsDaTwsq

    k7PaZl29yfNZZGQlha88nwOiWD5/AT2xv1kUgg4lKIVOMTFDSes0attI6Z/Pgk4aysb3fhWYt0Hp

    HvY48jZGJEmIVGL99iHhZrV6+MTT9ZwuQSQqkv5eOSplG5FW9i2xqGDsMPKyMieteNSJzkl4Vxee

    yXGhDj0KmmkeAEPuuLYkmtttTlQTdUmV9Vt/chu7rq+Tz/knFEfKuHC62JN4bwvXxMxLTdqrAGoM

    inL5y57WbUT2EJs06TaQaeFZnpfmrX2h1ZPS4zu6T/Dbff2QNHHbjH9bgcxbj0J4g327f+a9urzi

    jB8qnuphEYluKlykA0/L9YbZD8KIAx7NLTjLxvHpkBhuNuvcbdPafTMEaKQd57wfffDvnybBYh1h

    5RoQoborhc83B8mnSofLFTIULUUma0yHzB2bxspomxQ6306EFBy4lP3r+NrsdTtm4XC8v/PnL4SA

    4ipH9Q/4BDp0KH61BKSYO5C+ovvKzCvI7vIBlCFPsW2d1V6BMeD128Hu1/p8ThPqWykXhbwds2bz

    p81AxfDQv0P4h90jfRoqYcIxCuQBIqCWIafq4LBCxaAUFTrzHkWMu1SQKeXMOxftfqLECuvfaDen

    NHFrV4afMR2q5t/ipwMy9a0KLFMP0yiCqfQH1voGfHCh0IN8mpk6G+CGavbvoiNerfs1+m8ckOaP

    RZNF4Ck8fD5J5wGlpr/N1+SeECGgLNtzZ1BHn33WazW4MEfeyP/hXJa2Oz19rQTEqgwoa+8NKYs1

    P1L93Ev+zG2HkZOwsjPAIccXteZUhPO6kNpRKFrvRNfRyli4Njdh9aqPxyfnYFNETao20poouYgR

    TTDCVamcest9O67lJgXgIHjyj6r3M8NSlpmwAXGt9ftqoyGWTqgrams8a4EP6qo2QhO00ww03HZT

    xyFYbyvRKul/VLoR7O/RTfVZC3La9XTMIhm45o7TyiWwjcf10o6NPC69jA1lU5Wr6Z0DH7gxh72S

    +gm0fQ/bACITTHz/H8BRCzRqAJLLvpwD82oKTdjo3wG2yWA91g0rIKrWrRwGD0/3ym3liAFo9Oni

    lQOdQrMnz16nuUoz0IFM+p8fC18swHW17cSga4Z28YrCoRH8EKZ4qPH3hJnhuwtAuL2gcxO1oc3P

    QHXhwTevdkp/puaAWjSmsRXD6AkPO5kOxagqKO/WtymeM4kdEvMsyF4QVPIcJpPH745vAVfc3TCR

    5Gqp92kFyPFjxo09pZFAeJt3X5RycQxCdtasZS6JPtpFhEjLJcNIN3LRFzjgBrbymWa4Waa1XYmr

    ym7hhTBV4po3JdmwDMP5zhRR7f7cZZcIjQ7ezzTNs2LZLlUnS/i1SkEKtF18bnuK0dPSJHCnufWi

    VY2TuT6UombweZzJEy5hG7+OIn1vH9ONSFdNpnUWDRL8tbfLWoLsrUPlcaDaKm6nPW8G01/OMTYH

    ETBgGiWu59oXi4OYJJYa92w/CUH4iy0fUyWTNPy5FiPKoA0j982NsFLGfSY6bl4d94fNcBUFM2Z0

    qfLvHEqx93Atucpv4lpfvxoYHsv5qMB0mWnS6VPm4a+n0M5UpCiYeQ/8fEl2FoIVMOCQ86OEBF+1

    ufduKEkyrgawWq/t3h0IMRqxNRzgbSkUVQyh1XyeWp4+m+V+0I5OdpGHsvCrW4ysKxGXUYK5WiqM

    m8bc35fMyqr3Nrkp+uNhFF9g7UcZdMGlV6niSbRJPACYbt3WoTI2QbaqYFcXrpYxMIp8N2yAAlK8

    PXe4HRrYwImRdNepkOUxnZJvtvMCsK+iQRZ1aBaAP1hO5yvJMFlbF/zzbNbKvzJgH+Ouyam0Kn5W

    sYgIa5MJ44rE6gghJxlJjHxgY0OGfJUj/s8EGCxudyH92jLJXiG4JZ1T1Pf0c85IHFkWLrG05pYD

    Dp2dSl+Eg2vsWf4fXBL01gxpqQApO5WUKq5fjxfyIKrSIv5nUZs6pSbVmnvKek52MZiQ1MmmFbi2

    sT0O7192VPslBVZAcpST3h9RrSeqwbsmo8TyAHSsu/G3FNWI0Mr8VhSsUS0e/WM4+1Nppikx0xpN

    c5YX4t6OAO8jVwq0X9kklkpvrrwdwSFhAIJUmlp1/psrcv1R43qSBGlHCXF1aYyA47IAUEggBHBf

    JfbWRkVM3OO8wSMBZkNd3qOWdVtHkPelI/zF2YC+0x6r1+oi7s+2j7CvOa06MvU/Bc43gYNsXbXx

    ZNQ2BvBHkpaTuwBBRIzDkBESdZiBmNVXXM6AKKPv02Wm8FpZK1KhR/zFDdoAmf1uBFKxLUCFekID

    13HFCNnFAjcPxbwE0d6oBBaBGVFXy9P5ltbG+pXTwZzZqyU0V7YgFgpMsgzwMIa/PSA/PgT6haIl

    3tmyT+J/knD/RgVqs8K+XKzqlcmAoR5XpE+GBjTgjLT7jnCbO1tGvrueoVWEXxMFhYOikB+5KUsT

    E5IQHzvqLdVaA1kcM9lLZL1FgnZyYjuBtnhIq19R8rnMsffwRK6bxPyGzaOR8ldPtVIJLIs0UHtA

    OFUMHE88mwXbRTdjuY4NDyTZxaE8dyH2xBlv7VPNU6QGCoJjuCZxFaA7yJyZL11peMbLFd1yqLfX

    /MctKCBnGTQAd5B+NRtQf/nSFgND9bjroPGhNE9meS59qfNm8ky2/l8/gaZWWTMauVR14PRP3BQz

    cNHrog0QUNzbPwuP4fV9Ft03cHyYRwLEguYqRgeMUllDrj6b5X7Qjk52GGKFUOYsLwfupk8zzr3a

    i275PcXAL124NvnLkHXOd/3QUACBp+q0fPREy2t0qX6/YMPJkie6UYi81kp2aNAnXomYvJqRPCZ5

    2uRKy0LQaGhIWTj8KWMUAIyZQwtE9csjy37VLzoM1Xehhl89Twv1gw9tL8YDU4vSxPnIJ+XGHuoA

    OZStKRRn9ldPENUR4bLHnjhciSVerJvcdKXLafJoIH4dneMQM6BXXH18QPZNt0YEQou6d6WYs+qu

    ahloW3WOq59jPZzBBEs0UHtAOFUMHLfL1VfyxkQi17c3rnk5mwXvb7NAiFTYTIywRc2KdQpLbvth

    U46Ko7vsJU3+yMOeHDGEURtcAGa/FEKl7nf9WwGb4E8aWFxOvk4C7awFUMnmnuRY8LkfoIZpBLpD

    kgXdFSAT46YCt4CqGpNhXD/ts2gUZYHLwZpIJJRw2t+iCH3nidb7LKJva0DCqYK9UstqfvD8YRKK

    cRlmtQDoHVDVZX1hc3xUhXQ6+4hpRGk/4wU36Jhn8vW7mTU=

    *

    Let us know when (and how) to unpick it!

    1. find users who cut cat tail

      Re: Encryption? Apple AI? So what would this be?

      You wanted to know? All right then.

      You have two options. Give us the decryption key. Or spend a few years in prison. You don't have the key or say it doesn't exist? Then your options are limited to the second one.

      1. Alumoi Silver badge

        Re: Encryption? Apple AI? So what would this be?

        Obligatory xkcd: https://xkcd.com/538/

      2. Anonymous Coward
        Anonymous Coward

        Re: Encryption? Apple AI? So what would this be?

        @find_users_who_cut_cat_tail

        Multiple Assumption Alert:

        1. Is it even encryption? Maybe (as stated) it's just a random stream in base64! How do you know?

        2. Since when is private encryption deemed to be illegal?

        3. If it is encrypted, how do you know that the message is not perfectly legal? How do you know it isn't a recipe for Black Forest Gateau?

        4. Even if snoops want to decrypt....the key isn't enough!! Is it Blowfish? Is it IDEA? Is it PGP? What's the algorithm? Maybe that is private too!!

        Still....a good try at scaring the commentards on El Reg!!!
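
        On point 1, that really is undecidable from the outside. As a minimal sketch (Python standard library only), here is how a blob much like the one above can be produced from nothing but random bytes - and good ciphertext is deliberately indistinguishable from exactly this:

            import base64, os, textwrap

            blob = base64.b64encode(os.urandom(1024)).decode()
            print("\n".join(textwrap.wrap(blob, 76)))
            # Without a key AND a named algorithm there is no way to tell whether
            # this is encrypted data or pure noise.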

    2. Anonymous Coward
      Anonymous Coward

      Re: Encryption? Apple AI? So what would this be?

      > Is it some banned material? Is it even a photograph? Or is it just the output of a random number generator....designed to confuse?

      Downvoted for unnecessarily shuffling electrons. One line was enough to make your point.

  8. Eclectic Man Silver badge

    Kids

    When a work colleague announced that his wife was having twins, and he would bring in photos, I made some rules:

    1. No nudity

    2. No shit

    3. No vomit

    4. No crying

    I don't know how Apple's software will distinguish between children/babies in images and, for example, teddy bears of a similar size (you can get them legally in bondage harnesses, though I suggest you don't search for such images from a work computer). I expect that there may be a period of over-reporting of images, and I wonder whether the actual Apple employee who identifies a referred image as being child abuse will be identified to the legal authorities, or whether it will just be the 'Apple child protection team'. It will also be interesting to know how the various countries' law enforcement organisations will engage.

    New parents, and indeed old parents, often like boasting of their children's progress / humorous accidents (see 'You've Been Framed' for any number of childhood accidents caught on video) by sending images of them proudly holding up the cup awarded for second place, or covered in mud after falling in a puddle. Designing the algorithms to detect child abuse images rather than normal childhood activities, and verifying them, will be very difficult. After all, being wrongly accused of child abuse is going to be very distressing.

    1. This post has been deleted by its author

    2. Anonymous Coward
      Anonymous Coward

      Re: Kids

      Apple’s software does not need to distinguish between photos you have taken and child abuse material because that is not how the technology works.

      A US child protection NGO called the National Center for Missing and Exploited Children (NCMEC) maintain a database of known indecent images. These are images which law enforcement have found on defendants' devices, assessed as being illegal and submitted to NCMEC. NCMEC use technology from companies such as HubStream and Project VIC to maintain this database. This includes a 'voting state' where illegal material has to receive a number of confirmations before the item makes it into the production database.

      Typically MD5 and SHA1 were used to hash the images, and that hash set (not the images) was given to law enforcement agencies, digital forensic experts and internet companies so they could block content.

      NCMEC also run CyberTips which allows internet companies to tip off law enforcement about illegal activity on their platforms.

      The problem is that compression, resizing or minor edits of images mean they will not generate the same MD5. Microsoft developed PhotoDNA as a way to match images with minor changes. They licensed it for child protection work. The problem is it’s not very good.
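
      To see why exact cryptographic hashes are so brittle for this, here is a tiny illustrative Python snippet (standard library only, with a made-up byte string standing in for an image file). Change a single bit - far less than any re-save or recompression changes - and the digest is completely different, so an exact hash-set lookup misses it:

          import hashlib

          image = bytes(range(256)) * 64          # stand-in for an image file's bytes
          tweaked = bytearray(image)
          tweaked[100] ^= 0x01                    # flip one bit

          print(hashlib.sha256(image).hexdigest())
          print(hashlib.sha256(bytes(tweaked)).hexdigest())
          # The two digests are unrelated, so a database of exact hashes only catches
          # byte-for-byte identical files - hence perceptual schemes like PhotoDNA.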

      NeuralHash is Apple's version of PhotoDNA. It is a mechanism to detect known CSAM where minor changes/recompression have occurred. It is presumably better than PhotoDNA (or may be about the same, but it means Apple doesn't have to cough to using an MS technology).

      NeuralHash is not about looking at a new image, e.g. a teddy bear in BDSM gear, and asking 'is this CSAM?'.

      Apple will not have access to the CSAM. They will have licensed the hash generator to NCMEC who then run it over their data, giving the hashes (which cannot be turned back into the images on their own) to Apple.

      So the database is compiled by an NGO with input from multiple law enforcement agencies.

      There are no guarantees that 'bad' or erroneous data won't end up in the hash database, but this is hardly Apple's fault. They can't hope to generate their own data without serious effort (the database represents over a decade of law enforcement work) and legal issues (Apple cannot curate its own CSAM collection). A mitigation against bad data is the fact that there have to be multiple hits before you're flagged for human review.

      I wish the so-called journalists of technical outlets such as this one would do some research and report accurately rather than chasing clickbait headlines. It's fine to point out the many flaws of this situation, but they're not giving their technical readers the full picture.

    3. Kabukiwookie

      Re: Kids

      And what sort of person would be attracted to a job that requires one to 'verify' possible pictures of this type?

    4. Anonymous Coward
      Anonymous Coward

      Re: Kids

      >being wrongly accused of child abuse is going to be very distressing

      And I wonder who may be sued for a wrongful accusation. Political hay may be made out of this sort of thing: e.g., one might not want to run for public office even if exonerated.

  9. Anonymous Coward
    Anonymous Coward

    What about the training data?

    So the code can be analysed - what about all of the training sets that have been thrown at the system? Neural net code can be analysed and verified, but the secret sauce is how it has been trained.

    And here is the big problem - child pornography is a strict liability offence meaning that researchers would require special permission from the government to even obtain copies to repeat Apple's experiments.

    1. Anonymous Coward
      Anonymous Coward

      Re: What about the training data?

      It’s not really a training set in the normal ML sense. They have generated their ‘neural hashes’ for every image in the database rather than built a model. Apple’s algorithm is really only good for detecting almost identical copies of known abuse images. The data set is compiled by law enforcement. If you don’t trust that data, that’s hardly Apple’s fault. It’s the same data used by every content provider doing similar scanning.
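
      As a rough sketch of the flow being described - purely illustrative Python, with invented hash strings and an invented threshold; Apple's published summary describes the real matching as happening under cryptographic protections (private set intersection), which this does not attempt to reproduce - hashes of uploads are checked against the known-image hash set, and nothing is surfaced for human review until an account crosses a match threshold:

          from collections import defaultdict

          KNOWN_HASHES = {"hash_aaa", "hash_bbb", "hash_ccc"}   # hypothetical known-image hashes
          MATCH_THRESHOLD = 3                                   # hypothetical review threshold

          matches = defaultdict(int)

          def process_upload(account, image_hash):
              # Count matches per account; only reaching the threshold triggers human review.
              if image_hash in KNOWN_HASHES:
                  matches[account] += 1
                  if matches[account] == MATCH_THRESHOLD:
                      print(f"{account}: match threshold reached - queue for human review")

          process_upload("account_42", "hash_bbb")   # counted, still below threshold
          process_upload("account_42", "hash_xyz")   # not in the database: ignored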

  10. martyn.hare

    Ignore the technical aspects for a moment…

    There's a giant elephant in the room nobody wants to talk about: the fact that these checks could actually be harming children more than not having them would.

    The theory is that infringing material gets added to a database, allowing other abusers to be rapidly caught through illicit possession of child abuse images. As in, if you catch one child molester, you should be able to bag a whole bunch of them from the trading of images. In fact, law enforcement have themselves been known to use honeypots to aid in trapping and catching these people. This doesn't sound like a bad idea at all on the surface; in fact, it's practically a law enforcement wet dream, where investigators can share simulated abuse images (cruelty-free, no children harmed in the making of them) to try to catch sickos before they can do real harm to real children.

    There's just one problem with this approach when automated at scale, and it's not a small one. It creates a 'need' for new images to be created and shared privately, as older images become riskier to possess: the longer an illegal image is in the wild, the more likely it is to have been added to a database. Add to this the fact that images depicting abuse which don't involve real, living humans are also added to these databases, and you have a situation where sickos have an incentive to sexually abuse more children, since it could be perceived as a safer choice (for the abuser) depending upon the situation.

    It's very easy to say how many people got caught in possession of CSAM, but it's hard to say how many additional children will be molested as a result of interventions like these. Due to a lack of transparency with the public (good luck getting FOIA answers) there's no way to assess whether this measure even does protect children as claimed. In theory it should; in practice, I very much doubt it.
