Apple is about to start scanning iPhone users' devices for banned content, professor warns

Apple is about to announce a new technology for scanning individual users' iPhones for banned content. While it will be billed as a tool for detecting child abuse imagery, details now entering the public domain suggest its potential for misuse is vast. The neural network-based tool will scan individual users' iDevices for child …

  1. Blackjack Silver badge

    If this doesn't make people stop using iPhones... then what?

    1. gandalfcn Silver badge

      Given that all the others follow what Apple does, and always have done, that is rather a silly thing to say.

      1. Blackjack Silver badge

        Apple: We care about your privacy; that's why we made Facebook, and all those people whose revenue depends on tracking you, angry.

        Also

        Apple: We will scan your photos, but it's only to look for banned content, like images of people abusing children. Do not worry, there is absolutely no way that your photos which don't contain any banned content will be leaked and used against you.

        1. Zippy´s Sausage Factory
          Flame

          And, of course, don't worry. There's no way we'll give in to the Chinese government and let them control the list of banned images. Or the Saudi one. And even if we did, we won't just immediately hand over your name, phone number and current GPS location. At least, not until the new laws they're already sharpening get passed...

          1. David 132 Silver badge
            Big Brother

            Given that the CCP have passed laws that basically mean criticizing them, anywhere in the world, is an offence, better never ever have photos on your phone of Tiananmen Square Tank Guy, or the Tibetan national flag, or Uighur cultural imagery, or the flag of the independent nation of Taiwan.

            Or hope that those at Apple responsible for this get beaten with the Cluestick.

            1. John Brown (no body) Silver badge

              "Or hope that those at Apple responsible for this get beaten with the Cluestick."

              I wonder if Apple Legal are aware that Apple have the ability to scan data on a user's phone? I'm sure the various law enforcement agencies who spent large sums of money having suspects' phones "cracked", because Apple said it was not possible for them to do it, will all be mighty pleased to hear the news.

        2. Jimmy2Cows Silver badge

          And of course, we'll never get it wrong, and send your details to law enforcement without checking, since we're not allowed to check as that could risk viewing illegal images.

          But don't worry, the police will definitely verify what's found really is illegal before smashing your door in at 4am and terrifying your children. Because somebody needs to think of the children.

          1. W.S.Gosset Silver badge
            Alert

            Yeah, that's what I see as the BIG risk in the short to medium term:

            Your life suddenly gets tipped upside down and turned into a living hell, because an AI plus its corpus scores a false positive on some phone photos or a funny GIF someone sent you.

            1. The Man Who Fell To Earth Silver badge
              FAIL

              What could possibly go wrong?

              All you need to know about AI:

              https://www.newscientist.com/article/0-tom-gaulds-attempts-to-create-a-sarcastic-ai-are-really-genius/

    2. Anonymous Coward
      Anonymous Coward

      Actually it's a cunning Apple marketing ploy ...

      ... If you stop using your Apple phone now then you'll be put on a Paedo Terrorist Watch list.

      ... If you're using a competitors phone that doesn't implement this technology then you'll be put on a Paedo Terrorist Watch list.

      ... so you better sign up now with Apple to show that you're 'Clean'.

      ... The Watchlist may then later be extended to include anyone who expresses any opinion that AI could possibly construe as opposition to The Party in power.

      Nothing to Heil nothing to Fear.

      Now just waiting for Priti Patel to jump onto this bandwagon.

      1. Anonymous Coward
        Anonymous Coward

        Re: Actually it's a cunning Apple marketing ploy ...

        and Nicola Sturgeon, who will demand all "adult material" is included, what with the SNP policy of banning whatever THEY deem "pornography" (strong public opposition, and the attendant loss of votes, being the only barrier to them at the moment, though they are committed to "altering public attitudes to make such material socially unacceptable").

        For those who doubt this, read up on Equally Safe: a policy with laudable aims on the surface (making women and girls as safe as men in day-to-day life; note, however, no mention of "free", and there's the rub) but with some seriously Orwellian concepts tucked away, such as outlawing "pornography" (as women who participate "harm other women indirectly") and "exploitative sexual practices" (again, whatever THEY deem exploitative).

        No wonder they are so touchy about comparisons to dystopian regimes and a certain German government... a bit too close to home for comfort, likely...

        1. Bbuckley

          Re: Actually it's a cunning Apple marketing ploy ...

          And once they march through the 'obvious criminal' material, next will be identifying those who do not agree with 'diversity' (as defined by them), and so on and so forth until we really do get to the Orwellian future this inevitably leads us to. Or: we (the people who object) take up arms, destroy them and free ourselves. World War III, anyone?

          1. katrinab Silver badge
            Unhappy

            Re: Actually it's a cunning Apple marketing ploy ...

            Or in England, not having sufficient numbers of Union Flags in the background of every photo ...

            1. Fruit and Nutcase Silver badge
              WTF?

              Re: Actually it's a cunning Apple marketing ploy ...

              "UK government spends more than £163,000 on union flags in two years

              Purchases have increased across departments, revealing embrace of the flag under Boris Johnson"

              https://www.theguardian.com/politics/2021/aug/06/uk-government-spends-union-flags-two-years-boris-johnson

              "Let them eat cake[wave Union Flags]" says World King Boris

              ps

              @katrinab - I see you've been downvoted by the resident Boris fan club

              1. Fruit and Nutcase Silver badge

                Re: Actually it's a cunning Apple marketing ploy ...

                Yep, the ever vigilant BFC springs into action!

            2. Claverhouse Silver badge

              Re: Actually it's a cunning Apple marketing ploy ...

              Or in England, not having sufficient numbers of Union Flags in the background of every photo ...


              Unlike, say, America, very few English households display or even have a Union Flag.


              I have one somewhere, maybe rolled up in a box in storage, but it's a small one, probably left over from a Boy Scout village hall, or from a small boat in the 1930s to 50s. Not that I would fly it anyway, as a Jacobite.


              The American version of the Boy Scouts are actually charged with respectfully disposing of people's Old Glory, under the Flag Code's rules against desecration. Just one of the many reasons we consider Americans to be nuts.

              1. Charles 9 Silver badge

                Re: Actually it's a cunning Apple marketing ploy ...

                One group that will tend to possess a Stars and Stripes will usually be veterans. They are often gifted one upon retirement or passing. As I recall, it's usually one that had been flown over the U.S. Capitol for a day.

        2. Anonymous Coward
          Anonymous Coward

          Re: Actually it's a cunning Apple marketing ploy ...

          As safe as men?

          We need to implement laws that shave an average 5 years off a woman's life and increase their risk of suicide and heart attack then.

          We also need to ensure that women have more industrial accidents too.

          Only then will women be as safe as men.

        3. LDS Silver badge

          with some seriously Orwellian concepts tucked away.......outlawing "pornography"

          Never read "1984", eh? It's exactly with pornography (among other things, like sport) that the Party ensures the Proles are not an issue, without even the need to deploy telescreens in their houses.

        4. Trigonoceps occipitalis Silver badge

          Re: Actually it's a cunning Apple marketing ploy ...

          Of course, it is illegal in the UK to have a photograph of a perfectly legal activity: two consenting 16-year-olds going at it like rodents faced with extinction.

          1. Fruit and Nutcase Silver badge

            Re: Actually it's a cunning Apple marketing ploy ...

            The Rats! Mr & Mrs Rabbit want to know how rodents have taken their #1 spot in the nookie leaderboard

      2. David 132 Silver badge
        Happy

        Re: Actually it's a cunning Apple marketing ploy ...

        Apple Watch: monitors your exercise level and number of steps taken.

        Apple iPhone: Is a Paedometer.

        1. Ken Moorhouse Silver badge

          Re: Apple iPhone: Is a Paedometer.

          They've shot themselves in the foot. People are going to start referring to the "iPaed" (unless they are concerned about people thinking they are incontinent).

      3. Version 1.0 Silver badge

        Re: Actually it's a cunning Apple marketing ploy ...

        Maybe the marketing folks just think that this will stop the priests using Android phones, "You can trust me, because I use an iPhone"

    3. 45RPM Silver badge

      As I understand it, this technology has been developed because Apple has built its service in such a way that it can’t scan in the cloud so it has to scan on device. Everyone else, Google included, scans in the Cloud. One way or another, you’ll get scanned - it’s just a question of where.

      In the case of Apple’s system, violating images can be decrypted by law enforcement - not all images.

      And yes, I can see how this might be expanded to include other ‘crimes’ (inverted commas because the definition of what is criminal depends on the state - in one country, for example, it might be criminal to blaspheme but not to stone someone and vice versa). I hope that Apple will be restrictive about what it scans for - terrorism, paedophilia and that’s about it, but pragmatically speaking, with governments clamouring for a back door into our devices this seems like a sensible middle ground. Let them see what is pertinent to the case, and nothing else.

      1. sabroni Silver badge
        Facepalm

        re: this seems like a sensible middle ground.

        A third party scanning my personal device is a sensible middle ground?

        No.

        It's not.

        1. 45RPM Silver badge

          Re: re: this seems like a sensible middle ground.

          A third party isn’t scanning your device though. The AI on the device is scanning your device. It’s not leaving your device at all unless certain criteria are met.

          The way government legislation is going worldwide a tech company has several choices in order to ensure compliance…

          1) build in a back door which allows law enforcement agencies (and anyone else who gets the keys on the dark net) to scrobble whatever they like from your device.

          2) store everything on the cloud and scan it there (this is the most common choice that big tech makes, especially since they’re already scanning for the purposes of advertising)

          3) scan on device and only provide keys to law enforcement for data which is illegal (this is the route that Apple has taken)

          4) give up, and only make dumb devices with minimal functionality.

          Honestly, it’s Hobson’s choice. I don’t like any option, but option 3 seems to be the least worst option. My principal concern with it, as I said previously, is what constitutes ‘illegality’ - but that concern applies to all the other options except 4.

          Genuine question. Leaving aside any issues of platform partisanship, if you were a big tech making the revolutionary new SabroniPhone, and assuming that you didn’t want to get legislated out of business, which option would you choose - and why?

          1. Steve Graham

            Re: re: this seems like a sensible middle ground.

            Your belief in the ability of Artificial Stupidity to recognize unequivocally illegal content is contradicted by real-world experience.

            1. katrinab Silver badge

              Re: re: this seems like a sensible middle ground.

              Picking up on this, anecdotal evidence seems to suggest that children from Thailand and other countries in the region are more likely to be victims of this particular type of child abuse. Children elsewhere do get abused, but it appears they are less likely to be photographed while being abused.

              Does the "AI" decide that any child who looks Thai must be a victim of child abuse, and therefore report any Thai family taking perfectly normal photos of their children doing perfectly normal things that children do?

          2. Anonymous Coward
            Anonymous Coward

            Re: re: this seems like a sensible middle ground.

            What other crimes might Apple users have committed?

            Why stop at calling your customers pedos, why not also call them terrorists? Perhaps they have Jihadist images? Perhaps 9/11 images in a folder labelled 'favorites'?!

            Perhaps they have animal abuse images? Maybe you could save a cute puppy from its horrible abuser.

            Why stop at text scanning of the messages they send, why not also scan their emails? Not just the attachments; you're going to scan the text, so why not email text too? All those forbidden crime code words these cosa nostras might use.

            Why stop at a US-provided test model, why not also scan against a Chinese one, provided by China? For images that are illegal in China? Tiananmen Square tank man?

            Or a Russian set provided by Putin? Images of opposition propaganda and other illegal images?

            Perhaps images of Mohammed? In Muslim countries, so Apple users can be stoned to death.

            Isn't that also a crime your customers might be committing in those countries? Shouldn't those users be stoned to death?

            Are they drug dealers? Maybe scan their messages and images for drugs and drug related paraphernalia?

            Is there anything suspicious about where they go and who they meet, maybe pass their GPS track for approval too?

            I mean, these Apple users, they're such scum that Apple needs to protect the world from them; maybe we should just stop and search everyone with an iPhone. Just in case.... oh, right, that's what they're doing here.

            1. CountCadaver

              Re: re: this seems like a sensible middle ground.

              Or LGBTQIA folks in many non western countries

              Images of Pride flags, support for trans people

              People belonging to the "wrong faith"

              People who have committed apotasty

              People who are members of the "wrong party"

              People who oppose / support independence

              People who criticise "the leader"

              It's a very, very steep and extremely slippery slope towards the horrors of Airstrip One... that, or the UK as envisioned in Futuretrack 5 (Robert Westall; young adult fiction before "young adult" was a genre), a very underrated near-future dystopia where citizens are constantly "psy-scanned" for impulses such as violence, anger, suicide etc. and lobotomised...

              1. djstardust

                Re: re: this seems like a sensible middle ground.

                Oh shit, Sturgeon's Stasi will be knocking on my door soon then.

              2. Uncle Slacky Silver badge

                Re: re: this seems like a sensible middle ground.

                > People who have committed apotasty

                Mmmm, sacrilegious...

            2. Fruit and Nutcase Silver badge
              Joke

              Re: re: this seems like a sensible middle ground.

              For images that are illegal in China?

              https://duckduckgo.com/?q=winnie+the+pooh&iax=images&ia=images

          3. confused and dazed

            Re: re: this seems like a sensible middle ground.

            A thoughtful response. But I think Apple are big enough to just say no.

            They will not scrutinise what is on your phone (whatever the method). Have the debate with legislatures out in the open.

            Today it's pedos and terrorists, tomorrow it's activists.

            1. Scott 26

              Re: re: this seems like a sensible middle ground.

              >A thoughtful response. But I think Apple are big enough to just say no.

              Except the Chinese ICT market is big (I mean really big). And Apple ultimately wants a slice of that pie. And if the CCP says "you want to operate in our country, then here are some images we want you to scan for"... I bet Apple bends over.

          4. tip pc Silver badge

            Re: re: this seems like a sensible middle ground.

            I’d rather they be honest and state exactly why they are doing this, not try and dress it up as a positive.

            If there is a directive every vendor must comply with then they should just come out and say so.

            To comply with xyz we have developed blah blah blah.

            It’s the underhandedness that I can’t stomach.

            Be honest and let us trust.

            In 2 months' time this will be largely forgotten and no one will care; people will be scrambling for the next super iThing with off-the-charts neural cognition, etc. etc.

            I might buy up vintage 2020 tech and hope the new oppression software won’t run on 2020 vintage tech.

            1. ForthIsNotDead

              Re: re: this seems like a sensible middle ground.

              Linux phone, here I come...

              1. cyberdemon Silver badge
                Terminator

                Re: Linux phone, here I come...

                Paedoterrorist alert!

                Banned not-apple-or-google device detected!

                Banned open source software detected!

                Banned anti-government sentiment detected!

                WatchList.push("ForthIsNotDead")

                1. ForthIsNotDead

                  Re: Linux phone, here I come...

                  @cyberdemon - yes - I suspect the powers that be will be making justifications for such tooling any time now!

                  If Apple are planning to roll this out on their phones, then it just makes sense that they will also roll it out on their laptop platforms. Microsoft will surely follow on Windows. I mean, THINK OF THE CHILDREN!

            2. Bbuckley

              Re: re: this seems like a sensible middle ground.

              No more iPhone upgrades!

              1. Charlie Clark Silver badge

                Re: re: this seems like a sensible middle ground.

                I'm sorry Dave, I can't allow you to do that.

          5. FIA Silver badge

            Re: re: this seems like a sensible middle ground.

            A third party isn’t scanning your device though. The AI on the device is scanning your device. It’s not leaving your device at all unless certain criteria are met.

            A third party designed the software, initiates the scan, decides the criteria, and can then decrypt what they find if they so choose. That sounds like a third party scanning the device to me.

            What about option number 5....

            5) Tell your customers, shout about it loudly, so public opinion can rally and tell their legislators what they think?

            I understand the situation Apple is in, but we as a society don't have to accept this kind of nonsense.

            There are many examples of overly broad early-2000s 'terror' laws since being used on regular folk; and plenty of others of overly broad AI being used as a mallet (Amazon firing workers by AI, for example....)

            1. Bbuckley

              Re: re: this seems like a sensible middle ground.

              Exactly. And how many shoe bombers have the US caught in the past 15 years? How many toothpaste bombers? This bullshit just takes the piss out of the ordinary law-abiding citizen.

              1. Cav

                Re: re: this seems like a sensible middle ground.

                That is flawed logic. Do you seriously think people will still turn up with shoe bombs, knowing that they will now be checked? That bizarre logic disregards the fact that remediation for a problem actually works.

                It's like arguing that anti-virus software is unnecessary because you haven't been infected with a virus, because of the anti-virus... Malware changed precisely because anti-virus software was developed.

                The same applies to terrorists. Why would you stick to a method for which you know there is now a counter-measure?

          6. elsergiovolador Silver badge

            Re: re: this seems like a sensible middle ground.

            A third party isn’t scanning your device though. The AI on the device is scanning your device.

            "My client didn't stab him, Your Honour. It was in fact the knife."

            1. Bbuckley

              Re: re: this seems like a sensible middle ground.

              An intelligent knife your Honour.

          7. Bbuckley

            Re: re: this seems like a sensible middle ground.

            I would choose to invoke GDPR and tell the legislators to Fuck Off.

            1. katrinab Silver badge
              Unhappy

              Re: re: this seems like a sensible middle ground.

              GDPR doesn't apply to law enforcement activities.

              1. Wolfclaw

                Re: re: this seems like a sensible middle ground.

                Apple are not a law enforcement body and would be breaching GDPR by passing data to a third-party law enforcement body, unless the law changes. The simplest way around that is to change the T&C's of iCloud usage to obtain user consent; then GDPR is out of the window.

          8. Geez Money

            Re: re: this seems like a sensible middle ground.

            "A third party isn’t scanning your device though. The AI on the device is scanning your device. It’s not leaving your device at all unless certain criteria are met."

            Give me a concrete and enforceable (from my end) guarantee that the AI will never make a mistake and send a non-abusive image off the phone and that the technology will never be used for anything else and maybe we can have this conversation.

          9. MrDamage

            Re: re: this seems like a sensible middle ground.

            Fuck off with your apologist bullshit. Magical thinking that only the good guys will have the decryption keys, that no-one else will figure it out, or that there won't be a leak by a disenchanted ex-employee?

            Remember Microsoft's Secure Boot fiasco in 2016?

            Or how about when The Shadow Brokers auctioned off NSA tools and exploits?

            This shit don't work, and never will.

          10. Anonymous Coward
            Mushroom

            Re: re: this seems like a sensible middle ground.

            Did I ask the AI to be installed on my phone?

            No.

            It's a third fucking party.

          11. YARR

            Re: re: this seems like a sensible middle ground.

            The way government legislation is going worldwide

            Governments in authoritarian countries should have no influence over our data privacy in non-authoritarian countries. If the authoritarian countries insist on violating data privacy it should be done via an API that surrenders access to a separate government-mandated software application that does the analysis/spying function. By default, with no government-mandated software installed the API should be inactive and respect the user's data privacy.

            Given most western nations now have GDPR-like rules governing access to personal data, this processing should not happen without user consent. There should be a clear opt-in rather than a mandatory surrender of consent buried in the legal terms of service.

            1. Anonymous Coward
              Anonymous Coward

              Re: re: this seems like a sensible middle ground.

              "Given most western nations now have GDPR-like rules governing access to personal data, this processing should not happen without user consent. There should be a clear opt-in rather than a mandatory surrender of consent buried in the legal terms of service."

              But most of them have outs for when the governments themselves want access. After all, he who sets the rules...

        2. Bbuckley

          Re: re: this seems like a sensible middle ground.

          And when did governments ever behave sensibly?

        3. martyn.hare

          Re: re: this seems like a sensible middle ground.

          Please, just read the damned technical documents.

          Their system doesn’t scan photos which are only stored on your device, because it depends upon uploading the on-device hash AND the encrypted payload to iCloud Photos for any actual matching to occur, which is then all completed server-side. Your device also can’t know whether any given image is known CSAM without uploading all of that data.

          The technical paper says it only matches against the known CSAM database, which is purely partial-matching based on hashing, not machine learning as people have implied. It’s Microsoft PhotoDNA, but refactored to securely do half of the task on-device. It can’t match new images which haven’t first been added to that known CSAM database, just like PhotoDNA can’t.

          But it can technically have false positives. To avoid that being a major issue, they implemented a threshold, so one has to have more than one positive known-CSAM match for anything to matter.

          To further ensure occasional false positives don’t colour any reputations, all devices deliberately produce synthetic false matches, such that pretty much everyone shows up as potentially having some CSAM on their devices at random at some point. The synthetic false matches do not provide valid data in their payloads and can’t be used as a way to unlock access to real images, but they cause enough flagging to occur to protect everybody against false accusations based upon numbers of matches.

          The design of this new system is clearly built with the idea of later deploying iCloud end-to-end encryption. Otherwise, they’d just be scanning photos entirely server-side after they’ve been uploaded to iCloud like the rest of the industry is already doing and they wouldn’t be making an announcement about it like it’s a big achievement. If they’re not going to implement full end-to-end encryption of the actual photos then this would be a pointless circlejerk as well as an unnecessary hit to PR since I bet most lay people don’t even know PhotoDNA is a thing.
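          The mechanism described above (partial matching against a fixed hash database, a match threshold, plus deliberately injected synthetic matches) can be sketched roughly like this. It is a toy illustration only: the hash function, database contents, threshold value and synthetic-match rate are all invented for the example, and are not Apple's actual parameters.

          ```python
          import hashlib
          import random
          from dataclasses import dataclass

          # Toy stand-in for a perceptual hash such as NeuralHash or PhotoDNA.
          # A real perceptual hash is robust to resizing and recompression;
          # SHA-256 is not, and is used here only to keep the sketch self-contained.
          def perceptual_hash(image_bytes: bytes) -> str:
              return hashlib.sha256(image_bytes).hexdigest()

          # Hypothetical database of known-bad hashes (placeholder values).
          KNOWN_HASH_DB = {perceptual_hash(b"known-bad-1"),
                           perceptual_hash(b"known-bad-2")}

          MATCH_THRESHOLD = 3          # invented; the real threshold is not public
          SYNTHETIC_MATCH_RATE = 0.01  # invented rate of deliberate decoy matches

          @dataclass
          class Voucher:
              is_match: bool
              synthetic: bool  # decoy vouchers carry no decryptable payload

          def make_voucher(image_bytes: bytes) -> Voucher:
              """Device-side: emit one voucher per uploaded image."""
              if perceptual_hash(image_bytes) in KNOWN_HASH_DB:
                  return Voucher(is_match=True, synthetic=False)
              # Devices occasionally emit synthetic matches so that a nonzero
              # match count, by itself, says nothing about any individual user.
              if random.random() < SYNTHETIC_MATCH_RATE:
                  return Voucher(is_match=True, synthetic=True)
              return Voucher(is_match=False, synthetic=False)

          def should_escalate(vouchers: list[Voucher]) -> bool:
              """Server-side: escalate only once real matches cross the threshold.
              (In the real design the server cannot read a 'synthetic' flag
              directly; decoys simply fail to decrypt once the threshold is hit.)"""
              real = sum(1 for v in vouchers if v.is_match and not v.synthetic)
              return real >= MATCH_THRESHOLD
          ```

          Note the deliberate simplification: here the server trivially filters decoys via a boolean field, whereas the real protocol achieves the same effect cryptographically, so the server learns nothing below the threshold.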

      2. Polleke

        Please explain where your understanding is coming from!

      3. Anonymous Coward
        Anonymous Coward

        It specifically states in the article: "... it would be initially deployed against photos backed up in iCloud before expanding to full handset scanning."

        Therefore this, initially at least, seems to be focused on scanning in the cloud, expanding at some later date to on-device scanning.

        1. General Purpose Bronze badge

          not scanning your device, exactly

          What Apple's currently saying is "Before an image is stored in iCloud Photos, an on-device matching process is performed for that image" (https://www.apple.com/child-safety/). That excludes scanning across the phone's library of photos, or across the iCloud Photos library, or the iPhone's iCloud Backup.

          The technical summary linked at the end goes into more detail. https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf

          As it uploads an image, the device runs the matching process and creates a "voucher" for it. The voucher includes the match outcome and an encrypted "visual derivative". When the number of vouchers indicating matches reaches some threshold, the "visual derivatives" in those particular vouchers are decrypted and a process of manual review and action begins.

          Whether the whole idea of checking your photos is good or bad, they've clearly put a lot of thought into avoiding scanning entire libraries. At this stage, anyway.
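          The "decrypt only after a threshold" property behind those vouchers is typically built from threshold secret sharing: each matching voucher carries one share of a per-account key, and the server can reconstruct the key (and so decrypt the visual derivatives) only once it holds at least t shares. A toy Shamir-style sketch, with all parameters invented for illustration:

          ```python
          import random

          P = 2**61 - 1  # a Mersenne prime; toy arithmetic is done mod P

          def split_secret(secret: int, threshold: int, num_shares: int):
              """Split `secret` into shares; any `threshold` of them recover it."""
              # Random polynomial of degree threshold-1 with f(0) = secret.
              coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
              def f(x: int) -> int:
                  acc = 0
                  for c in reversed(coeffs):  # Horner evaluation
                      acc = (acc * x + c) % P
                  return acc
              return [(x, f(x)) for x in range(1, num_shares + 1)]

          def recover_secret(points) -> int:
              """Lagrange interpolation at x = 0 over the field GF(P)."""
              secret = 0
              for i, (xi, yi) in enumerate(points):
                  num, den = 1, 1
                  for j, (xj, _) in enumerate(points):
                      if i != j:
                          num = (num * -xj) % P
                          den = (den * (xi - xj)) % P
                  # Modular inverse of den via Fermat's little theorem.
                  secret = (secret + yi * num * pow(den, P - 2, P)) % P
              return secret
          ```

          With a threshold of 3, two shares reveal essentially nothing: two points underdetermine a degree-2 polynomial, leaving the constant term uniform over the field. That is the sense in which vouchers below the match threshold stay opaque to the server.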

          1. Anonymous Coward
            Anonymous Coward

            Re: not scanning your device, exactly

            So it falsely accuses customers of being pedos, PROBABLY, MAYBE, PERHAPS, but it's OK: the ones that flag more strongly go to a subjective review by a person you don't know, who will then make assumptions about the rest.

            So while a man holding a little boy's penis in a photo will be labelled a pedo, a doctor holding a pediatric patient's genitals prior to circumcision is not, because 'Apple magic cool something or other'*.

            * An actual real-world example there. He's an excellent doctor, by the way, not a pedo; or maybe he is a pedo, I don't know just from the photos of kids he takes as a doctor doing a before shot.

            "Whether the whole idea of checking your photos is good or bad, they've clearly put a lot of thought into avoiding scanning entire libraries.At this stage, anyway."

            So it's OK, because they only look at SOME of your photos to make the decision as to whether you're a pedo, not ALL of your photos... at least at this stage. And they did think about looking at ALL your photos, so they clearly understand there's a problem here; so that makes it OK, right?

            Cool, I'm totally OK, with people I don't know calling me a pedo in secret meetings based on limited info.

            TAKE MY MONEY! You had me at "VOUCHER"!

            Seriously, the world needs to be protected from me, and even though I'm not a pedo, if Apple says I am, then they must be right, and the world needs to be protected by putting me in some sort of stylish MagSafe Apple shackles. Because after all, isn't that why you buy a phone? So that people can secretly look at your photos and make accusations against you behind your back? And if Apple make an accusation, who am I to say Apple is wrong and I am right? Did I make a stylish phone? No, I did not! Am I the third largest manufacturer in the world in the smartphone segment? No, I am not! I think that's conclusive then, I must be wrong. Slap those MagSafe cuffs on, and take me away!

            1. Anonymous Coward
              Anonymous Coward

              Re: not scanning your device, exactly

              Agreed; however, being dead set against child genital mutilation, I would quite like that doctor to end up in court.

              1. Anonymous Coward
                Anonymous Coward

                Re: not scanning your device, exactly

                Have an upvote, but there are clinical reasons why doctors must handle the genitals of minors.

            2. Cybersaber

              Re: not scanning your device, exactly

              "* An actual real world example there, he's an excellent doctor by the way, not a pedo, or maybe he is a pedo I don't know just from the photo's of kids he takes as a doctor doing a before shot."

              My son was circumcised shortly after birth, and there were no pictures taken. Maybe your doctor is related to Joey Tribbiani's tailor?

              1. sabroni Silver badge

                Re: My son was circumcised shortly after birth, and there were no pictures taken.

                Hiding the evidence of child genital mutilation. Good thinking.

                Not as good as thinking "I don't need to mutilate my child's genitals" though.

                1. Cybersaber

                  Re: My son was circumcised shortly after birth, and there were no pictures taken.

                  I did question the need to do that to my boy. I was told by my family doctor that circumcision was for health benefits, and the benefits were listed to me. Now, regardless of whether the current teachings of the medical community still reflect that, I was acting under the advice of my medical doctor at that time.

                  A century from now, people may very well demonize you for an action you took because it was the best informed decision you could make at the time - just like every generation since the dawn of recorded history has done to societies predating their own.

                  If you went to your doctor and asked him what you should or shouldn't do in response to COVID-19, are you wise to follow his advice, especially after receiving a convincing explanation of why it makes sense, as someone who was just barely a legal adult (as I was when my son was born)?

                  Think of that before you demonize the rational, informed decisions and the people that made them based on what was known at the time.

                  1. tip pc Silver badge

                    Re: My son was circumcised shortly after birth, and there were no pictures taken.

                    "I was told by my family doctor that circumcision was for health benefits, and the benefits were listed to me. Now, regardless of whether the current teachings of the medical community still reflect that, I was acting under the advice of my medical doctor at that time."

                    Trust me, I'm a qualified expert.

                    If you're wealthy you get different much more informed answers from someone who has actually considered the circumstances, understands all the prospective treatments and weighs those up against the facts at hand.

                    For us plebs, you'll get whatever dictate from on high they are trying to push. If it's your choice they will lay it on thick, and every "expert" will state the same thing.

                    Turn 40, take statins. If your weight is more than some chart states then you're obese and will die early.

                    They won't take into consideration whether you're active or not.

                    Walking 40 miles a week and cycling 40 miles a week, plus 10,000 steps a day, does not count; they treat you like a 40-stone sedentary, bedridden person.

          2. tip pc Silver badge

            Re: not scanning your device, exactly

            "As it uploads an image, the device runs the matching process and creates a "voucher" for it. The voucher includes the match outcome and an encrypted "visual derivative". When the number of vouchers indicating matches reaches some threshold, the "visual derivatives" in those particular vouchers are decrypted and a process of manual review and action begins."

            was just about to post much the same

            the crucial bit is that these new capabilities will arrive in a future iOS 15 / macOS 12 update.

            so potentially not invoked for now.

            The trick will be to stay on iOS 14 / macOS 11.

            Apple recently made a change so that older versions of code will still receive updates, and it's not mandatory to go to the latest major version if you don't want to.

            I'm running the next version betas on my MacBook Pro, phone & iPad.

            I'll need to research whether they are safe or whether I need to downgrade to iOS 14 / macOS 11.

          3. Nifty Silver badge

            Re: not scanning your device, exactly

            "before an image is stored in iCloud Photos, an on-device matching process is performed for that image"

            Are you a WhatsApp user with an iPhone? If so, from time to time you may receive photos from anyone able to look you up in WhatsApp. Those photos are stored on your camera roll right now. Go and have a look. In the default configuration your camera roll is synced to iCloud for the last 100 or so images. (You can disable this feature in WhatsApp > Settings > Chats).

            Now have another think about the implications.

          4. Anonymous Coward
            Anonymous Coward

            Re: not scanning your device, exactly

            Thanks for the link, it's handy to be able to discuss this in terms of what they are actually proposing as opposed to all the arm waving conjecture.

            That said, based on the outline I am not too enthused about the process. ML just can't deliver on this, and it's a black box whose own creators can't explain how and why it thought it found a match. In addition, human review is both taxing for the humans and prone to human failures.

            This system, even before its scope starts to creep, will unintentionally destroy people's lives, while still failing to catch and prevent all incidents of actual child abuse. The only question is scale. And on the first day, a politician will tearfully state: if it saves even one child....

      4. FIA Silver badge

        As I understand it, this technology has been developed because Apple has built its service in such a way that it can’t scan in the cloud so it has to scan on device. Everyone else, Google included, scans in the Cloud. One way or another, you’ll get scanned - it’s just a question of where.

        As noted in the article, Apple does scan what you upload to iCloud.

        But this is a world apart from device scanning.

        If I store some stuff in your garage, you have every right to know what's in there, or tell me to jeff off. But that doesn't give you the right to come round to my house and root through my loft.

        I really hope this dies quickly, I don't like Android. <sigh> Back to Nokia then.... I suspect the 3310 still has 50% charge anyway...

      5. Chet Mannly

        "Everyone else, Google included, scans in the Cloud. One way or another, you’ll get scanned - it’s just a question of where."

        Not true. If you don't store anything in your google cloud they can't scan anything. With Apple you don't have that option...

      6. Anonymous Coward
        Anonymous Coward

        Illegal search and seizure

        Interesting that law enforcement could not do what Apple is doing without a warrant. Perhaps this is another application of outsourcing.

        "It is a cardinal rule that, in seizing goods and articles, law enforcement agents must secure and use search warrants whenever reasonably practicable. . . . This rule rests upon the desirability of having magistrates rather than police officers determine when searches and seizures are permissible and what limitations should be placed upon such activities. Trupiano v. United States, 334 U.S. 699, 705 (1948), quoted with approval in Chimel v. California, 395 U.S. 752, 758 (1969)."

    4. Anonymous Coward
      Anonymous Coward

      If this doesn't make people stop using Iphones... then what?

      the fact that MS, google and the rest of the shitpack have been scanning your email (i.e. private correspondence) for years, has NOT stopped people using those services, so the answer is: nothing.

      1. FIA Silver badge

        Re: If this doesn't make people stop using Iphones... then what?

        I disagree, these things are on a scale.

        I know Google scan my email, but in return I get Gmail. I understand this and made the choice, and it's a choice I'm happy with.

        Other people aren't and choose not to use it. That's fine.

        My phone, however, is my digital safe. It contains a record of my life to a greater or lesser extent. It allows me to communicate with my bank, it contains all my communication applications and a good few months of my recent photos. It also contains the 2FA apps I use.

        Retrospectively, I do not want to give Apple the right to rifle through all that (I don't use iCloud). That is not a choice I've been informed about.

        1. Ace2

          Re: If this doesn't make people stop using Iphones... then what?

          I flipping HATE that those Google bastards scan every email to or from anyone who uses gmail, since that picks up tons of my mail, despite no agreement from me.

        2. Anonymous Coward
          Anonymous Coward

          Re: If this doesn't make people stop using Iphones... then what?

          Just like vaccination it isn’t just your choice though is it. Anyone who emails you will have their conversation with you scanned as well.

      2. Chet Mannly

        Re: If this doesn't make people stop using Iphones... then what?

        I swapped gmail for another provider for exactly that reason.

        Judging by the user numbers I'm probably in the minority...

    5. Dimmer

      Not really worried about it. How long do you think the battery is going to last with this running on the phone? At some point they will achieve critical drain and we will have to keep it plugged in all the time.

    6. Anonymous Coward
      Anonymous Coward

      If anything it might cause more people to use iPhones.

      "I can't be guilty m'Lord, I use an iPhone, ask them if they found anything on my phone".

      If you're a perv, just keep a clean iPhone and have Apple as part of your defense.

      Apple is about to start defending more deviants than it catches.

  2. doublelayer Silver badge

    Two possible approaches

    There are two methods this could take:

    1. A model is created from the photos on Apple's end and the phone uploads its pictures to a server at Apple to do the comparison. This involves a mandatory leak of data which a user can't disable and, as Apple doesn't own the devices themselves, is currently illegal.

    2. A model is created by Apple and sent to user devices, which scans the pictures onboard and sends the result to Apple. This is more likely to be legally viable, but it is going to cause a lot of problems as the processors in a mobile device are a lot weaker than a server and most models for picture comparison are likely to be large. There will at a minimum be complaints about the network bandwidth and CPU time needed to run this check, especially as I assume the model will get run every time a user takes new pictures and whenever new source material is added causing a model update. In addition, they are either going to have a lot of false positives or false negatives, meaning they'll need some method of determining whether an image is a false positive. Automatic uploads are still legally questionable, so this might result in a lot of suspicious reports which can't be verified. With the alternative that it is mostly useless though, I don't know whether they will accept a high false negative rate.
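    The second method — matching on-device against a distributed database — can be sketched with a toy perceptual hash. To be clear, this is not Apple's actual (undisclosed) algorithm; `average_hash` and the distance threshold are illustrative stand-ins, but they show the core trade-off: a fuzzy match tolerance buys robustness to re-encoding at the cost of false positives.

```python
# Toy sketch of on-device matching: the phone holds a downloaded database
# of hashes and checks each photo locally. average_hash is a deliberately
# simple stand-in for whatever perceptual hash a real system would use.

def average_hash(pixels):
    """Hash an image (a flat list of greyscale values) to a bit string:
    '1' where a pixel is brighter than the image's mean, else '0'."""
    mean = sum(pixels) / len(pixels)
    return ''.join('1' if p > mean else '0' for p in pixels)

def hamming(a, b):
    """Number of differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

def matches_database(pixels, banned_hashes, max_distance=2):
    """True if the image's hash is within max_distance bits of any banned
    hash. Lowering max_distance trades false positives for false negatives."""
    h = average_hash(pixels)
    return any(hamming(h, bad) <= max_distance for bad in banned_hashes)
```

    A lightly edited copy of a listed image (say, one pixel dimmed) still matches, while an unrelated image does not; that fuzziness is exactly where the false-positive/false-negative worry above comes from.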

    1. elsergiovolador Silver badge

      Re: Two possible approaches

      They are just going to change the license where somewhere in small print you'll consent to that.

      If you decline, then your phone will stop working.

      They are getting away with not allowing other app stores to run on iPhone, I can't see why they wouldn't be able to do this.

      They'll give a wink wink to three-letter agencies about being able to scan phones to their hearts' content, and it will be more legal than scrambled eggs on a Sunday morning.

    2. General Purpose Bronze badge

      Re: Two possible approaches

      According to Apple, it's not your #1 and only partly like your #2. Yes, they send a database of hashes to your phone, but (they say), they don't scan the phone, they test an image as it's uploaded to iCloud Photos. In terms of bandwidth and CPU time, that's feasible. They say they'll start a manual review of an account when some threshold number of matching images is reached, not on each individual match.

      1. doublelayer Silver badge

        Re: Two possible approaches

        Yeah, that's number 2 exactly and the provisos still hold. In order to do the scanning, they will need to send each phone a copy of the model built from a big database. That's going to be a large file. Running it takes time. Not to mention that, although they're not scanning everything yet, there's little doubt that someone will find out that models get updated and they will need to use their new model to recheck the old pictures. Furthermore, there are people who don't upload photos to iCloud. I am one of those, mostly because I don't take many photos, but also because I have only the free storage and I don't want it filled with random pictures taken for temporary reasons. The scanning as specified wouldn't scan mine at all, so they're almost certainly going to change it so it does.
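        The voucher-and-threshold scheme described above can be sketched as follows. All names and the threshold value here are illustrative assumptions — Apple has not published the real figure — but the shape is as quoted: per-upload vouchers, with human review gated on a match count.

```python
# Hedged sketch of the threshold scheme: each upload yields a "voucher"
# recording the match outcome plus a (notionally encrypted) visual
# derivative; manual review only begins once an account's match count
# reaches a threshold.

REVIEW_THRESHOLD = 3  # assumption, for illustration only

class Account:
    def __init__(self):
        self.vouchers = []  # (matched, encrypted_derivative) pairs

    def upload_photo(self, matched, encrypted_derivative):
        """Record the safety voucher produced alongside an iCloud upload."""
        self.vouchers.append((matched, encrypted_derivative))

    def needs_manual_review(self):
        """Only when matches reach the threshold would the derivatives be
        decrypted and a human review started."""
        return sum(1 for m, _ in self.vouchers if m) >= REVIEW_THRESHOLD
```

        Note the design choice this models: a single match reveals nothing and triggers nothing, which is the "lot of thought" commenters above are crediting — and also why everything hinges on who sets the threshold and the database.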

  3. Steve Aubrey
    Thumb Up

    Kudos to Gareth

    Subtle, but present, humor. Sly, even.

    Apple won't talk to El Reg, "so asking it to comment on this is a fruitless exercise"

  4. Peter Prof Fox

    A stalking-horse for copyright protection

    I have a copy (of uncertain provenance) of that iconic 1977 session by the Five-aside-archers, with Beetle Wulvis on bass guitar. Sony Music just happen to have signed a hoover-up contract with somebody who claims some derived title to the tracks. In England they might sue me under copyright law, but if it isn't currently for sale they can't show a loss, so that's two fingers to them. (They have to demonstrate a loss.) But other (c) enforcement regimes have other consequences. Furthermore, having and playing it (in England) is neither a crime nor a civil tort unless they can prove I'm not entitled. (For example, my copy came from Big Joey Frobisher himself.) But in this new world, YOU HAVE COPYRIGHTED MATERIAL ON YOUR DEVICE! GO TO JAIL. DO NOT PASS GO. is the default position these mega-corporations expect us to accept.

    1. MachDiamond Silver badge

      Re: A stalking-horse for copyright protection

      "(They have to demonstrate a loss.)"

      No, they don't. If the Copyright was registered in a timely manner, they can get statutory damages which are penalties defined by law, not commerce.

      I have images that are not up for license. If somebody stole one of my backup drives and published one or more of those images, they could be liable for up to US$150,000 for each infringement. They are all registered and I choose to not offer them for various reasons. I used to do much more journalistic work and I have images that are rather ugly for one reason or another. One of the rights included with Copyright is the power to say "no".

      1. Long John Silver
        Pirate

        Re: A stalking-horse for copyright protection

        You refer to USA law.

      2. Anonymous Coward
        Anonymous Coward

        Re: copyright protection..this is how it plays in the real world

        Got some good lawyers and the very deep pockets to pay for them? Otherwise your "copyright" is basically worthless. Even then you better hope you get a judge who knows even the basics of copyright law.

        I knew a starving artist type whose iconic image was blatantly ripped off by a software company. Those of you of a certain vintage will know both the image and the software. The artist was so tech-illiterate that he only heard about the rip-off by accident. He approached the software company asking for an exceptionally modest licence fee considering how much money they had made (many millions), but despite parading their ultra-progressive politics in public they told him to get lost. A friend of the artist, a famous musician all of you would have heard of, was so disgusted by this shabby treatment of his friend that he put up the money for lawyers to sue.

        So the starving artist had deep pocket support and very famous backers with access to serious legal firepower. Plus a witness willing to swear under oath that they had heard first hand one of the founders of the software company tell the story of how they came to use the starving artists image for the software. It was done in the full knowledge of where the image came from and who owned it and they never had any intention of paying any license fees for it.

        Sound like a slam dunk? Nope: the first hearing was in front of a judge who knew so little about copyright law that he threw out the case because the work had not been registered with the Copyright Office, even though the very experienced copyright lawyer working for the starving artist pointed out that that was not how copyright law works.

        The case would have been refiled, everyone was telling the starving artist it should be, but the starving artist, who was genuinely one of the nicest people you could possibly meet, decided that life was too short, and anyway, the fact that so many people had rallied round to support him was consolation enough. He was really touched by the fact that so many had gone out of their way to help him.

        And the happy ending? Of course not. The founders of the software company despite promising the staff they would be going public so the employees would get a nice bonus for all their years of hard work sold the company and kept all the money for themselves. One founder used their new found (serious) wealth to bankroll one of the most high profile ultra left political organizations of the last few decades. Every time I see that organizations name bracketed with some demands for equity, rights, against "oppression" etc etc, I think of where that organizations money originally came from and all the people that were screwed over to make all that money. There was a very large number of them.

        Including that starving artist.

    2. Long John Silver
      Pirate

      Re: A stalking-horse for copyright protection

      That was my first thought too upon reading the article.

      The proposal as stated needs examining in the light of cynicism and pragmatism.

      I don't doubt most (near all) executives of companies tempted by this technology support curbing, by feasible means, the spread of illegal pornographic images. Doubtless as private citizens they would report suspicious materials via existing channels. That can be deemed a moral imperative.

      There is no moral, legal, or business obligation to set their companies up for systematically monitoring 'content' passing through their hands. There would be absolutely no commercial sense. If by chance they fall upon it then citizen obligations arise.

      Introduction of the technology for the purpose stated in the article is most unlikely to be an efficient or effective use of resources to tackle the underlying problem, which is the creation of recent (potentially traceable-source) images where there is a prospect of the perpetrator manufacturing more through continuing abuses.

      Other readers have made a plausible argument for why using databases of images, and/or an AI capable of differentiating hitherto-unknown images depicting harm from an innocent photo, would raise immense ethical and legal problems merely through the fact of the screening process taking place. Add to that differing jurisdictions' criteria for distinguishing abuse from art (age being a factor too), and the only safe ground upon which screening could operate is images so dreadful that agreement by legislatures is near certain. The last option may be justifiable, but why oblige business to seek needles in haystacks? After all, one suspects the most prolific abusers are phone- and Internet-savvy and therefore not prone to stuffing their material onto clouds. A similar point arises regarding the other weasel concept used to justify surveillance: terrorism.

      Tracing and curbing copyright infringement makes more sense. Even companies not in the business of creating or distributing copyrighted 'content' could be induced to take part in screening customers' data. Copyright rentiers would pay a fee for the service.

      The shaky ground upon which copyright rests has been revealed by the Internet to all who care to look. Purveyors of film and music are fighting rearguard action. They are becoming ever more desperate. Unofficially streamed sport (consequent upon rampant price-gouging by official outlets) has joined the centre of attention. On the bright side, academic publishers have lost the battle but most don't yet know it.

      For some time I have thought the final battle will take place on the turf of proprietary operating systems. Here and elsewhere I have mooted Microsoft will discover a lucrative market by offering Windows, at least household versions, as Internet access guardians. 'Consumer' versions are pretty much battened down regarding scope for user modification and bypassing some features. For instance, updates, especially those purporting to correct security flaws, can, at most, briefly be delayed. 'Windows Defender' is almost mandatory and it should be easy to make it so.

      'Defender' represents an engine in place to serve copyright rentiers, surveillance unavoidable from advertisers with their trackers and from other agencies, plus snooping for illegal 'content'. 'Security updates' will be means to refresh hash databases and so forth. Returning data to Microsoft home poses little challenge because Windows is doing so all the time already. Given low cost high bandwidth connections and agile processors in devices the ordinary user will notice no difference. Those needing to push devices to their limits already have, or will, migrate to open source operating systems where users have complete control over configuration. 'Gamers' will stay with Windows only because Linux versions of popular games are sparse. Perhaps they will remain scarce when 'games houses' grasp the great potential for Windows to control DRM more than now.

      'Defender', upon recognising files containing unlicensed 'content', could have several options: disable playback/viewing, erase the file, and call home with details. 'Defender' could also render darknets inoperable: either prevent Windows from running on them or, with greater subtlety, make access slow or by other means unreliable.

      Introduced stepwise and without public fanfare these measures need not be noticed by most users. MSM, in thrall to governments and copyright moguls, is unlikely to raise a fuss. Muttering could be stifled by appeal to welfare of children (not just regarding abuse) and to fear of the ubiquitous terrorists lurking behind trees.

      That, I suggest, will be the final battlefield and I cannot predict the victor. One outcome will be surveillance societies with restricted freedoms and human culture all but dead; the enthusiasm with which people have adopted the chaotic Covid narrative makes this plausible. The other will be abandonment of rentier economics. This will release considerable opportunities for nations to use more productively the resource currently channelled, usually overseas, to parasites rather than genuinely creative people. Release of restriction on 'derivation' should lead to cultural renaissance: hitherto unparalleled innovation and reduction of grossly inequitable wealth distribution.

      -----

      Released under the Creative Commons Attribution 4.0 international licence.

  5. Ian Mason

    So Apple have solved the problem of what to do with that pile of cash?

    That'll be handing it over to all the people who sue them for this. If the police in most civilised countries need a warrant to search your possessions for unlawful material, what authority do Apple claim for this gross abuse of civil liberties? What theory of law do they have that they think they have carte blanche to start searching through people's phones?

    What do their marketing department think about all that money they've wasted touting Apple's privacy credentials, now that another part of Apple has completely trashed those credentials overnight?

    Really Apple? With all your shouting about privacy I really expected better from you.

    Anyone who says "Think of the kiddies" and there's no doubt going to be some here: It's always used as the excuse for inserting the thin end of the wedge, and then at the first excuse whacking the other end with a bloody great mallet. A recitation of the evils that follow bending the rules of civil hygiene for some "special case" ought not to be necessary. But for those thinking "Well, it's only going to affect paedophiles" anyone with one jot of sense knows that if this is permitted then there will be another "good and worthy" case permitted, then another less worthy until it trickles down to the point where your phone's camera will feed the onboard AI, which will note the double yellow line you just parked on and immediately debit your bank account the parking fine and put points on your digital driving license.

    1. MiguelC Silver badge

      Re: So Apple have solved the problem of what to do with that pile of cash?

      Wanna bet that the next iOS update will only install after you accept new T&Cs that explicitly allow Apple to scan your device?

      1. Nifty Silver badge

        Re: So Apple have solved the problem of what to do with that pile of cash?

        This reminds me of when Apple disallowed Google Maps (or the two couldn't come to an agreement). I deferred updating IOS for the longest time ever.

        Here we are again.

        This will pass.

    2. CountCadaver

      Re: So Apple have solved the problem of what to do with that pile of cash?

      Govts in most "civilised countries" will simply pass very broadly worded "child protection" legislation that explicitly allows this and creates severe legal penalties for attempting to block or bypass it (the same way refusing a breath test in many places gets you in front of a judge), along with the implication that you are a child molester.... Nothing like public pressure to get the proles to comply, for fear of ostracisation, violence, vigilante murder, etc....

    3. GruntyMcPugh

      Re: So Apple have solved the problem of what to do with that pile of cash?

      Privacy is a fundamental human right. At Apple, it's also one of our core values. Your devices are important to so many parts of your life. ... We design Apple products to protect your privacy and give you control over your information.

      Privacy - Apple (UK): https://www.apple.com/privacy

      I guess there's something in the small print about Apple Privacy being slightly different from how we understand privacy.

      1. DevOpsTimothyC Bronze badge

        Re: So Apple have solved the problem of what to do with that pile of cash?

        give you control over your information

        Your information, not your content :(

        Looks like Apple is the first big company to openly state it's starting down the thought-police route. While the other companies might monitor and profile you, I wasn't aware they also informed the authorities if they didn't like something you did (unless it was to steal their IP)

    4. Great Southern Land

      Re: So Apple have solved the problem of what to do with that pile of cash?

      There is another issue that no one seems to have commented on yet. Who will be paying for the data used to send this info to/from your phone? Guess what..... YOU will be.

  6. Boo Radley

    Won't Someone Think of the Children?

    It's nearly always the go to excuse for something that eventually ends up taking away people's rights.

    1. David 132 Silver badge
      Big Brother

      Re: Won't Someone Think of the Children?

      "We're only using it to scan for images of kiddie porn. Are you on the side of the kiddie fiddlers?"

      "We're only using it to scan for images glorifying terrorism. Are you a terrorist?"

      "We're only using it to scan for images of serious crimes. Why do you sympathize with murderers?"

      "We're only using it to scan for images of banned ideologies. What, are you a neo-Nazi?"

      "We're only using it to scan for images of racist or TERF nature. Such thought is wrong and is banned."

      "We're only using it to scan for images showing support for Antifa."

      "We're only using it here in China to scan for images showing you support the Hong Kong democracy movement."

      "We're only using it here in Spain to scan for Catalan separatist imagery..."

      I won't stoop to repeating the old Pastor Martin Niemöller quote cliché, but there's a definite slippery slope argument to be made here.

    2. Anonymous Coward
      Anonymous Coward

      Re: Won't Someone Think of the Children?

      For those who think this will only affect paedophiles, I suggest they read up on what happened to Julia Somerville at https://en.wikipedia.org/wiki/Julia_Somerville#Allegations in the pre iPhone age. If I were you I would make sure that you never take a photo of your kids on your iPhone after Apple introduce this unless you want to risk the knock on the door in the middle of the night from Constable Plod.

      1. Anonymous Coward
        Anonymous Coward

        Re: Won't Someone Think of the Children?

        From what I understand... at the moment the tech will check whether you have any of the 'known bad' images... the ones that depict TRUE CHILD ABUSE...

        But... if it is based on a neural net... one that learns that naked children are a sign of abuse... then there will be a problem.

        1. Anonymous Coward
          Anonymous Coward

          Re: Won't Someone Think of the Children?

          I'm waiting for the moment it starts flagging sphynx cats...

      2. Mog_X

        Re: Won't Someone Think of the Children?

        Having Nirvana's 'Nevermind' album cover that's been downloaded from Spotify on your iPhone will be a hanging offence...

        1. Lil Endian

          Re: Won't Someone Think of the Children?

          The title sequence of "The World According to Garp".

  7. Anonymous Coward
    Anonymous Coward

    Don't use your iPhone in church

    You might accidentally take a picture of a couple getting married, but end up with a few baby angels in the background which Apple's AI might class as naked children and get you prosecuted.

    1. elsergiovolador Silver badge

      Re: Don't use your iPhone in church

      I think churches will be whitelisted... ;-)

      1. Clausewitz 4.0
        Devil

        Re: Don't use your iPhone in church

        The Vatican will move against this tech

      2. tip pc Silver badge

        Re: Don't use your iPhone in church

        Should be enabled on all religious peoples phones first.

      3. nijam Silver badge

        Re: Don't use your iPhone in church

        > I think churches will be whitelisted

        Yes, they have been for centuries, after all.

      4. Ian Mason

        Re: Don't use your iPhone in church

        No, that's "whitewashed". Even Jesus said so. ;-)

    2. gandalfcn Silver badge

      Re: Don't use your iPhone in church

      Only the brainwashed go to churches by choice.

      1. chivo243 Silver badge

        Re: Don't use your iPhone in church

        amen to that!

        1. gandalfcn Silver badge

          Re: Don't use your iPhone in church

          I note we have a lot of brainwashed godbotherers around. Mainly Septics I suggest.

    3. jdiebdhidbsusbvwbsidnsoskebid Bronze badge

      Re: Don't use your iPhone in church

      Surely you won't actually get prosecuted in cases like that: As soon as the photos are shown in a court to be innocent, the case will be dropped.

But what worries me is that the route to that absolution will be horrible for anyone falsely accused. Police will be knocking on your door at strange times, your and your family's computers will be seized, your employer won't trust you anymore, and when your identity inevitably gets leaked, the local paediatrician* haters will be smashing your windows every night for months while the slow gears of the law grind away.

      For things like this, I'm normally of the "if you have nothing to hide you have nothing to fear" mindset. But as soon as I read that Apple will be using AI to find incriminating evidence, I worry. Unless they have real humans very early on in the image identification process, the false positives that the AI will inevitably throw up could cause a lot of hassle - the sort of hassle that can never be undone.

      *https://www.google.com/amp/s/amp.theguardian.com/uk/2000/aug/30/childprotection.society

      1. gandalfcn Silver badge

        Re: Don't use your iPhone in church

        Try reading what the process is rather than typing inapplicable comments.

      2. Lil Endian

        "the local paediatrician"

        Yes, that has been at the forefront of my mind too.

        Social perception when accusations of paedophilia have been made erroneously does not easily reverse, if at all. When idiot zealots can't even read...

        It's life destroying for those involved.

        Of course, there will always be cases of genuine concern which are shown as innocuous, in all areas of law. But when the case is child abuse related people assume "no smoke without fire" indefinitely. A tough situation indeed.

        1. 2+2=5 Silver badge
          Flame

          Re: "the local paediatrician"

          > Social perception when accusations of paedophilia have been made erroneously does not easily reverse, if at all. When idiot zealots can't even read...

          > It's life destroying for those involved.

          My wife is a teacher - the last thing I want to have to deal with is a false alert from some dubious piece of software that's been "verified" by a half-wit whose only qualification for the job of staring at other people's pictures all day is that he is too stupid to get any other job.

      3. CountCadaver

        Re: Don't use your iPhone in church

        HAHAHAHAHAHAHAHAHAHAHAHAHA

        You seriously believe that?

Given how even in the UK judges have an eye on politics and err on what they deem "the safe side" - i.e. someone urinating in the street was put on the sex offenders register... gross, but hardly rape. Notice how insidiously worded "sex offender" has become, encompassing quite a range of stuff that a large percentile of the populace wouldn't deem "sex offences", with no one able to challenge it for fear of being classed as a "pedo apologist" or a "peeeeedo" themselves.

        Giving people an unrestricted right to vote was a piece of stupidity... look where it's gotten us.

        Something to be said for those who make everyone else's life hell being "reminded of their place"

        1. tiggity Silver badge

          Re: Don't use your iPhone in church

And given the massive closure of public toilets in the UK over the last few decades, urinating in public is often a necessity. With age you find micturition more frequent, and unless you massively dehydrate yourself in advance (not a good idea), a long walk as a senior citizen often needs a "loo break" - and often alfresco if there are no facilities around, e.g. in the countryside.

      4. MachDiamond Silver badge

        Re: Don't use your iPhone in church

        "Surely you won't actually get prosecuted in cases like that: As soon as the photos are shown in a court to be innocent, the case will be dropped."

If you wind up in court as a defendant charged with possession of child pr0n, your life is over. That arrest will show up in the Big Data files that large companies use to vet their employees and job applicants. The record saying all charges were dismissed even before trial doesn't sit side by side with the arrest notice.

        These Big Data companies are also not subject to the same sort of laws that credit reporting agencies are. There are many more of them, and you may not have heard of most of them. They have no legal requirement to share the information they hold on you with you, and no legal requirement to purge information that may be untrue.

        All you will notice is that you are passed over for promotion again and again, or that you never hear back from companies you have applied to. You'd have to be very lucky to have somebody tell you there is derogatory information about you that earned you a down check. At least at that point you will know your life is screwed and it's time to go into business for yourself.

  8. billdehaan
    Stop

    People went to digital photography to get AWAY from this

Back in the 1990s and early 2000s, there was a "think of the children" panic in Canada, and crusaders went on a tear to get the police and government to "do something" to stop it.

    In the middle of this climate, I know of three cases where people ended up getting visited by police investigating them for alleged child pornography.

    One case was a Japanese anime, as in, a cartoon, with no actual humans being filmed, let alone children.

    The other two were the result of photo development. Those old enough to remember actual film cameras know that unless you had a darkroom, chemicals, and skill, you needed to go to a photo developer to convert your raw film into actual snapshots. Camera stores did it, of course, as well as specialty outlets like Fotomat, but one of the most common photo development places was, oddly enough, the pharmacy. And it was pharmacies that called the cops on two people getting their photos developed.

The first case showed the shocking picture of a nude 5-year-old boy, swimsuit around his ankles on the sidewalk, with a scantily clad 3-year-old girl next to him. In other words, a 3-year-old girl had snuck up on her big brother and pantsed him. Mom happened to be taking pictures of her kids in the pool, and couldn't resist getting a snap of her kids pranking each other.

    The second case was similar, with a grown woman in a bathtub with a 2 year old boy, who decided to make an obscene gesture to shock his mommy just as Daddy walked in. In other words, a typical "Jim, get in here and see what your son is doing" family moment.

Fortunately, in both cases, the police officers were parents themselves and not idiots, and when they visited the families and saw that the kids photographed were the children of the photographers, they realized that the photo developers had completely overreacted. But as you can imagine, those families stopped sending their film out to be developed, and went to digital photography.

    Now, you don't even have to drop your film off to have busybodies report you to the cops, your camera vendor will do it as soon as you take your picture.

    There's no way that this won't be abused, both by companies, and governments.

    1. JimboSmith Silver badge

      Re: People went to digital photography to get AWAY from this

      Not forgetting Julia Somerville and her ordeal. https://en.wikipedia.org/wiki/Julia_Somerville#Allegations

    2. MrBanana Silver badge

      Re: People went to digital photography to get AWAY from this

      Back in those days, while on holiday in Sicily, in a very cold hotel bedroom, I had to go to the loo during the night. I don't wear pyjamas so made a quick dash. My wife snapped me, "in motion", on the way back. Boots processed the film, and placed a "this picture is underexposed" sticker over the offending area. Just to stress, it was January, very cold, marble flooring, so it was an unusually small sticker. Otherwise, a larger, overexposed sticker would have been more appropriate.

      1. Anonymous Coward
        Anonymous Coward

        Re: People went to digital photography to get AWAY from this

        A mate used to enjoy dropping off naughty rolls of film at those photo shops that had print machines in the window... sometimes he even went back and collected the prints

        1. GruntyMcPugh

          Re: People went to digital photography to get AWAY from this

          In the mid eighties I had a Saturday job at Woolworths, and we did photo processing. My boss had a knack for spotting nervous customers,.. so would make a little mark on the bag their film went in. When the pictures came back, he'd check for skintones. He wasn't often wrong. Apart from his wrongness in doing what he did, of course.

          1. Arkeo

            Re: People went to digital photography to get AWAY from this

            That's why I rarely use my cheap-a$$ Droid for photos--got a Nikon DSLR for that thankyouverymuch. I started to fear something like this would happen when Google nuked Picasa (local) in favour of GPhotos (online). My photo library (mostly sea- and city- and land-scapes anyway, occasional portraits) is safely offline. And why my ancient Picasa 3.9 installer is safely stored in every (offline, of course) backup I manually do. And I'm on Win11, but still using it. Call me old-school...

    3. Alumoi Silver badge

      Re: People went to digital photography to get AWAY from this

      So don't take a picture of your kid with the freaking phone, get a normal camera instead. You know, a specialized device for taking and storing pictures onboard.

      1. Dante Alighieri
        Black Helicopters

        Real cameras

        How long before iTunes starts scanning your whole hard disc / LAN for other infringing content

        no more baby pics on sheepskin rugs

        we all know how accurate the "AI" nipple detector is on faecesbook

        1. gandalfcn Silver badge

          Re: Real cameras

          How long before you find out what the process actually is?

          1. nijam Silver badge

            Re: Real cameras

            > How long before you find out what the process actually is?

            It's an AI system, so NO-ONE WILL EVER KNOW.

          2. Anonymous Coward
            Anonymous Coward

            Re: Real cameras

            Why don't they turn on the camera and remotely take a look at what suspected pedos are up to? Perhaps they're molesting kids WHILE Apple is busy going through their review process or paperwork or something.

            At least turn on the mic remotely and have a listen to save some kids from those pedo Apple customers!

I really don't get what the problem is: if you're not a pedo, hold up the phone and wave it around wherever you are, so an Apple employee can confirm "not molesting" to his own satisfaction. Perhaps that Apple employee will be as professional as gandalfcn.

            1. gandalfcn Silver badge

              Re: Real cameras

              Is there a camera in the cloud?

        2. Fred Dibnah
          Linux

          Re: Real cameras

          I can’t install iTunes on my Mint PC.

      2. Anonymous Coward
        Anonymous Coward

        Re: get a normal camera instead.

        lol, where apple treads, other businesses follow...

        1. gandalfcn Silver badge

          Re: get a normal camera instead.

          Remember Windows?

    4. CountCadaver

      Re: People went to digital photography to get AWAY from this

      UK said anime would land you on the sex offenders register as a "pseudo photograph of child abuse imagery / indecent image" along with "making child abuse imagery" (courts ruled that a computer downloading a file counts as "making" but the public see it as "taking images of children being abused" and the prudes keep pushing the term "kiddie porn" to make a link between pornography and child abuse so they can outlaw pornography....

      We are headed rapidly into a fascist dystopia....worse than the Inquisition....

    5. MalIlluminated

      Re: People went to digital photography to get AWAY from this

      An important lesson from those days was “always invite the girl at the photo counter to your parties.”

      Today, digital cameras exist, as do multiple means of online and offline storage. I can’t see where this invasion of my privacy will actually solve the stated problem.

      It seems the real problem is that people can’t be trusted not to do atrocious things. I’m thinking of torture and blowing up people in other countries because you don’t like them. So I think when the government is ready to allow me to root around in their phones and cloud storage for immoral behavior, perhaps a compromise could be reached.

      1. werdsmith Silver badge

        Re: People went to digital photography to get AWAY from this

        It won’t solve the problem. It’s an attempt to keep offending images off Apple devices and Cloud so Apple don’t have the responsibility for the problem.

  9. HildyJ Silver badge
    Big Brother

    It doesn't surprise me

Apple has a history of kowtowing to various countries' demands for suppression of or access to data.

    This seems like an attempt to avoid a US mandated backdoor with a "limited" front door.

    I wonder if the initial research was done to help China monitor their iPhone users.

    1. Mark 65 Silver badge

      Re: It doesn't surprise me

      In great need of another OS that can run on Apple hardware or great hardware with another OS. No, that simply isn’t Android.

    2. Tom Chiverton 1 Silver badge

      Re: It doesn't surprise me

      They stood, rightly, firm before.

      What's changed for them?

      1. Cuddles Silver badge

        Re: It doesn't surprise me

        Money. Never make the mistake of thinking Apple, or any other large corporation, has actual principles. They never cared about your privacy, they simply calculated that claiming so was a good PR point to compare themselves to competitors (ie. mainly Google). As long as they thought it would be more profitable to make a show about privacy than to join in the spying themselves, that's what they did. Now, they think there is more money to be made by making a show that they're thinking of the children than there is standing up for privacy. Nothing changed other than the details of a cost/benefit analysis by the bean counters.

      2. the Jim bloke Silver badge
        Joke

        Re: It doesn't surprise me

        Apple used to have a reputation for supporting privacy..

and Google wasn't supposed to be evil...

        Does this mean Greta Thunberg will be sponsoring NASCAR ?

    3. gandalfcn Silver badge

      Re: It doesn't surprise me

"Apple has a history of kowtowing to various countries' demands for suppression of or access to data." As do most. Your point being?

  10. karlkarl Silver badge

    It is basically Apple's clone of Microsoft Defender at this point. It only scans for DRM cracks etc. Doesn't give a damn about viruses.

    It is called Microsoft Defender because it defends Microsoft's stakeholders. Not you. Apple is merely copying.

    1. Anonymous Coward
      Anonymous Coward

      citation?

      1. karlkarl Silver badge

        Citation? No need. You can easily run the experiment yourself. Just try to copy a bunch of cracked games onto your desktop and observe the results.

        You will experience similar to:

        https://www.reddit.com/r/Piracy/comments/9xuzp5/windows_keeps_deleting_crack_files/

        https://www.bleepingcomputer.com/forums/t/752884/windows-defender-results-are-very-confusing-when-detecting-cracked-software/

        https://www.reddit.com/r/TPPcrack/comments/3lgvyj/windows_defender_just_keeps_fucking_up_the_crack/

        Then try to copy and execute a known malware or trojan. Very likely you will observe nothing at all (or at least until files start to go missing and adware appears) ;)

        1. Strahd Ivarius Silver badge
          Joke

You have this kind of issue only because you are using a hacked version of Windows XP

          1. karlkarl Silver badge

Which is worse in terms of privacy?

            A hacked Windows XP or a fully up-to-date Windows 10?

            I can never tell anymore ;)

  11. Phil Kingston

    "Apple will hold the unencrypted database of photos"

    That sound dodgy to anyone else? Are they getting some sort of special protection from law enforcement agencies who would otherwise be all over someone holding those images?

    1. gandalfcn Silver badge

"That sound dodgy to anyone else?" Only to the usual Apple haters and the terminally paranoid.

      1. Khaptain Silver badge

"'That sound dodgy to anyone else?' Only to the usual Apple haters and the terminally paranoid."

Are there really that many Apple haters who actually own and use an iPhone? Because if you don't have an iPhone, none of this applies...

        1. Lil Endian

          Stepping where David 132 didn't go:

          First they came for the Apple users, and I did not speak out...

          1. Anonymous Coward
            Anonymous Coward

            First they came for the Apple users, and I did not speak out...

            hell, I cheered and clapped, and held a street party!

            But then, they came for the street parties...

            1. Lil Endian
              Pint

              Re: First they came for the Apple users, and I did not speak out...

              ...and there was no one left sober to speak at all!

        2. gandalfcn Silver badge

I know quite a few Apple haters who own iPhones.

          1. MrBanana Silver badge

            Apple hater - yes, because of the company's principles. Why do I have two MacBooks? Work says so. Although the most broken one now runs Ubuntu.

      2. Anonymous Coward
        Anonymous Coward

@gandalfcn so you're fine if I nip around and have a rummage through your bedside cabinet then? I won't take anything, I just fancy a look around, OK? You must be "terminally paranoid" to say no.

        1. gandalfcn Silver badge

          No comparison whatsoever, but it seems you are paranoid, so thanks for proving me correct.

A better comparison would be those who buy things that spy on them, like security systems and TVs with cameras.

      3. Cereberus

        Semantics

Apple will not hold an unencrypted copy of the database :)

        Apple will have the ability to remove files from the database and decrypt them, but the database itself will remain encrypted.

        "Only to the usual Apple haters and the terminally paranoid"

        Does terminally paranoid mean you aren't paranoid enough? After all you were paranoid and thought everyone was out to get you. You were right but didn't take enough precautions and they got you.

        1. gandalfcn Silver badge

          Re: Semantics

          Did you have the pencils up your nose when you typed that?

          1. Tridac

            Re: Semantics

            Looks like the shill for a certain repressive regime is out and busy today then ?...

          2. davidp231

            Re: Semantics

            "Did you have the pencils up your nose when you typed that?

            Wibble.

            1. Lil Endian
              Pint

              Re: Semantics

              David!

              Did you post that from a small village on Mars, just outside the capital city?

              1. davidp231

                Re: Semantics

                "Did you post that from a small village on Mars, just outside the capital city?"

                Wibble wibble.

                1. Lil Endian
                  Coat

                  Re: Semantics

                  Ah! You do arithmetic too!

                  [Mine's the one with the ticket to Hartlepool in the pocket!]

          3. FozzyBear

            Re: Semantics

            Don't you dare belittle an Australian kids icon Mr Squiggle

    2. Splurg The Barbarian

I would be very, VERY surprised if it is actually a database of photographs/images. It will more than likely be a database of known hashes of offending images, added to by law enforcement agencies as they examine hardware and identify these images.

      Same is done in UK with our version.
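      For illustration, matching against such a hash database is just set membership. A minimal sketch (the hash value below is simply the SHA-256 of the bytes b"test", standing in for a real law-enforcement-supplied entry; production systems such as PhotoDNA use perceptual hashes, because a plain cryptographic hash only matches byte-identical files):

      ```python
      import hashlib

      # Placeholder database: in reality the hashes come from law enforcement,
      # and the images themselves are never distributed.
      KNOWN_HASHES = {
          "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
      }

      def sha256_of(data: bytes) -> str:
          return hashlib.sha256(data).hexdigest()

      def is_known(data: bytes) -> bool:
          """Return True if the file's hash appears in the database."""
          return sha256_of(data) in KNOWN_HASHES

      print(is_known(b"test"))     # True  (its hash is the placeholder entry)
      print(is_known(b"holiday"))  # False
      ```

      Note the holder of the hash list never needs to see your photos; only the hashes leave the device.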

      1. gandalfcn Silver badge

Correct, but you are dealing with Apple-hating paranoids who don't even know what the process is. So much for the intellect and critical thinking abilities of a certain IT crowd.

        1. Tridac

          But we know that Apple has years of form for being a greedy, grasping and dictatorial company, but go ahead fawn over and worship them if you will...

          1. Splurg The Barbarian

            I certainly do not support Apple, far from it. I have many, many criticisms of Apple and the "cult". But anyone believing that Apple have been given a database of indecent images depicting the sexual abuse of children needs to give their head a wobble. The authorities, certainly in the UK which I have professional experience of the processes, have a national database of known images in which the hashes are stored NOT the images. This is what it is compared to and will be flagged up.

MS do this in their cloud offerings and have done so for at least a decade. The issue for me is that, from the statement, it appears this will be done on the DEVICE, to images set for upload to iCloud. And once the precedent is set for doing this on an individual's device, what comes next?

            1. doublelayer Silver badge

              You can't train an AI on hashes to detect new offending material, which is what they said they wanted to do. I don't think they're going to do anything other than create that model from the data, but if they're doing it at all, they'll need a method of running it on the real pictures. They could easily develop this on a database which they don't hold and from which they can't extract the images without sending out an alarm, so it doesn't mean they're storing it themselves or in perpetuity.

      2. Ian Mason

        You can't train an AI on hashes, it has to have the original images.

        In the UK at least, as originally put into law, mere possession with no regard to the intent of possessing such images is a criminal offence.

        This originally led to a regime of selective prosecution just to work around the sheer stupidity that the police were committing criminal offences by retaining the same as evidence. I believe that particular stupidity has been legislated away, but mere possession is still strictly illegal for individuals/companies whether they know they are in possession or not, and whether they are in possession for what anybody would see as a legitimate purpose (e.g. to create hashes, preserve evidence to hand to the police etc.). Witness the senior Met. police officer (Ch Supt Novlett Robyn Williams) who was prosecuted for possession when she claimed not to even know that someone had sent her the material.

  12. Chris Gray 1
    Stop

    Bandwidth!

    I don't own any Apple devices (surprisingly, I'm OK with the walled garden, but can't afford them and since I run Linux, which has poor support for getting images out...), and it now looks like I never will.

    My current cell-phone is an 8-year old Samsung S4, and my data plan is tiny by most standards. Any attempt by Google to do this sort of thing will result in me going back to a "feature phone", for purely financial reasons.

    1. gandalfcn Silver badge

      Re: Bandwidth!

      Good thinking.

  13. Anonymous Coward
    Anonymous Coward

    Double Plus Good

    Right Think is about to be Enforced

    1. Anonymous Coward
      Anonymous Coward

      Re: Double Plus Good

      Along with EngSoc and in Scotland NatSoc (wait that already is........)

  14. sreynolds

    So the 1984 ad was their mission statement?

It always begins with the "best intentions". Do it for the kiddies and such. Obviously, if you are against this, then you must be a kiddie fiddler - or something like that goes the usual argument. It's the thin end of the wedge.

  15. Kevin McMurtrie Silver badge

A new iPhone meaning for "jailbreak"

Take 50 photos of family kids playing. Find that they're all blurry and delete them. Get arrested by the FBI because AI found illegal content and you deleted both their evidence and your defense.

    1. Anonymous Coward
      Anonymous Coward

      Re: A new iPhone meaning for "jailbreak

      It's a good idea, in principle, just like nuclear energy - can be a force for good but some people will find ways to make it otherwise.

My initial concern, even before it gets deliberately misused, and like a few others have commented, is how pictures of your own children/grandchildren will be treated - the ones of them playing in the bath, or toddlers on the beach, for example. Someone has already referenced the Julia Somerville incident; if it's all automated, such cases will escalate before common sense gets a chance.

      1. gandalfcn Silver badge

        Re: A new iPhone meaning for "jailbreak

        "Apple’s tool, called neuralMatch, will scan images before they are uploaded to the company’s iCloud Photos online storage"

        "comparing them against a database of known child abuse imagery. If a strong enough match is flagged, then Apple staff will be able to manually review the reported images, and, if child abuse is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children (NCMEC) notified.

        Since the tool only looks for images that are already in NCMEC’s database, parents taking photos of a child in the bath, for example, apparently need not worry. But researchers worry the matching tool – which does not “see” images, just mathematical fingerprints that represent them – could be put to different purposes."
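        To make the "mathematical fingerprints" idea concrete, here is a toy difference-hash (dHash) sketch: each fingerprint bit records whether a pixel is brighter than its neighbour, and fingerprints are compared by Hamming distance. Everything here (the tiny 2x3 brightness grids, the implication that NeuralHash reduces to this) is a simplification for illustration - Apple's actual fingerprints come from a neural network:

        ```python
        # Toy "mathematical fingerprint": 1 bit per adjacent-pixel comparison.
        def dhash(pixels):
            bits = 0
            for row in pixels:
                for left, right in zip(row, row[1:]):
                    bits = (bits << 1) | (1 if left > right else 0)
            return bits

        def hamming(a, b):
            """Number of differing bits between two fingerprints."""
            return bin(a ^ b).count("1")

        original        = [[10, 20, 30], [30, 20, 10]]
        slightly_edited = [[11, 21, 29], [31, 19, 10]]  # re-encoded / resized copy
        unrelated       = [[90, 10, 80], [5, 70, 2]]

        print(hamming(dhash(original), dhash(slightly_edited)))  # 0: still matches
        print(hamming(dhash(original), dhash(unrelated)))        # 2: differs
        ```

        The point researchers make is that such fingerprints survive small edits - which is exactly what makes the matching list repurposable for content other than child abuse imagery.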

        1. Anonymous Coward
          Anonymous Coward

          Re: A new iPhone meaning for "jailbreak

Problem is, that will weaken Apple's 'sorry, we have no way to break into the perp's phone' excuse for refusing to unlock or hand over data, as they have in the past.

          If Apple staff can review images, then why can't <TLA> have access to everything at the drop of a court order?

          1. gandalfcn Silver badge

            Re: A new iPhone meaning for "jailbreak

            Do you know the difference between a cloud and a phone?

        2. Anonymous Coward
          Anonymous Coward

          Re: A new iPhone meaning for "jailbreak

No, it uses an AI model to *approximate* that image set. So yeah, naked kids in the bath are a likely source of false positives.

          I don't know how it works from there - you get flagged as a pedo or something? An officer visits and demands access to your phone on the basis that you're a pedo and he has probable cause provided by Apple. They'll presumably pore over any images and text and emails and browser history and porn surfing and so on to assess how much of a pedo you are? Or perhaps you're some other criminal?

          I don't know if Apple will automatically pull your data from your phone for them, so this search might happen remotely without your knowledge. If you don't see it, it doesn't count, right?

          But that's OK, they're protecting your kids from you, and you did mention how you like to take naked photos of your kids in the bath, under the nom de plume 'Gandalf', which I have to say is awfully suspicious!

          1. CountCadaver

            Re: A new iPhone meaning for "jailbreak

No, you'll likely get your door kicked in and officers storming in, loudly announcing that you're a suspected child molester...

    2. gandalfcn Silver badge

      Re: A new iPhone meaning for "jailbreak

Thanks for telling us you don't have a clue about the process.

  16. uncle grumpy

After fighting off Ring and Alexa, my hardware has risen against me. Will there be any place to migrate to? Of course not, the dominoes will fall in rapid succession. Bill Barr must be jumping for joy. Imagine the zeal the next regime will apply, although I'm sure the current regime is supportive as well. I'm so pissed at the betrayal of these weasels I can barely see straight.

    1. gandalfcn Silver badge

      Apple’s tool, called neuralMatch, will scan images before they are uploaded to the company’s iCloud Photos online storage, comparing them against a database of known child abuse imagery.

      1. doublelayer Silver badge

Yes, we got that. It doesn't mean what you think it means. The "neural" bit in the name is there for a reason: it's not just looking up hashes. That would be a Bloom filter. This isn't one.
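        For the curious: the Bloom filter mentioned above is a bit-array structure that answers "definitely not present" or "possibly present" for exact hash lookups - cheap and private, but useless for the fuzzy matching a neural model performs. A minimal sketch (the bit-array size, hash count, and example items are all arbitrary choices for illustration):

        ```python
        import hashlib

        class BloomFilter:
            def __init__(self, size_bits=1024, n_hashes=3):
                self.size = size_bits
                self.n = n_hashes
                self.bits = 0  # the whole filter is one big integer of bits

            def _positions(self, item: bytes):
                # Derive n independent bit positions from the item.
                for i in range(self.n):
                    h = hashlib.sha256(bytes([i]) + item).digest()
                    yield int.from_bytes(h[:8], "big") % self.size

            def add(self, item: bytes):
                for pos in self._positions(item):
                    self.bits |= 1 << pos

            def might_contain(self, item: bytes) -> bool:
                # False means definitely absent; True means *possibly* present.
                return all(self.bits & (1 << pos) for pos in self._positions(item))

        bf = BloomFilter()
        bf.add(b"known-image-hash-1")
        print(bf.might_contain(b"known-image-hash-1"))  # True
        print(bf.might_contain(b"never-added"))  # False, with overwhelming probability
        ```

        A filter like this can ship a large hash list to a device compactly, at the cost of a small false-positive rate - but it can only ever match exact values, not "similar-looking" images.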

  17. Clausewitz 4.0
    Devil

    Solution

    Just like what Signal Private Messenger did with "aesthetic" messages for Israeli-Cellebrite (this chapter isn't finished, expect more news), the solution here is also simple.

    Phone-Devs can embed hundreds of digitally-created-naked-children-fake-photos (not real ones) into files not viewed by the user, including fake geotags like for example FBI offices, Apple offices, or even the Pentagon.

    I do not endorse adult games with children, but this tech must go.

    1. Anonymous Coward
      Anonymous Coward

      Re: digitally-created-naked-children-fake-photos (not real ones)

there are steps to make this illegal too. I mean, chopping a digitally created man's head off, stamping on it, turning it into pulp (possibly to the sound of digital onlookers applauding, etc.) - perfectly legal. Though you have to pay to play the... 'game' (yeah, let's call it a 'game', because it's advertised as a game, eh). Raping digital children, probably not quite illegal, but... It's an interesting, and not one-sided, argument.

  18. Pirate Dave Silver badge
    Pirate

    "I don't know exactly what the neural network does"

    Those are very, very important to machine-learning for the T-800 class.

    So, err, Apple made a huge stink in the media for a few months about refusing to unlock a phone for the FBI (or was it the CIA/NSA?), part of which, if memory serves, they claimed was because of "customer privacy" and end-user "trust" of Apple. But then they take it upon themselves to scan everything on EVERYONE'S iPhone in the search for kiddie-porn? So if they find such pics, they won't alert the authorities, right? Because "customer privacy" and "trust" and all that. Right?

    I guess next year, they can focus on searching for pics of animals being abused. The year after, they can look for wives/girlfriends being abused. Followed by searches of pics of supporters at unpopular political party conventions. Damn, if they'd just turn on the mic as well and record every sound, they'd add a whole new dimension to their search endeavors, and would win great favor with the Party.

    Apple has gotten half of the population voluntarily addicted to the greatest societal-suppression device ever invented. If someone checked, I bet Stalin's corpse has got a full woody right now.

    1. CountCadaver

      *child abuse imagery

      Porn legal, child abuse illegal

      1. Splurg The Barbarian

        Yup. Worked as a Forensic Computer Analyst for a police force, anyone using the phrase "child pornography" was always corrected. The phrase was never used in the department by us.

Pornography is legal and used by people for titillation. These are "indecent images depicting the sexual abuse of children". I always feel the term pornography diminishes a little the watching, collection and creation of these images.

        1. Anonymous Coward
          Anonymous Coward

          That's the problem with images whose sexual or non-sexual nature depends on the arousal of the viewer.

          In your heads you're trained, infallible professionals who never map your own sexuality onto your judgement. Yet I bet lots of the images you class as kiddie porn don't involve actual sex, and the sexual component is in your (the viewer's) head. In effect your sexuality is filling in the blanks to turn it into a crime.

          Can you sell your claim? Well, you don't need to, because having classed the images as kiddie porn, they cannot be viewed by the general public to pass judgement on your claim. This is why an accusation does the damage here.

          "Pornography is legal and used by people for titillation. "

          Bestial porn? Let's call it "animal abuse", shall we? I'm sure it will be added to the image set at some point. Best to do the marketing now.

          Obviously if you're an iPhone user this is a real danger. You need to view your photos through the eyes of the perverted, childless Apple employee who might be viewing them, the officer keen to keep this suspicionless search going, and the AI that's been designed to return false positives.

  19. DS999 Silver badge

    I'll bet this has to do with Section 230

    The gist seems to be that it only scans photos when they are uploaded to iCloud (but does the scanning on your phone, before uploading). If you don't use iCloud, no scanning. That makes this pretty clearly targeted at Apple's (and their iCloud partners') potential liability if Section 230 changes are coming - which is likely in some form given that both parties in the US want to see reform. I wonder if they got a heads-up from legislators about the content of a forthcoming bill?

    From what I saw in other articles, it will sort of keep a count of photos it thinks are potentially child abuse related, and only have a human review them if there are a sufficient number. So a few false positives of your kid in the bath hopefully won't be a problem. And in any case Apple is only able to look at what is on iCloud, not what is on your phone itself (well, theoretically they COULD look at what's on your phone, the same way that Microsoft could access everything on your PC since they control the software, but that's not how this appears to work - it relies on Apple's existing ability to decrypt iCloud backups).
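The "count matches, only escalate past a threshold" flow described above can be sketched roughly like this. Everything here is a made-up illustration - the hash function (real systems use perceptual hashes, not SHA-256), the database contents, and the threshold of three are all placeholders, not Apple's actual protocol, whose details are not public:

```python
import hashlib
from collections import defaultdict

# Hypothetical stand-ins for the real database and review threshold.
KNOWN_HASHES = {hashlib.sha256(b"known-bad-image").hexdigest()}
REVIEW_THRESHOLD = 3

match_counts = defaultdict(int)  # running match count per account

def check_before_upload(account: str, image_bytes: bytes) -> bool:
    """Hash the outgoing image; return True once the account's match
    count reaches the threshold at which a human review would trigger."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in KNOWN_HASHES:
        match_counts[account] += 1
    return match_counts[account] >= REVIEW_THRESHOLD
```

The point of the threshold is the property the comment describes: a single false positive stays below it, and only an accumulation of matches escalates an account.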

    1. gandalfcn Silver badge

      Re: I'll bet this has to do with Section 230

      Correct "Apple’s tool, called neuralMatch, will scan images before they are uploaded to the company’s iCloud Photos online storage, comparing them against a database of known child abuse imagery."

      but the Apple haters and terminally paranoid just knee-jerked rather than using their brains.

    2. Androgynous Cupboard Silver badge

      Re: I'll bet this has to do with Section 230

      Yes I was wondering that too. Apple will know that no matter how well intentioned this is, it's not the kind of stuff people want running on their phone - everyone else's is fine, of course. So I'd assumed this was more about their liability for storing the content. Suspicion largely confirmed then.

      1. gandalfcn Silver badge

        Re: I'll bet this has to do with Section 230

        Cloud, not phone

        1. Lil Endian
          FAIL

          Re: I'll bet this has to do with Section 230

          Instead of scanning images in the cloud, the system performs on-device matching...

          "On-device" is mentioned a dozen times in Apple's own statement.

          You clearly have not done any research.

          1. DS999 Silver badge

            Re: I'll bet this has to do with Section 230

            It does on-device matching, but only of photos being uploaded to iCloud (to prevent such photos from being uploaded to iCloud).

            The connection to Section 230 liability couldn't be more obvious unless they'd specifically called it out.

    3. Irongut Silver badge

      Re: I'll bet this has to do with Section 230

      Section 230 protects providers against dodgy user-submitted content on websites. Your iCloud backups are not shared publicly, I assume, so it does not apply here.

      1. DS999 Silver badge

        Re: I'll bet this has to do with Section 230

        Section 230 is a lot broader than that, and would leave a lot more liability gaps if it was fully repealed as it is the only thing that protects anything done "on a computer". The phone company is not liable if I call up the president and make a death threat, due to a separate law passed many decades ago, before computers even existed. There would be no equivalent law protecting Google if I used Gmail to email him such a threat if section 230 was repealed. Whether the action is public or not has nothing to do with it.

        Transmitting child porn across state lines is against federal law. So without some type of liability shield for the carrier (i.e. like the laws holding Fedex and other delivery services not responsible for unknowingly delivering child porn polaroids from one of their customers to another) they would be in violation of the law.

  20. ThunderCougarFalconBird

    This can create an unwanted precedent. It is possible to push files to someone's device due to the web caching function all browsers utilize. There's an HTML technique that allows a website to pre-load images in preparation to display on a later page. Or not at all. I was able to push images to people's computers just by getting them to go to a survey page I set up. While they were filling out the survey, I was dumping hundreds of questionable images from the site "Stile Project" onto their computers. Then, when I asked them if they had any NSFW images on their computers, they adamantly said no. I then went to the cache and pulled out all the images I had loaded onto their computers.

    If you have someone you want to get in trouble, then you can do the same thing with this silly Apple scan. The bad part is that with my method, any human looking at where the image is located (the web cache) would be acutely aware that this was pushed to the device without the user's knowledge... but a machine has no such consideration. Machines just do. They don't think. This can be a real problem.
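The pre-loading trick described above can be illustrated with a small page generator. The survey markup and image URLs are placeholders, and this only builds the HTML string - it fetches nothing; the browser visiting such a page is what quietly fills its cache:

```python
def survey_page_with_hidden_preloads(image_urls):
    """Build an innocent-looking survey page whose <head> asks the
    browser to prefetch (and therefore cache) images it never displays."""
    preloads = "\n".join(
        f'  <link rel="prefetch" href="{url}">' for url in image_urls
    )
    return (
        "<html><head>\n"
        f"{preloads}\n"
        "</head><body>\n"
        "  <form><!-- survey questions go here --></form>\n"
        "</body></html>"
    )
```

Nothing on the rendered page hints at the prefetched images, which is exactly why a cache-scanning machine can't tell consent from planting.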

    1. Jason Bloomberg

      The bad part is that with my method, any human looking at where the image is located (the web cache) would be acutely aware that this was pushed to the device without the user's knowledge

      That's where I hide my dodgy stuff

      Just kidding, but any prosecutor worth their salt will argue that's exactly what defendants do in order to gain plausible deniability, hoping to cast reasonable doubt on their guilt.

      If you are using an 'I didn't put it there' defence in court then it obviously hasn't convinced prosecutors and there's no guarantee it will convince a judge or jury.

    2. Irongut Silver badge

      > any human looking at where the image is located (the web cache) would be acutely aware that this was pushed to the device without the user's knowledge...

      Any human finding a dodgy image in the web cache should realise that it was cached by the browser while the user was intentionally looking at dodgy websites. Oh dear, your hiding place actually incriminates you more.

  21. Draco
    Windows

    I'm sure this builds on Apple's robust and secure Face ID tech ...

    ... and any false positives are a rare occurrence which, I am sure, you can easily clear up on your own.

    https://bit.ly/3fxRLsr

    1. Anonymous Coward
      Anonymous Coward

      Re: I'm sure this builds on Apple's robust and secure Face ID tech ...

      @Draco, downvoted for mystery short-link that leads to who knows what (perhaps especially pertinent given the subject of this discussion!).

      1. Draco
        Windows

        Re: I'm sure this builds on Apple's robust and secure Face ID tech ...

        Here is the "mystery" link without being shortened:

        https://www.thesun.co.uk/news/5182512/chinese-users-claim-iphonex-face-recognition-cant-tell-them-apart/

        ---

        You can check a bit.ly url by appending a + to it. This causes bit.ly to show you the original URL and the date it was created.

        Mystery URL: https://bitly.com/3fxRLsr

        Mystery URL revealed: https://bitly.com/3fxRLsr+

        But ... don't take my word for it. Create a short URL at bit.ly, copy it, paste it into the address bar, append a +, press enter and see bit.ly reveal the original address and creation date of the short address.
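The "+" trick above, plus a more general stdlib-only way to see where any short link points without visiting the destination, can be sketched like this (`peek_location` is a hypothetical helper name, and its HEAD request does touch the network):

```python
import urllib.request
import urllib.error

def bitly_preview_url(short_url: str) -> str:
    """bit.ly shows the destination and creation date at <short link>+."""
    return short_url.rstrip("/") + "+"

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Refuse to follow redirects so we only ever see the first response."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def peek_location(short_url: str) -> str:
    """Return the Location header of the first response, without following it."""
    opener = urllib.request.build_opener(NoRedirect)
    request = urllib.request.Request(short_url, method="HEAD")
    try:
        response = opener.open(request)
        return response.headers.get("Location", "")
    except urllib.error.HTTPError as err:
        # With redirects disabled, a 301/302 surfaces as an HTTPError
        # whose headers still carry the Location we want.
        return err.headers.get("Location", "")
```

Either way you learn the target URL without ever loading whatever is behind it.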

        1. Anonymous Coward
          Anonymous Coward

          Re: I'm sure this builds on Apple's robust and secure Face ID tech ...

          Thanks for that, but, ugh, so it was a link to The Sun, that's dodgy content even more vile than I had thought of! ;-)

          (Actually, just reading the full link text says enough about what the article is about, that we wouldn't have to sully ourselves by actually following the link, thankfully.)

        2. dave 76

          Re: I'm sure this builds on Apple's robust and secure Face ID tech ...

          You can check a bit.ly url by appending a + to it. This causes bit.ly to show you the original URL and the date it was created.

          --------------------------------------------------------------------------

          It's not worth the effort, I just ignore all bit.ly links and never follow them. If it is important, send me the full link so that I can at least visually verify that it is going to the right site.

  22. Dinanziame Silver badge
    Unhappy

    How are they training their model??

    We know how the hash technique works, and that doesn't require Apple holding a training dataset.

    Considering even humans have been known to disagree on what was objectionable or not, I wouldn't trust an ML model to do the job... Especially when false positives have such consequences.

    1. gandalfcn Silver badge

      Re: How are they training their model??

      "Apple’s tool, called neuralMatch, will scan images before they are uploaded to the company’s iCloud Photos online storage, comparing them against a database of known child abuse imagery. If a strong enough match is flagged, then Apple staff will be able to manually review the reported images, and, if child abuse is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children (NCMEC) notified."

      1. Anonymous Coward
        Anonymous Coward

        Re: How are they training their model??

        So Apple staff will trawl through your private photos and violate your privacy because their software made a false accusation against you, potentially falsely flagging you as a pedo in their opinion - a life-changing, disastrous consequence.

        And when they fail to catch the first pedo with their review? The review will be the first thing ditched. All false-positive flags will be passed 'just in case', and all such customers will be labelled as pedos sans review.

        "National Center for Missing and Exploited Children"

        But these are preexisting images according to you, not missing kids. You keep changing your justification.

        Customers know they're not pedos; they don't know if Apple's AI correctly identifies that, or if Apple's staff will falsely identify a person as underage, or label an image as abuse just to be on the safe side, for Apple corporate policy reasons.

        Why would they trust Apple, if Apple does not trust them?

        Why would they trust you? You seem to keep changing your claim, and if you do reviews of images, you might be equally as prone to flights of fancy to justify a position you incorrectly took sans evidence.

  23. YetAnotherJoeBlow Bronze badge

    Be afraid...

    If I were a reporter using iMessage, I would be very afraid - so much so that I would stop using iPhones.

    1. gandalfcn Silver badge

      Re: Be afraid...

      Why?

      1. Androgynous Cupboard Silver badge

        Re: Be afraid...

        The Saudis chopped up Khashoggi, a Washington Post journalist, in their consulate and got in a world of pain for it(*). Seems like a lot of work to me - why not just use NSO's Pegasus to get access to their phone, upload some kiddie porn to it and let the Feds sort him out?

        (*) OK, not as much as they should have.

  24. Anonymous Coward
    Anonymous Coward

    As bad as the crime is

    Does the whole user base really need to be treated as potential suspects?

    “You’re all guilty, it’s just that we haven’t proved it yet”

    1. Jason Bloomberg
      Joke

      Re: As bad as the crime is

      Don't worry - If you have done nothing wrong you have nothing to fear.

      1. DuncanLarge

        Re: As bad as the crime is

        Until somebody decides that they, in their wisdom, say you have done something wrong. According to what laws? What sensibilities? What culture?

  25. TVC

    No doubt systems have improved but..

    Quite a few years ago my corporate system was able to scan graphics in email looking for pornographic images and forward any it found to my IT team. Apart from the tit and bum photos, there were loads of innocent false-alarm pictures of kids in baths etc., taken by parents.

    It will only take one false alarm to screw up someone's life or career.

    1. Anonymous Coward
      Anonymous Coward

      Re: It will only take one false alarm to screw up someone's life or career.

      It only takes one ALLEGATION on social media to screw up someone's life or career.

      ...

      arguably, I would say: it serves him / her right (all of you, really), for being on 'social media' in the first place, but then, some jobs make it virtually mandatory to be there...

      1. Anonymous Coward
        Anonymous Coward

        Borders & Employers

        When entering some countries, a lack of any social media profile is a red flag and a long wait and search.

        And without it you won't even get short-listed, let alone interviewed (although I also know of a successful candidate being excluded when the Chair of the place saw a couple of night-out pics from when they were much younger and decided "that won't do").

        Call me Marvin, and yes the diodes do ache.

        (black helicopter)

      2. Anonymous Coward
        Anonymous Coward

        Re: It will only take one false alarm to screw up someone's life or career.

        "arguably, I would say: it serves him / her right (all of you, really), for being on 'social media' in the first place, but then, some jobs make it virtually mandatory to be there..."

        Which, if you want to see the benefit payments keep coming, you *have* to use as one of your sources for finding work.

      3. Anonymous Coward
        Anonymous Coward

        Re: It will only take one false alarm to screw up someone's life or career.

        "It only takes one ALLEGATION on social media to screw up someone's life or career."

        Too right! Even if you only fuck a sheep once, you never get to hear the end of it. :-)

        1. Anonymous Coward
          Anonymous Coward

          Re: It will only take one false alarm to screw up someone's life or career.

          what do you get when you mix human and sheep dna?

          banned from the petting zoo

  26. fpx
    Devil

    Nothing to Worry About

    There is only a low probability of a false positive.

    After the SWAT team breaks down your door at 4 am and confiscates all your PCs and phones and other electronics, it will only take them a few months to scan it. Then you will only have to answer a few curious questions about "can you explain *this* and *that* on your hard drive" even though this and that has nothing to do with the original find.

    No problem, that will all clear up after only a few years. You will be unable to work without your gear, and everybody around you will be very suspicious, but that is a small price to pay for society as a whole.

    Low probability times a few billion users? Meh.

    1. CountCadaver

      Re: Nothing to Worry About

      Or the local "child safeguarding activists" will beat you to death or at the very least vandalise your property and force you out of your home to "protect the children from this monster" egged on by the local "newspaper"

    2. the Jim bloke Silver badge

      Re: Nothing to Worry About

      Let's not forget malicious SWATting, where some dipshit makes false 911 calls and armed SWAT teams descend on the victim.

      As mentioned earlier, incriminating images can be pushed to a target phone.

      The only way this technology would be acceptable is if we could trust both those implementing it and those using it - and that just isn't going to happen.

    3. Anonymous Coward
      Anonymous Coward

      Re: Nothing to Worry About

      "After the SWAT team breaks down your door at 4 am and confiscates all your PCs and phones and other electronics, it will only take them a few months to scan it."

      At a previous company, we reported receiving dodgy images to our publicly-published email accounts. The police arrived and seized the hard disks from the email server and the computers of the staff with those accounts. They told us we'd get the disks back when they finished investigating.

      That was in 2005. They are still waiting.

      More worrying on a personal level: in the UK, if the suspect arrested on suspicion of this sort of offence has children, they are not allowed to remain in their own home while the investigation takes place. That puts an extra twist of the knife on delayed conclusion/justice.

  27. Thought About IT

    File ownership

    I've never considered any file that's created by my actions on an iPhone to be mine, because I can't go to its filesystem and copy it to anywhere else. That's one of the features I really miss from my Windows phone.

  28. Chris Hills
    Holmes

    Question

    How can you make a model to differentiate between children and little people? I have a hard time believing it is possible, and the ramifications could be severe for innocent people.

    1. Splurg The Barbarian

      Re: Question

      No comment on the rights or wrongs of this announcement, but in answer to your question: very easily. The system from the announcement isn't using AI to scan photographs, it is comparing against a national database of known indecent images of children. This has been created by uploading hash values of images found by human examiners, and will have been built up over years. The idea behind it is that it limits the amount of exposure examiners have to indecent images depicting the sexual abuse of children, as they only have to deal with previously unseen images or edited versions of previously known images.

      This is how it works in the UK, with the UK's image database.

  29. Headley_Grange Silver badge

    Date?

    I honestly thought this was a misplaced April Fool article when I read it.

  30. chivo243 Silver badge
    FAIL

    I just scanned my phone

    75% of the photos on my phone are of switch panels, cables in ceilings, broken connectors etc., and about 15% are vacation photos. I saw one with my son and a friend's daughter in the swimming pool. Should I be worried? Should I offload that pic to non-Apple storage?

    What a slippery slope. I'm all for protecting children, but where is the better way?

    1. tip pc Silver badge
      Big Brother

      Re: I just scanned my phone

      “ 75% of photos on my phone are of switch panels, cables in ceilings, broken connectors etc”

      We are both obviously deviants!!

      Can you imagine the interrogation by someone who doesn’t understand?

      I’ve also studied chemistry & physics; someone brought up on MacGyver would have me locked up for a very long time.

      I also take photos of wiring harnesses and inside machines etc before disassembly to ensure they go back the same way. I’ve seen house wiring with connections going to the wrong colours, someone sometime checked and corrected a mistake but left the wrong colour sheaf. A quick photo ensures the correct sheaf can be put on the correct wire when time comes to reconnect!

    2. Splurg The Barbarian

      Re: I just scanned my phone

      No, you shouldn't, with regard to that specific question. Unless the image matches a hash value of a known image that is stored in the US image database, it won't be flagged up under any circumstances.

      Microsoft do, and have done, this on their cloud storage systems for at least a decade, but it isn't really shouted about.

      Whether the system will stay at that, and whether there will be any on-device processing regardless of whether the images are to be sent to iCloud, are the main questions I have.

      1. doublelayer Silver badge

        Re: I just scanned my phone

        The existing checks are for hashes. The Apple check is using a neural AI to scan imagery. They are not the same. The quality of their AI is not yet known.
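The distinction being drawn here can be shown with a toy example: an exact hash changes completely on a one-pixel edit, while even a crude perceptual "average hash" (a deliberately simplified stand-in for robust systems like PhotoDNA) barely moves:

```python
import hashlib

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if above the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

original = [[10, 200], [220, 15]]  # tiny 2x2 grayscale "image"
edited = [[11, 200], [220, 15]]    # one pixel nudged by one level

exact_original = hashlib.sha256(bytes([p for r in original for p in r])).hexdigest()
exact_edited = hashlib.sha256(bytes([p for r in edited for p in r])).hexdigest()

# The exact hashes differ completely, so a naive lookup misses the edit;
# the perceptual hashes stay within a small Hamming distance, so a
# distance-based match still catches it.
```

That tolerance to edits is precisely what makes perceptual matching useful against cropped or re-encoded copies, and also what opens the door to false positives that an exact hash could never produce.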

  31. aerogems
    Mushroom

    I applaud the effort

    The intent is a good one on this, assuming it's not just some fevered rantings of a conspiracy nutter, but once you open that door it will only be a matter of time before it is applied to other things. I also have a pretty big problem with the idea of the assumption of guilt this implies, and the fact that there is surveillance being done on people without any sort of judicial oversight... you know, those pesky warrant things that require showing probable cause, and sets strict limits on what the police can search for. Much easier to just assume everyone is a potential kiddie porn consumer/distributor and go on endless fishing expeditions until you finally find something.

    Mushroom cloud because I figure that is about how well this will go over with people if the idea isn't murdered in the womb.

  32. tip pc Silver badge

    Everyone is now under suspicion

    This is a shock.

    For a company that has been holding the candle on privacy for so long, if they will now treat everyone like a potential suspect I no longer want to be funding them.

    I also thought iCloud was end to end encrypted. I’m shocked it’s not.

    I’m looking at redoing my home server stuff and was looking at the Mac mini. I might still consider it if I can run something other than macOS on it.

    1. cupplesey

      Re: Everyone is now under suspicion

      I think it is, but at the other end is Apple's mothership... have they had a backdoor key for themselves and the US government to use all along? Can we really trust these companies?

    2. chivo243 Silver badge
      Meh

      Re: Everyone is now under suspicion

      I don't think running macOS will be the problem; just don't connect it with your AppleID - never, ever do that... It's that the iPhone is tied to the AppleID.

      I am starting to feel as I've been maneuvered into a safe place that might not be so safe from another perspective.

      1. Arkeo

        Re: Everyone is now under suspicion

        Your reasoning is sound, but practically impossible for the average Joe/Jane, and frankly a pain in the neck even for a skilled user on Droid: you can make a useless, or basically fake, Gmail account just to activate the phone and use, say, mailbox.org or whatever for real email. But then Gmaps (basically the only G-feature I use) would still link your movements to your Gmail account and therefore to your phone number and IMEI...

        So we'd still be fsck'd, wouldn't we?

        If I'm wrong please correct me...

        Cheers

  33. Pascal Monett Silver badge
    FAIL

    "scanning individual users' iPhones"

    I'm sorry, on what authority ?

    Has Apple been integrated into a special Police branch ?

    What right does Apple have to scan individual users' private property and report the results ?

    Another case of a tech giant making social and police decisions on its own, without any mandate to do so.

    I was never interested in Apple gear.

    Now Apple is on my blacklist.

    1. SImon Hobson Silver badge

      Re: "scanning individual users' iPhones"

      They just have to put it in the "no-one has a couple of days spare to read it all" agreement you have to sign before any modern stuff works, and it becomes legal - as in "we can do it, we asked for permission (on page 273 of 425 pages) and you said yes".

      https://www.onelegal.com/blog/fantastic-clauses-hidden-in-contracts-and-eulas/

  34. tip pc Silver badge

    iCloud private relay? Can we trust anything fruity at all?

    What now for iCloud private relay?

    What about iMessage

    What about faceid

    The Photos app has been scanning faces in photos for years; will it upload those recognition hashes to the NSA now?

    Will I get my door smashed in because an algorithm got it wrong & the humans in the mix needed to make their targets?

    Do I have to now roll my own everything to ensure I retain privacy?

    I don’t use TOR as I see that as joining in with unsavoury types, but this move is pushing more people in that direction whether they like it or not.

    Unintended consequences, a little like banning the sale of cigarettes, normal people who never touched illegal stuff will suddenly be more likely to interact with drug dealers to keep their habit going.

  35. mark l 2 Silver badge

    Typical 'won't someone think of the children' response to give yourself permission to search through tens of millions of innocent users' photos to find a handful of law-breaking people. Of course, if you object to it you are siding with the pedos. Yet no doubt those who did use their iPhone to store illegal images will now stop using an iPhone and switch to Android, since they know the scanning is occurring.

    It reminds me of the pre-digital-camera days, when people would get the plod knocking on their door after the photo processing company reported the photos of their kids naked in the bath to the police as kiddie porn.

    This is just another way of showing that, despite you spending a grand on your new iPhone, it's NOT your phone - it belongs to Apple and they can decide what you do with it.

  36. Anonymous Coward
    Anonymous Coward

    I'm going to repeat a comment I found in the Washington Post that pretty much says it all:

    "The first problem with what Apple proposes is that it cannot be performed legally in Europe, and, in some parts of the world, accessing someone's content without their explicit permission even carries sentences that come with a mandatory stretch in jail - it can only ever be done by local police, controlled by an investigating judge.

    The second problem is that it puts Apple in jeopardy when they miss something and can thus get sued by the victim for not acting.

    The third and most important problem is that it creates a backdoor for abuse by other entities, almost as bad as the idiotic idea to weaken encryption that shows up every seven years or so. Until now, Apple was seen as the safest device to use in an online age full of hackers and ransomware criminals, and that status took years of doing the right thing to achieve.

    Apple just undid all of that with a single announcement that practically every lawyer with an ounce of common sense would have warned them against.

    Don't be the police. That's what the police is for."

    Isn't this called "pulling a Ratners"? If not, it should be. It's moronic.

    1. Anonymous Coward
      Anonymous Coward

      re. accessing someone's content without their explicit permission even carries sentences

      but this would be with their EXPLICIT consent, first time they open their new iphone and click on that big, green, juicy button that says: 'AGREE!'* (yesyesyesgimmegimmeshinyshinynownowNOW!!!!)

      *to everything

  37. TheProf Silver badge
    Joke

    Missing the important thing

    Yes that's all well and good but how is it going to affect battery life?

    1. Anonymous Coward
      Anonymous Coward

      Re: Missing the important thing

      You are a bad man, and I applaud you.

      Thanks for the laugh.

  38. Anonymous Coward
    Anonymous Coward

    That's instant jail for whoever tries that in Switzerland

    As far as I know, accessing someone's content without their explicit approval carries not just a fine, but a mandatory jail sentence there.

    The only time you get to access someone's content as a provider is under court order, and even then the extract goes to a very small set of police people and an investigating judge, who then assess if there is a crime in progress.

    Sure, they can try this on Americans because they have at Federal level so many laws breaking privacy that there is probably a fully legal path to do so, but in the GDPR zone I can't see this one fly either.

    1. SImon Hobson Silver badge

      Re: That's instant jail for whoever tries that in Switzerland

      But didn't you read that gazzilion page long licence agreement before clicking "I've read and accept it" ? Somewhere it'll ask for permission, and you'll have explicitly given them permission to do this. So potentially completely legal under GDPR.

      I say "potentially" because GDPR also prohibits burying stuff like this in long agreements, and also prohibits making such acceptance a requirement where it's not actually required for the product or service to work. Look up how long Max Schrems has been going at FaecesBorg for - and that's probably how long you can wait for any practical enforcement action.

      1. Fred Flintstone Gold badge

        Re: That's instant jail for whoever tries that in Switzerland

        So, time, once again, for that excellent Freefall cartoon.

        Enjoy.

      2. Anonymous Coward
        Anonymous Coward

        Re: That's instant jail for whoever tries that in Switzerland

        .. and that's not even mentioning that newer "legitimate interest" permission BS which can only have come about by some serious bribing lobbying.

        It basically doubles the amount of shit you have to opt out of to ensure you have at least a legal basis to go after them, with bucketloads of deceptive design and deliberately misleading labelling to make sure you then still choose to allow it all.

        I'm generally against violence, but I've arrived at the point where I'm convinced that fines no longer have any impact, and percussive education may have to be made mandatory to stop the tide.

        In this context, a certain car brand which ends on "edes" will get it in the neck soon. I've been trying to unsubscribe from their systems since 2019 and complaints have not helped, so now I'm about to have some fun with them at European level for multiple violations.

        I'm through with being nice or gentle, that has yielded zero results.

  39. bellcore
    Alert

    Phantom enemy

    This is just like the battle against E2EE, it's always "For Ze Kinder", yet the reality is that it's hardly ever used in child abuse cases. In Germany, child abuse cases account for less than 1% of monitoring orders. It's used for drug offenses. They don't care about child abuse, they care about their authority.

    https://tutanota.com/blog/posts/why-eprivacy-derogation-bad-idea/

  40. Lil Endian

    Exfiltration of Imagery?

    So, Mr & Mrs A have legitimate and legal imagery of their children (eg. at play in the bath).

    The only difference (arguably) between this and illegal content is in the motivation in taking, and usage of, the images.

    To all intents and purposes this would be a positive hit for the ML system. A human will be required to assess the difference going forwards. So now the images Mr & Mrs A assumed were private have been observed by someone they really didn't want to see them. Without a warrant? With what amount of training?

    Does the imagery then get uploaded to the ML system in some way to improve future operations? Well, that'd be a bit illegal.

    We all agree no connected system is unhackable. How long before that cache is exfiltrated maliciously? If indeed Mr & Mrs A's imagery was stored, it's now in the wild and they'd have every right to burn the morons that facilitated that. (I know that's a big "if".)

    The difference between the context of use with a given image (family pic vs child abuse) in some cases is purely in the eye of the beholder. Certainly not for an ML system to differentiate, and I really, really would not want to be a human viewing images to distinguish the difference.

    1. Anonymous Coward
      Anonymous Coward

      Re: Exfiltration of Imagery?

      Maybe always ensure your kids are fully dressed in your iPhone photos, just in case.

      I'm sure the nice officers poring over your private family photos flagged by this AI have no malice. They are the thin blue line between Good and Black, Wright and Wong.

      But just in case.

      @Hackable... well NSO (the Israeli military-intelligence-derived hacking group) hack iPhones with their Pegasus software, so now they can also get journalists and politicians arrested on Apple autopilot. But that tool is only used with Israeli government approval, so you're safe.... you're not one of these 'pro-Palestine' people, right? Good.

      1. Lil Endian
        Pint

        Re: Exfiltration of Imagery?

        You are Mr Cynical, and I claim my £5.

        Which I then buy you a pint with!

  41. Anonymous Coward
    Anonymous Coward

    I already know of a deliberate miscarriage of justice

    - Physical security guy helps a female escapee of a Middle East family settle in the UK

    - A few weeks later, a break in in the office. Nothing is missing

    - A few days afterwards, police gets a tip on child pornography

    - Office is raided, office Mac (used by everyone) is taken

    - Technical "expert" (outsourced contractor who has only ever touched Windows) finds a pic (yes, one) in an iTunes backup

    - Police only see statistics, so the chap gets convicted for child porn. From a tech perspective there was so much reasonable doubt it should not have even made it to court (I reviewed the files and am about to hand this off to some human rights people who may be able to act).

    He lost his livelihood, and could not even see his own kid without being accompanied as a result of a revenge action, eagerly assisted by the local constabulary who could not spell IT without having to look it up.

    Apple is about to offer mechanisms to make that a lot easier. Well done.

  42. TRT Silver badge

    Shock horror... they are already doing something like this and have been for years.

    Every now and again my iPhone pops up a little message that I have a new memory (let's get this straight right here and now - I loathe this "feature" and am a little bit disgusted that it's not something I can turn off - I live in hope that they'll give it a toggle switch).

    The bloody thing has been through all my photos whilst I'm not using it and has labelled all the photos of my cat (this HAS to affect battery life - there are many thousands of these!), decided that I went on holiday with the kids during these dates and collated all those together, recognised that a big bunch of pics was taken at work on the same day and thought it must be an important event (network cabinet inspection prior to a tidying up session).

    The next step on this path is somewhat creepy, however... identifying potential kiddy pr0n, hashing it, and then sending that hash to be... what? compared to a hash database of known imagery? Given to the Feds along with my phone number?

    1. Anonymous Coward
      Anonymous Coward

      Re: Shock horror... they are already doing something like this and have been for years.

      Yes, I fully agree with you: I do not appreciate it analysing my pictures either, and there's no way to kill that off despite that being in principle a privacy problem. The MacOS photos application also does this.

  43. Anonymous Coward
    Anonymous Coward

    its potential for misuse is vast

    you call it potential, I call it iphone XV, AD 2025...

  44. dogcatcher

    Awei with Iphone

    I now feel really safe with my Huawei for they are never going to admit that they look at anything on my phone - even if they do. Images of my small dog rolling on its back may confuse Apple but look like a menu to other censors.

  45. Luke Worm

    What's the news?

    Google has been doing this kind of scanning since 2008. Microsoft is doing it too.

  46. cupplesey

    Apple's ad campaign 'What happens on your iPhone stays on your iPhone'.....so Apple lied then?

    What happens when they inevitably get it wrong? Can you sue them for liable or false accusations? Of course I don't condone illegal content but it's the thin end of the wedge for big brother/NSA level control and monitoring.

    Doesn't this also violate the US constitution? What about other countries' citizens, are they also being watched but not informed, just like with the NSA?

    1. Lil Endian

      Libel?

      I think you meant "libel" rather than "liable".

      Only saying to save confusion for those whose first language is not English.

      Apols if I'm mistaken.

  47. gandalfcn Silver badge

    "Apple’s tool, called neuralMatch, will scan images before they are uploaded to the company’s iCloud Photos online storage"

    Or

    "The neural network-based tool will scan individual users' iDevices for child sexual abuse material (CSAM), respected cryptography professor Matthew Green told The Register today."

    1. tip pc Silver badge

      What’s the difference?

      The bad bit is someone has determined that all your photos and documents need to be checked for CSAM, because YOU are likely to have that stuff in YOUR collection.

      1. gandalfcn Silver badge

        "What’s the difference?" So you are saying the cloud is the same as a personal device, correct?

        "The bad bit is someone has determined that all your photos and documents need to be checked for CSAM, because YOU are likely to have that stuff in YOUR collection."

        Do you upload everything to the cloud? I don't, because I don't give a stuff.

        Did you bother to read what the process actually is? Obviously not.

        What is extremely sad is that all the self-proclaimed IT experts here don't seem to understand the difference. The same with a few other things tech. They ignore facts and abuse anyone outside their bigoted orthodoxy.

        1. tip pc Silver badge

          “ Do you upload everything to the cloud? I don't, because I don't give a stuff.

          Did you bother to read what the process actually is? Obviously not.”

          I had enabled iCloud photos on all my devices.

          I have 21 years of digital photos, ~200GB; my partner has ~700GB. I have local backups but iCloud made it easy to have a cloud backup and also meant every photo was available even on devices without the storage space.

          iCloud photo sync also ensured photos appear on my Mac without having to actually sync the phone.

          Needing 2TB of iCloud ensures all phones, tablets & Macs are fully backed up in iCloud too.

          I trusted apple with my privacy and felt my data was safe with them.

          Not anymore

  48. gandalfcn Silver badge

    It seems very few commenters here actually bothered doing a bit of research, but then they are Apple hating and paranoid. Even elReg missed out some important facts.

    1. albaleo

      Fair point. While Apple may not respond to El Reg's intrepid journalists, they have posted info at the link below. It contains links to a number of technical documents.

      https://www.apple.com/child-safety/

    2. Lil Endian
      Thumb Down

      The only necessary research declaration (by Apple) is that Apple are performing the function of law enforcement agencies.

      No warrant. No jurisdiction. No mandate.

      So carry on with your ad hominem farce.

    3. Anonymous Coward
      Anonymous Coward

      On the one hand you accuse people of disliking this because they are "Apple haters", then you claim it is or will be done by everyone else (are they also Apple haters)? In other comments you talk about your love of taking photos of naked kids in the bath, and your confidence in the AI's ability to not flag you as a pedo.

      You're really all over the shop here.

      I get you want to deflect this, but you clearly don't understand what AI is or how people don't want AI flagging them as pedos. Even Apple loving customers.

      1. This post has been deleted by a moderator

  49. This is not a drill

    Remember PHORM

    Phorm was being touted by BT, TalkTalk, etc as a way of protecting users from nasties on the Internet.

    It was absolutely not about monitoring what everybody was doing so that you could sell the data and 'tailor' a user's internet experience based on whoever was paying the most to push their products.

    Apple won't be happy until they can control everything you can do and see on your iCrap device. I've never owned an Apple product in my life, never will, and the work iTurd I've had forced on me is only used to read work emails, nothing personal.

    And yes I know that Google, Facebook and telcos can and do monitor everything, but at least they don't pretend that it's for your benefit.

  50. Sgt_Oddball Silver badge
    Facepalm

    How the hell...

    Does this AI figure out the user's 'intent'? I mean, yes, some content is obviously vile and should be treated as such, but what of a user taking a photo on a beach where a naked toddler, refusing to be restrained by a 'bathing suit', happens to be running through the background? Or if the kids are being cute in the bath so you take a family pic? What of having a group of children dancing in the back of a camper van at a communal meet up and one of them decides his clothes aren't for him (I dare not guess the reasons why)?

    As always with these things I suspect nuance will be lost (probably because the devs are looking for it on a map of France) </sarcasm>

    1. confused and dazed

      Re: How the hell...

      My understanding is they're checking the hash of the photo against a known database of dodgy stuff. The issue is who defines what is dodgy ..... and who has the right to check on your device without a warrant ....

      1. TRT Silver badge

        Re: How the hell...

        My understanding from the much more in depth Reg article is that they're using the AI-like technology that already goes through photos on your phone or on your MacOS device (OK, the ones in Photos anyway - there's no law that says you HAVE to store those files in Photos.app) to spot potentially dodgy images, and create a hashed version of it in some form that still allows comparison after minor edits are done, send the hash off to Apple for checking against a database of known dodgy images that are in circulation, and then... what? if it finds a match shops you to the feds? grabs all your phone history and contacts and people you've circulated the image to or received the images from and thus profiles a paedophilic ring? Those details aren't clear!
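
        The "hashed version that still allows comparison after minor edits" idea can be sketched with a toy average hash. To be clear, this is an illustration only, not Apple's NeuralHash: downsample to a tiny grayscale grid, then record one bit per pixel depending on whether it sits above the image mean. A uniform edit such as brightening shifts every pixel and the mean together, so the bits survive:

        ```python
        def average_hash(pixels):
            """Toy perceptual hash: one bit per pixel, set if the pixel is
            brighter than the image mean (assumes a small grayscale grid)."""
            flat = [p for row in pixels for p in row]
            mean = sum(flat) / len(flat)
            bits = 0
            for p in flat:
                bits = (bits << 1) | (p > mean)
            return bits

        def hamming(a, b):
            """Number of differing bits between two hashes."""
            return bin(a ^ b).count("1")

        # An 8x8 "image" and a uniformly brightened copy of it.
        image = [[(3 * x + 5 * y) % 64 for x in range(8)] for y in range(8)]
        brighter = [[p + 10 for p in row] for row in image]

        # A byte-exact comparison sees two different files; the perceptual
        # hash sees the same picture.
        print(hamming(average_hash(image), average_hash(brighter)))  # 0
        ```

        Real systems compare such hashes by Hamming distance against a database, treating anything under some small distance as a match - which is exactly where the false-positive worries in this thread come from.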

        They could of course do the same for any criminal activity... it's just easier to justify piloting it with kiddie porn. Photos of stolen cars circulated amongst gang members looking to offload a hot motor? Farm equipment theft is a hot crime at the moment. Sexual abuse of adults? Revenge porn? Reconnaissance photos of banks, jewellers, wealthy domiciles, industrial premises with valuable IP sent around gang members? Beatings given to transgressors of Gang Rules? I believe these are often shared.

        I'm not saying if it's right or wrong, it just appears to me to be the thin end of a wedge.

        1. gandalfcn Silver badge

          Re: How the hell...

          "on your phone or on your MacOS device" Don't you mean in the cloud?

          1. TRT Silver badge

            Re: How the hell...

            No. Not necessarily. I meant what I said.

            1) There's no need. Photos works with or without an iCloud account.

            2) The phone "backup" to iCloud is supposed to be encrypted with a device or account specific key

            3) Who provides the processing power to analyse all of this stuff? Why not distributed computing? Though there are better tasks I could think of for a semi-asleep iPhone to be working on - how many millions of iPhones are there on the planet now?

            4) The iPhone itself is signed in and active and the data are "unlocked" when the phone is on, supposedly "secured at rest", so you can't just nab the flash storage or an image of it and then use it on another device with a different CPU etc etc They've got to justify all that "oh, only we can repair it with our parts for your safety" crap.

      2. TRT Silver badge

        Re: Who has the right to check on your device without a warrant.

        I wondered about that bit... I mean the technology to do this I think is already embedded into Photos, and that can work either with or without iCloud, so WHY do Apple specify that it's the photos destined for iCloud that are scanned (on device I hasten to add!) and given a metadata ticket? Is it perhaps that they operate under the flag of "it's actually heading out to OUR infrastructure, and so we are obliged to protect ourselves within reason from accusations of being a haven for illegal content"? It's not that it's checking YOUR device, per se, it's checking the data heading FROM your device TO their device.

        Oh, and I know that Apple have separate processes that do the actual uploading and downloading between Photos and iCloud - at least on MacOS... the background task is forever going wrong on my laptop - the fans will come on hurricane force during the night sometimes when it gets its NICs in a twist.

        Hm... I'm sure the lawyers have checked on this. Apple have quite a few of those, I hear.

        1. Lil Endian

          Re: Who has the right to check on your device without a warrant.

          Good take on it TRT.

          So rather than a misplaced attempt at "civil duty" it's misplaced self-preservation. As Apple wouldn't be liable (if at all) until the illegal content hit their platforms. If they are indeed launching a pre-emptive strike (on a yet-to-happen transfer) they're well overstepping their authority. It's tantamount to having a law officer observing your every move, including in your home, "just in case". So yep, Apple are trying to be a corporate version of a police state.

          1. TRT Silver badge

            Re: Who has the right to check on your device without a warrant.

            Well.. what they describe as happening is not so much that they prevent dubious content from hitting their server at all, but that they flag (because it's not a perfect matching process by a long chalk) that certain individual items / content MAY be dubious, but if any one user / device accumulates enough flags, then the balloon goes up and they go into responsible self-preservation mode and say "Hey, coppers! We think this person might be putting us in a bad legal place... so can you take this further, please? Have a look into it?" Then their big, red, shiny corporate bottoms are covered.
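
            That flag-then-threshold scheme can be sketched in a few lines. All the names and numbers below are illustrative only - the real threshold and matching details aren't stated anywhere in this thread:

            ```python
            REPORT_THRESHOLD = 30  # illustrative only; the real value isn't public here

            def flag_count(photo_hashes, known_bad):
                """Count how many of an account's photo hashes match the database."""
                return sum(1 for h in photo_hashes if h in known_bad)

            def should_escalate(photo_hashes, known_bad, threshold=REPORT_THRESHOLD):
                """A single fuzzy match is tolerated as a possible false positive;
                only an accumulation of matches triggers human review / referral."""
                return flag_count(photo_hashes, known_bad) >= threshold

            known_bad = {f"hash-{i}" for i in range(1000)}
            innocent = [f"photo-{i}" for i in range(5000)]          # no matches
            suspect = innocent + [f"hash-{i}" for i in range(40)]   # 40 matches

            print(should_escalate(innocent, known_bad))  # False
            print(should_escalate(suspect, known_bad))   # True
            ```

            The design choice being debated in this thread is precisely that nothing visible happens below the threshold, and everything happens at once above it.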

    2. gandalfcn Silver badge

      Re: How the hell...

      "As always with these things I suspect nuance will be lost" You mean like you did?

  51. tip pc Silver badge

    In Apples own words

    https://www.apple.com/child-safety/

  52. tip pc Silver badge

    Trust

    It takes a lot to gain peoples trust.

    They’ve now lost my trust.

    Speakers snooping

    Cameras spying

    Phones tracking

    And the consumer pays for it all.

    Un f&@£ing believable

    1. gandalfcn Silver badge

      Re: Trust

      How about Samsung? Sony? And all the rest? Presumably you still trust them.

      1. confused and dazed

        Re: Trust

        Samsung and Sony have not created an entire marketing campaign about the protection of your privacy

      2. tip pc Silver badge

        Re: Trust

        I've been buying Apple crap as my main stuff since 1991. I've trusted them for 3 decades, sending tens of thousands their way in the process.

        They had my trust.

        They don't anymore.

        I don't know who to trust now.

        It does look like an update to the forthcoming OSes is needed for these "features" to work.

        So for now I need to roll back to the current releases and not update.

  53. gandalfcn Silver badge

    Why do people conflate the cloud and devices?

    1. Lil Endian
      Thumb Down

      Follow Your Own Criticism

      [This is largely a copy of my post on page 2, to which you [gandalfcn] have not responded.]

      Instead of scanning images in the cloud, the system performs on-device matching...

      "On-device" is mentioned a dozen times in Apple's own statement.

      You clearly have not done any research.

      Why do you persist in erroneously attempting to correct posters?

      1. TRT Silver badge

        Re: Follow Your Own Criticism

        Ah... yes, thanks for the link. Interesting... though I'm curious as to how they know that they can combine thresholding with CSAM ticketing and put a figure on "false positives" like "one in one trillion chance per year of incorrectly flagging a given account".
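
        For the curious: the sort of arithmetic behind a "one in one trillion per year" claim can be sketched. If each photo independently false-matches with some tiny probability, the chance of an account accruing at least t flags is (approximately) a Poisson tail. Every number below is made up purely for illustration - nothing here is Apple's actual maths:

        ```python
        from math import exp, factorial

        def poisson_tail(lam, t, terms=50):
            """P(X >= t) for X ~ Poisson(lam), summing the tail directly
            so the tiny probability isn't lost to floating-point cancellation."""
            return sum(exp(-lam) * lam**k / factorial(k) for k in range(t, t + terms))

        # Illustrative inputs only: 10,000 photos at a one-in-a-million
        # per-image false-match rate gives lam = 0.01 expected false flags;
        # assume a 10-flag reporting threshold.
        p_account = poisson_tail(10_000 * 1e-6, 10)
        print(0 < p_account < 1e-12)  # thresholding crushes the account-level rate
        ```

        The point the document is presumably trading on: even a mediocre per-image false-positive rate collapses to an astronomically small account-level rate once you require many independent matches - which is also why the honesty of the per-image rate and the independence assumption are where any scepticism should be aimed.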

        Almost sounds like they've been trying it out with test data taken from something like memes circulating via WhatsApp (auto-add images to Photos), or Photos shared publicly on Facebook or something.

        1. Lil Endian
          Pint

          Re: Follow Your Own Criticism

          I got the link from tip pc above buddy ;)

          Others posted too, so cheers all!

          1. TRT Silver badge

            Re: Follow Your Own Criticism

            Yeah, but something about the way you put it made me actually want to read it whereas I was put off earlier by the potential of being faced with the usual reams and reams of Apple legalese and technobabble, but it was actually pitched just right - very understandable by the average reader.

      2. gandalfcn Silver badge

        Re: Follow Your Own Criticism

        I didn't respond because I didn't see it. OK. You also didn't seem to have actually read and understood what you cited.

        "The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.

        Next, iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos."

        Entirely separate things.

        You're welcome.

        1. Lil Endian
          FAIL

          Re: Follow Your Own Criticism

          I didn't respond because I didn't see it.

          Understandable. Thanks for responding.

          1. You've been claiming and reclaiming that others are wrong because they're conflating cloud & device, restating "...in the cloud...". I've pointed out that on-device is correct.

          2. You've not refuted my statement. You've pulled a different quote for your own purpose. Which fails as it still does not qualify your earlier false corrections of others' statements re: on-device.

          You're welcome.

          Your attempt at patronising me gains my sympathy.

    2. This post has been deleted by a moderator

  54. Tessier-Ashpool

    Appeasement

    There are two types of people who complain about encryption: those who think of the children, and those who think of the terrorists.

    I imagine Apple is doing this to appease dumb legislators who want an end to encryption. But that would only appease half the complainants at most. Scanning for terrorist content would be irresistible to governments with this kind of technology in place. That would be next on the list for sure.

    A very slippery slope.

    1. Anonymous Coward
      Anonymous Coward

      Dream on......

      @Tessier-Ashpool

      Quote: "....who want an end to encryption....."

      *

      So....books like Bruce Schneier's "Applied Cryptography" need to be banned, and all the (thousands of) copies burned in the public square!!!

      *

      Banned books!.......book burning!........encryption is out of the control of politicians, civil servants, and assorted (private) control freaks.

      *

      ......dream on.......the encryption genie is long gone from that once hidden magic lamp!

      1. Tessier-Ashpool

        Re: Dream on......

        'Tis not I who is dreaming. Various senior political figures in the US and UK want backdoors into secure protocols. They are, of course, engaging in magical thinking, as has often been pointed out on this site.

  55. Anonymous Coward
    Anonymous Coward

    How good is Apple AI at dealing with base64 and IDEA?

    This Apple initiative is simple MISDIRECTION! The masses will "think of the children" and the bad guys will think of a way of avoiding the scrutiny (see below).

    *

    Is it a JPG or TIFF or RAW? Maybe it's a recipe for Black Forest Gateau? What was the key used for the IDEA encryption? Apple AI might have a few problems!

    *

    KUTktHLwrCNGmD2/gUDz8dqm0fNyVWbHjLE6oCl7UJEVBEUWFmHAm3qhzEK+B9juexE5aZHBFfh4

    7qyZm4ABQ0T+13gzTh8cg4KlAwdDK5VNyDR23XuKsbG27cvVr0wQZR37AaBeRrSeG4Pe5KMY0aI3

    D2mEcRXEk0JQ8ImpeEMJ1XtLEz7ey0dnarktOemDWSaaa4iG2mQ0GmltYQ0puneMmaWnfBaCP8m0

    RShGRkkW05hCiXHga6qg2k0pF13kHUqApeoUPj55rrJOOWAfcXhlv75bd0KfKhkdc6weCvwKyoyx

    JjcPe3EhDy0yZdyufuNakKho8JcBiMrpbFBxmmbl1rHpwhnnNRegf7oOGpVP+3iaN2RzryS9qAD+

    iB7kZIUZ6Yn+g8G23xMmHkXLs2Kiseq9/ry5vraz0wITznmlnOLZM2brr/J174i0oLkwje0ppg/w

    55HfHRDXtL8bAvR2ecFia9z9wdZW0/RYqHLhOoWMIbzUBBaEl3VMCbsJT2N2xhWgKwi3iBybYRrE

    b9vDOSroeN6bbp640FDEoCIPJeIUCTi2O6DjftXImZvQ0MoKxOwlfpc388vb6vumjLoFcbOPpXa4

    OABh7Nq2nCX3A24ySiTBjofGwufxaOaorxFHLGFCjFGH0FnQH4KaLkHVTnfwkrcdJHRl5SBWF/W1

    /YwV3skJJl9YNEQ503e4awnc3GVwyo+WE0jM/imgslt6W2WvT8MHWElHwcBxw01pqz1OGwWvaBsk

    14bwjivum/bS7+8nso+MYKESbPVRz1K+GQP8aeJAww6dpisq6cJSMph2jxAyb6ke1P4gDChkVRTw

    VN3Qx/7OkippTDSLtbpYyqpPcRxRowxibfXzGuUqZca25CAplhpKCsCM9DRKzUIvkIEVfYFF0Llu

    Rl4JtVU/OUrHIXBtLY8lPW3cjKZ1M2ajVP1YCN80fkwx4PZuKXXYmmfEYi6HapPJ2rE3o5kGaXYY

    OrBefEw0529xzJ8R5ddFyYHffBlYDnJr092tzAFIfch//T/s3ljslQ2V+K73EQ8n8LKiUZZpERZz

    hgyfCQfT7s7ATkiTfwIIeFi4Elynea5esT9LBlk1lkNjjNXHXZKdxGSGl/uTt9xV/PlWaHOkFhOI

    BDMQRKzED0MJmuwVb5bS/vJGu37xaeyYG9PU7rVGiSfGFsWHrklpLkFFWIxYpQtUKom2oTekV2XP

    4+dmsieXEjXt3H7jN6PCFG1CFm6IUFS4Ok8zRhxDvXn7c1FR2Nd+v+fwO5oU4MjTZpg/dvpAUzIl

    HnJp9dWGotkGqLPL9dg76vm9he+Emc0mybM9JyNO88jfcYXQcg3qM0GFlDEkMe7cDUtczNcFzSDz

    YDV8Y0Lj4bJNjpPvhv4KeZ8De6L1eOy5wPjF2rh53F8DBhQ8bdFPm6qNjYaQ4fO/lpK1Rv0iGXWc

    XA6KMypW4zYoDlVekt1y7lKIwk6yMJhlTRiYzCW1hn15Wou9BCtX4eYIJwOhSshOQKMbDzKRZSYv

    ToGWMolwKvHVOEUJ1QvjoGS6rOQS45c+71wC45luYyj3zqB2zl4fgl9hDgkg5r12E9y63pbfYmeN

    4SLTil1Y3PYVm41fbEH7cq9BVSB0hGl5nh+Xg0N7TePCkPF8RZeKU7w0/GZ39Sm63AGIYUlnZCyY

    RcLEZYn1MGUB+WQOZnJT0AhdbeXBrglC2Cr9kSBZCCKNrQbxFy8GDeH69oV31x57ayl5mjqEQGuR

    SV1DXpaz2CGW32m/mfMDLMSC3PAvOJYj8qZ8dp5ELsUZKJ6o5P2prA0T9ckNI+b7gTaK5K7kyDPd

    xlZKD9z5Z/c=

    *

    Let us know when you know what's in this example!

    1. tip pc Silver badge

      Re: How good is Apple AI at dealing with base64 and IDEA?

      That's not a photo stored by the Photos app on an iDevice.

      It could be once you decode/decrypt it.

      You could just turn off iCloud and back up your photos to Plex or something.
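
      The point of the exchange above can be demonstrated with the standard library alone: any byte-exact hash check sees an encoded copy as a completely different file, even though the original content is one decode away for anyone who knows the trick (illustration only):

      ```python
      import base64
      import hashlib

      original = b"pretend these bytes are a photo"
      encoded = base64.b64encode(original)  # the evasion step: re-encode the bytes

      # A byte-exact hash no longer matches the database entry...
      print(hashlib.sha256(original).hexdigest()
            == hashlib.sha256(encoded).hexdigest())  # False

      # ...yet the content is trivially recoverable.
      print(base64.b64decode(encoded) == original)  # True
      ```

      This is exactly why encrypted or re-encoded blobs defeat content scanning, and why the scheme under discussion only ever catches material stored in the clear.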

  56. bronskimac

    Fourth Amendment?

    I'm pretty sure the US courts would view this as a breach of the Fourth Amendment of the Constitution of the United States of America. There needs to be "probable cause" to carry out any search. I don't see "You've got a phone", without any other evidence, as probable cause to search it. "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized."

    Of course in the UK our Members of Parliament (MPs) will love it and rush through any changes to legislation needed to make it happen, whilst continuing to exclude themselves from any such searches.

    1. Irony Deficient Silver badge

      Re: Fourth Amendment?

      There needs to be “probable cause” to carry out any search.

      Note that the Fourth Amendment only constrains searches by the government — see the Supreme Court’s majority opinion in United States v. Jacobsen :

      The first Clause of the Fourth Amendment provides that the

      “right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated. . . .”

      This text protects two types of expectations, one involving “searches,” the other “seizures.” A “search” occurs when an expectation of privacy that society is prepared to consider reasonable is infringed. A “seizure” of property occurs when there is some meaningful interference with an individual’s possessory interests in that property. This Court has also consistently construed this protection as proscribing only governmental action; it is wholly inapplicable

      “to a search or seizure, even an unreasonable one, effected by a private individual not acting as an agent of the Government or with the participation or knowledge of any governmental official.”

      A legal defense against non-governmental searches by individuals or organizations would need to be based on other areas of the law, e.g. on trespass or online privacy legislation.

      1. Arkeo

        Re: Fourth Amendment?

        Shouldn't that "private individual" also abide by the US Constitution? After all it's *the* fundamental law by definition; every State or Territory wishing to join the nascent US had to sign and accept it, did it not? If only a Federal agency had to abide by the Constitution, wouldn't that become an extremely dangerous loophole or precedent? If it's valid for the 4th, why not the 1st, why not the 13th?

        1. Charles 9 Silver badge

          Re: Fourth Amendment?

          The point is that the Bill of Rights is intended to protect citizens from State action (of which the ex-colonists had plenty of experience). The general rule is that issues between private parties are their own business unless toes get stepped on along the way.

        2. Irony Deficient Silver badge

          Re: Fourth Amendment?

          No one stated that a private individual should not abide by the US constitution. If you read the US constitution, you will find that much of it does not directly affect private individuals; Article I. primarily deals with the powers of and limitations on Congress, Article II. primarily with the powers of and limitations on the President and Vice-President, Article III. ditto with the Supreme Court and its inferior Federal courts, &c.

          New states had to accept the US constitution, but I don’t know if “signing” it was part of that acceptance. New territories were creatures of Congress that were organized as such only once controlled by the US, so any acceptance in their case was performed by Congress.

          The Supreme Court did not state that only Federal agencies had to abide by the constitution. As was quoted in the case above, the opinion stated that the first clause of the fourth amendment to the constitution only applied to the government (and you could follow the links to past cases within the case link above to find the opinions that served as precedents); that case made no other determination on any other part of the constitution.

          Regarding your second question, unlike the first clause of the fourth amendment, the first amendment explicitly constrains Congress, and the thirteenth amendment still allows slavery and involuntary servitude as punishment for crimes, and explicitly gave Congress the power to enforce the amendment through legislation.

  57. Anonymous Coward
    Anonymous Coward

    Thought Crimes

    I am having to post this with a disposable account as in this day and age I am basically guilty of heresy for speaking out against the insanity of our times.

    Nobody should be prosecuted for possessing or viewing any text, image or audio recording, unless they created it themselves by abusing someone. That is a fundamental principle of a free society which has been acknowledged for decades, if not hundreds of years. By all means prosecute the distribution of such material - but to prosecute possession is returning us to the Middle Ages hunting witches again.

    It is only because of radical feminists and other moralists creating moral panics in the 1970s and 1980s that we are faced with the present situation in the 21st Century, which is extremely dangerous in a highly interconnected society such as ours, where it is very difficult if not impossible to stop people from unintentionally coming across such material.

    We might as well have a nuclear reactor in our own homes, yes it's very useful and provides lots of free electricity but one day it can melt down destroying the entire family.... Do you see the analogy I'm making? Because the Internet is just as dangerous. The penalties for possession of such material and being put on the sex offenders register are so horrific, it's totally unreal. It's like a real life nightmare. I cannot believe I'm typing this here in 2021. What has happened to our country?

    We criticize dangerous products that are unsafe and burn your house down or electrocute you, but why can't we criticize the dangers of the Internet and all the ridiculous draconian laws involving it? It is as if the law itself and the crazy ways it's made is beyond discussion?

    Nobody should have to fear their own computers (unless they are doing major hacking/fraud/sending death threats, etc...).. This simply cannot be happening in a free society.... It looks like we are no better than China - it's just over different stuff here in the West....

    1. Lil Endian
      Flame

      Re: Thought Crimes

      AC: Nobody should be prosecuted for possessing or viewing any text, image or audio recording, unless they created it themselves by abusing someone.

      I see no reason for any "man (or woman) on the street" holding child porn legitimately. I genuinely feel for those that must view such material as a part of their work (the judiciary springs to mind).

      Obviously delimiting those with the right to data retention and those without creates the opportunity for grey areas, but it's handled already. When the right/position is abused justice must be enforced vehemently. The controls must be stringent.

      Carte blanche "it's ok, I didn't do it"? What? Mr X gives pics of his own kids to Mr Y, and vice versa. That is fine with you?

      Edit: I certainly do not consider Apple as being in the group with a possible mandate to retain these images in any format, or to "investigate" anyone.

      1. Anonymous Coward
        Anonymous Coward

        Re: Thought Crimes

        @"I see no reason for any "man (or woman) on the street" holding child porn legitimately. "

        They aren't. You have no evidence otherwise. You falsely claimed they do, in your comment, to justify a speculative search of their private media with your AI, without a single shred of evidence.

        I bet power-trip officers and spooks are salivating at this. What they're doing here "FOR THE CHILDREN" is establishing the right to search the digital media of people WITHOUT SUSPICION, in bulk, preemptively, against their own search set.

        [1] I noticed you did not say "everyone/anyone", you said "man on the street", so I'm curious who you think has a legitimate reason that caused you to prefilter there.

        [2] You're not even talking about child porn, are you - you're talking about an AI's APPROXIMATE and SKEWED scoring of images, trained on a set CLAIMED to be child porn!

        A speculative AI model based on a training set provided by the searcher! So not even "FOR THE CHILDREN": an age-estimation / sexual-activity-estimation algorithm, applied to material representing 0.00000001% of all images in real life, yet trained on it as if it were 50% of images. A skewed training set designed to give a lot of false positives, where it would return next-to-zero results if trained on a set of all images.

        Apple just threw away their customers' privacy as a fundamental legal principle here.
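        The base-rate point in [2] is worth making concrete. A back-of-the-envelope calculation (all numbers illustrative, none from Apple's published spec) shows why a rare target class plus even a tiny false-positive rate yields flags that are overwhelmingly wrong:

```python
# Illustrative base-rate arithmetic: even an accurate matcher produces
# mostly false positives when the target content is extremely rare.
# All figures below are invented for the sake of the example.

def flagged_counts(n_images, prevalence, true_positive_rate, false_positive_rate):
    """Return (true flags, false flags) for a hypothetical scanner."""
    bad = n_images * prevalence          # images that really match
    good = n_images - bad                # innocent images
    true_flags = bad * true_positive_rate
    false_flags = good * false_positive_rate
    return true_flags, false_flags

# 1 billion photos, 1-in-10-million prevalence, 99% TPR, 0.1% FPR
tp, fp = flagged_counts(1_000_000_000, 1e-7, 0.99, 0.001)
# tp ≈ 99, fp ≈ 1,000,000: about 99.99% of everything flagged is innocent
```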

        1. Lil Endian

          Re: Thought Crimes

          I'm not sure if you're the original AC, so to save confusion I'll call you.... Bob.

          Hello Bob,

          [@] I was responding to AC's statement/scenario, that's obvious. I'm not speculating or justifying anything. It's quite clear what I was saying, I'm not sure how you skewed it unintentionally.

          [1a] Answered in my previous comment: ...those that must view such material as a part of their work...

          [2a] Bob: You're not even talking about child porn are you... To clarify: I'm talking about legal jurisdiction and unmandated warrantless searches of private property. The justification (illegal content) and method (ML) are totally irrelevant.

          [2b] Bob: "...a set CLAIMED to be child porn!" Apple: ...the system performs on-device matching using a database of known CSAM image[s]...

          [3a] AC: "Nobody should be prosecuted for possessing or viewing any text, image or audio recording, unless they created it themselves by abusing someone." In your country maybe, but in mine retention of child porn is criminal (which is totally supported by me).

          [4a] @Bob/AC: You didn't respond to the Mr X/Y scenario, so is that fine with you?

    2. coddachubb

      Re: Thought Crimes

      https://en.wikipedia.org/wiki/Idiolect

  58. TM2015

    Spare a thought for those manually reviewing the images

    Yet more jobs, probably low paid, that will cause PTSD in people just trying to pay their bills. No one could pay me enough to be a content reviewer in situations like that.

  59. coddachubb

    Everything Apple do regards security is about protecting their business model, not wider society.

    Caveat emptor.

  60. Rtbcomp

    Big Brother is Wrongly Accusing You.

    My biggest concern is reliability. We keep hearing stories about bank accounts of innocent people being summarily suspended or closed because a computer suspected fraud; how long before people are accused of being paedophiles because some software says so?

    I've just tried to sell a Monopoly game on eBay, only to have the listing banned because it contains the word "Monopoly", which according to eBay's computer means I'm selling some sort of gambling product. I got round it by spelling "Monopoly" backwards. It seems to have let thousands of similar listings through, though.

    1. TRT Silver badge
      Devil

      Re: Big Brother is Wrongly Accusing You.

      Tsk! eBay banning Monopoly... that would be like Amazon the online book-flinger banning a book called "Why Boycott Amazon? A beginners guide to the wrongdoings of global corporations using specific examples."

  61. Fruit and Nutcase Silver badge
    Big Brother

    Monkeys

    See no evil, Speak no evil, Hear no evil.

    This takes care of the first. Next they'll start analysing your speech, then what you are hearing.

    I think I got that in the right order

    1. Lil Endian

      Re: Monkeys

      Nice :)

      "See no evil. Hear no evil. Speak no evil." is the 'modern' version, I'm guessing sanitised by the Victorians. AFAIK the original ended with "Do no evil." which had the monkey covering his nuts.

      Interesting in that "doing evil" is the basis of law yet it's omitted in the phrase. "Speak no evil" is covered by law, and now the law is moving into the other areas.

      [WTB a monkey-covering-nuts icon!]

      Edit: I was wrong about the Victorians, seems it's just a variation (Three Wise Monkeys)

  62. Jim-234

    Sure it's to go after the worst of the worst now....

    Folks should start making bets on how long it will be before Apple starts scanning your phones for "dangerous misinformation" or "subversive political activity".

    1. Fruit and Nutcase Silver badge
      Big Brother

      Re: Sure it's to go after the worst of the worst now....

      Microsoft Office and Analytics just needs a few tweaks to do it on the desktop - if the capability is not there already

      https://www.microsoft.com/en-gb/microsoft-365/business/myanalytics-personal-analytics

      https://docs.microsoft.com/en-us/workplace-analytics/myanalytics/mya-landing-page

  63. Buttons

    Global Vampires

    I don't think many people will doubt that technology can be used for nefarious purposes, whatever the original intention. There are no morals when it comes to the deployment of IT and the use of data; it's business. These people are not our friends.

    Scanning on device content and matching it against a set of data defined by Individuals/Groups who will undoubtedly have a view, even if they attempt to be neutral in some way, will lead to errors in the results. I expect the technology has improved, but I'm thinking of the Met Police's attempts at rolling out face recognition in London. There were many false positives, AIUI, and it had real problems with people who were not born with pale skin.

    While Apple just wants to ensure that we're not erring and therefore not a danger to society, it is a model which can be quickly expanded, to include other services, onto the devices that we so happily buy to track our activities. I think a few people have already mentioned ways in which Apple can 'improve' their service, and I'm sure they're right. After all, a gun is a useless bit of metal until someone adds bullets, points it and pulls the trigger. People cannot be trusted to do the right thing, even if they can agree what the right thing is...

    More than this, I feel that scanning in the way that Apple propose will do two things,

    1) Apple will become a police force, an influential arm of law and order. Should they have that power?

    2) By scanning devices I think that they have removed the presumption of innocence. We will all be guilty before being proved innocent.

    Now apply that device scanning to all your other misdemeanours. You know what they are: Jumped a red light recently? Had deadly thoughts about your neighbour and told a confidante?

    It's a proposal that wants us to accept an overt form of surveillance, but of course 'Nothing to hide, nothing to fear.' OK?

    I love 'Big Brother'! and I'm up to date with my subscription.

  64. Anal Leakage

    Remember back in the aughts…

    …anytime the iTunes terms of service wiggled, the excitable and invested would scream “STEVE JOBS IS GOING TO TAKE AWAY YOUR MUSIC!!!”

    Which totally happened

    1. Sorry that handle is already taken. Silver badge

      Re: Remember back in the aughts…

      Fortunately for its users, iTunes has been financially sustainable. Guess what happened to users of other services (e.g. Zune)?

  65. cartledger

    They already patented disabling the camera at certain times in certain locations. The example in the application was music concerts but we know it will ultimately be protests and discreet filming of establishment wrongdoers. With this technology, they will even be able to find and delete your images and videos after the event or that have been shared with you.

  66. Cybersaber

    This is an encryption backdoor for anything on the iPhone.

    Per https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf

    in the "On-Device PSI Protocol" section, which deals with how the image scanning and matching works, there is a bit of handwaving. They detail how it is _intended_ to work, but it all relies on a secret key held by Apple. Any image can be decrypted with this secret key, even though they say it can't be if it doesn't match certain image descriptors. All you have to do is change the descriptors.

    This is an encryption backdoor prima facie, and just because you design a backdoor for one use, doesn't mean malicious actors won't find a way to confuse/repurpose this anti-CSAM mechanism for their own ends.
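    A toy sketch of that concern (this is NOT Apple's actual PSI construction - the real protocol uses blinding and threshold secret sharing, and every name below is made up): if the key that unlocks a payload is derived from the matching descriptor plus a server-held secret, then whoever controls the descriptor database controls what becomes decryptable.

```python
import hashlib

# Toy model of the worry, not Apple's protocol. The device encrypts each
# photo "voucher" under a key derived from the image's perceptual hash;
# the server, holding the secret and the database of flagged hashes, can
# derive the same key -- and decrypt -- for any hash it chooses to list.

def derive_key(perceptual_hash: bytes, server_secret: bytes) -> bytes:
    return hashlib.sha256(server_secret + perceptual_hash).digest()

def xor_crypt(data: bytes, key: bytes) -> bytes:
    # Symmetric toy cipher: applying it twice with the same key round-trips.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def device_upload(photo: bytes, perceptual_hash: bytes, server_secret: bytes) -> bytes:
    # In the real design the device never holds the server secret directly;
    # blinding hides it. Collapsed here purely for illustration.
    return xor_crypt(photo, derive_key(perceptual_hash, server_secret))

def server_try_decrypt(voucher: bytes, claimed_hash: bytes, database: set, server_secret: bytes):
    # The server recovers the payload only for hashes in its database --
    # meaning whoever edits the database decides what is decryptable.
    if claimed_hash not in database:
        return None
    return xor_crypt(voucher, derive_key(claimed_hash, server_secret))
```

    The point of the sketch: nothing cryptographic stops the database from being swapped for a list of, say, political imagery - "all you have to do is change the descriptors."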

  67. TheProf Silver badge
    WTF?

    Richard Pic

    Every so often there are stories in the news outlets regarding women who've been sent 'dick pics'.

    This seems to happen because a miscreant on public transport has taken advantage of the simple Apple-provided method of sharing pictures.

    As far as I can tell, if the receiving iPhone's Airdrop is set to 'Everyone' then a thumbnail of the obscene image is presented on screen.

    Substitute dick pic for child porn on an iPhone with pre-update firmware.

    Question: If the receiver of the image rejects the image is it removed from the iPhone without leaving any trace? Would the iPhone scan the incoming image to determine if it is 'legal'? How loud is the siren on an iPhone when it identifies an 'illegal' image and how long would it be before the baying mob set upon the innocent victim of cyberflashing?

    1. gandalfcn Silver badge

      Re: Richard Pic

      "Question: If the receiver of the image rejects the image is it removed from the iPhone without leaving any trace?" Would it have been uploaded to the Cloud? No. OK?

  68. Anonymous Coward
    Anonymous Coward

    Whaaaa?

    1) the tech is now useless because anyone with a real reason to fear it has already ditched their iPhone as a result of the coverage,

    2) therefore, the only people who are going to be flagged up by this are false positives, each of whom will no doubt go through a nightmare time whilst law enforcement eventually gets round to acknowledging that the tagged pics were of tulips, or something equally innocent,

  69. Arkeo

    What about EU Countries?

    Doesn't this blatantly violate the GDPR? Or are our governments supposed to turn a blind eye over such a fundamental violation of the rule of law just to appease the Americans?

  70. tip pc Silver badge

    Calmed down and checked the detail

    Apple's press release

    https://www.apple.com/child-safety/

    detail of interest

    These features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.*

    from a 9 to 5 article

    https://9to5mac.com/2021/06/07/apple-will-let-users-stay-on-ios-14-and-receive-security-updates-even-after-ios-15-is-released/

    Apple will let users stay on iOS 14 and receive security updates, even after iOS 15 is released

    For the first time, Apple will allow users to stay on the previous major version when iOS 15 ships in the fall. Users will have the choice to stay on iOS 14 and receive important security updates, or upgrade to iOS 15 to take advantage of all the new features.

    Previously, Apple would release older security updates to devices that could not upgrade to the latest version. However, if you owned the latest Apple devices, getting the latest security updates necessitated updating to the latest version for iOS.

    Presumably, at some point, Apple will require everyone to migrate to iOS 15. You can expect that to happen when iOS 16 comes out next year.

    So it looks like the alternative is to not update to iOS 15, iPadOS 15, watchOS 8, or macOS Monterey

    I wonder if the new phones and systems released after the new OS versions are out can have the previous OSes installed?

    1. Anonymous Coward
      Anonymous Coward

      Re: Calmed down and checked the detail

      I'm going to see if that's possible, as I already run iOS15 beta and MacOS Monterey beta.

      This is 100% unpalatable, also because I feel that this represents a MAJOR and frankly unacceptable setback in overall platform security.

      Apple screwed the pooch with this one, properly. Years of trust - gone in an instant.

  71. Ken Moorhouse Silver badge

    Apple will hold the unencrypted database of photos

    Long thread... Not sure if anyone has made this point (apologies if it has):-

    Will there be a surge in the number of paedophiles applying for jobs at Apple in the hope of getting involved in this project?

    1. Anonymous Coward
      Anonymous Coward

      Re: Apple will hold the unencrypted database of photos

      I think you may have beaten me by seconds, but I think that point deserves some emphasis anyway.

      From what I've heard of other moderation efforts, it's also not exactly a fun job, as you're exposed to the depravity of others. I personally would not be able to do that job; I know it would haunt me when I headed home.

    2. Lil Endian

      Re: Apple will hold the unencrypted database of photos

      I've not seen that Ken, but it's a good Q.

      I was thinking that the only people able to do the job of viewing these images (without PTSD et al as someone mentioned) will either be paedophiles (as you say) or sociopaths that don't connect on the "human" level.

  72. Anonymous Coward
    Anonymous Coward

    So, who watches the watchers?

    Doing this almost seems motivated by a desire to collect such imagery. It makes no sense for Apple to destroy a reputation for security and privacy built up over years, so it must have some other driving motive. WTF prompted this?

    The other fun problem I see is the potential of unauthorised access to ADD things. Say you're an Olympian who just escaped to another country and some dictator wants to destroy your reputation because you made him look even more of an idiot than he was in the eyes of the rest of the world.

    Now I have a tool to push some dodgy images into your phone, which I can then "leak" to the press and local law enforcement.

    Yeah, well done Apple. Would have been nice if you talked to some sane people first, you know, out in the real world.

    1. Nifty Silver badge

      Re: So, who watches the watchers?

      "The other fun problem I see is the potential of unauthorised access to ADD things."

      Do you have WhatsApp installed on your iPhone? By default, anyone sending you a video or image can already add items to your photo album.

    2. LDS Silver badge

      "it must have some other driving motive. WTF prompted this?"

      The reason is probably that they're deploying a technology that allows them to control whatever you have on your devices (i.e. music not from iTunes...), and the only way to make it acceptable is to say it's just to hinder one of the most horrible crimes.

      Apple knows that sales of iDevices will slowly shrink because it will be harder to add more new technologies; it needs other revenue streams in the future, and is preparing.

  73. Snowy Silver badge
    Facepalm

    A legal minefield

    The accuracy is never going to be 100%, and with the repercussions of getting it wrong so high, I do wonder what Apple get out of it?

    1. Anonymous Coward
      Anonymous Coward

      Re: A legal minefield

      "I do wonder what Apple get out of it?"

      A smug feeling of self-importance? Wait, no, they already have that.

      Gratefulness from their customers? Hmmm, probably not.

      Respect from the industry? Oof!

      Lead-ins to other government contracts? LOL, "Apple" and "government contract" in the same sentence makes as much sense as "unicorn" and "starship" in the same sentence.

  74. Phones Sheridan

    This online newspaper was BANNED BY APPLE!!! and you won't believe why!

    "Apple infamously refuses to talk to The Register"

    As the subject says, I am surprised that El Reg have not tried to make this viral. Every other day I read about something else BANNED BY SOMEONE™ and it fills my social media relentlessly. Try harder!

  75. Anonymous Coward
    Anonymous Coward

    And the Apple guy responsible was sued successfully by a former employer.

    Seems like a very trustworthy guy...

    https://financialpost.com/executive/management-hr/blackberry-ltd-ontario-sebastien-marineau-mes

    Then you add in all the other stories over the years about Apple's "ethical standards" that make Microsoft look like Mother Teresa - what could possibly go wrong?

    Now this story brings up an interesting legal issue. The software must have been trained; no way was it 100% unsupervised. Unless I am mistaken, even inadvertent, non-voluntary viewing of child porn images is a criminal offense in California, unless part of a criminal investigation or by LEOs. And the possession of the software training images in any form was also a criminal offense.

    Sounds like someone did not run the project by the lawyers first.

  76. CuChulainn

    Have They Perfected AI Now?

    I hope they have.

    Can you imagine an innocent photo being tagged as child pr0n, and what lists you'd end up on as a result? I mean, people never use their phones to take photos of their kids, do they?

    And the effect that could have on your life. And how difficult it might be to get off the lists. And how little it would matter if you did if other people had already found out?

    I'm thinking of that case a couple of years ago where a black couple took a selfie and Google identified them as 'gorillas'. Among other examples of how good AI has been up to this point.

  77. Ashto5

    Maybe this is the last iPhone I own

    To be honest I really just want something that texts and makes calls

    If you're going to attempt to make me a criminal by your definition

    Then I don’t want your product

  78. The Central Scrutinizer

    This has got "clusterfuck" written all over it.

    1. Anonymous Coward
      Anonymous Coward

      I have called it a Ratners.

  79. CrackedNoggin Bronze badge

    The actual law enforcement budget for following through on child porn leads is not nearly enough to keep up with all the evidence already assembled by volunteers who infiltrate paedo groups on the internet. It's just a few million dollars for the whole USA.

  80. Lil Endian

    Honeypot?

    It might be an interesting exercise to test the resilience of the ML system against a honeypot. A write-<scan>-delete rinse and repeat process. Or a pictorial zip bomb type thing.

    Just as an academic exercise, you understand...
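    For what the "academic exercise" might look like in the abstract - purely hypothetical, exercising no real Apple API, with `scan_hook` a made-up stand-in for whatever the on-device scanner does:

```python
import os
import tempfile

# Hypothetical write-<scan>-delete churn loop, as the comment describes:
# write an image, let the scanner see it, delete it, repeat.
# 'scan_hook' is an invented placeholder for the scanning step.

def churn(image_bytes: bytes, iterations: int, scan_hook=lambda path: None) -> int:
    """Run the write/scan/delete cycle 'iterations' times; return the count."""
    written = 0
    with tempfile.TemporaryDirectory() as d:
        for i in range(iterations):
            path = os.path.join(d, f"img_{i}.jpg")
            with open(path, "wb") as f:
                f.write(image_bytes)
            scan_hook(path)   # the scanner would hash/inspect here
            os.remove(path)   # gone before the next iteration
            written += 1
    return written
```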

  81. Anonymous Coward
    Big Brother

    I've just ordered a feature phone. Nice big buttons, does calling and texting.

    Apple, Google, Microsoft, they can all fuck off.

    1. Charles 9 Silver badge

      Did you know Facebook is working its way onto FEATURE phones now? Pretty soon, every phone out there will feature either Big Brother or unsupported frequencies...

  82. Jonjonz

    This does not add up.

    How often do multinational corporations suddenly decide they exist to become a vigilante against one specific type of crime, and invest significant resources in the process?

    Nada.

    How often do multinational corporations get in bed with the state to cooperate in the surveillance and data mining of individuals? Hmm, sounds like more familiar territory.

    Don't pay any attention to this massive AI we slip-streamed onto your device as it eats CPU cycles. It's for the children! Trust us to look after you while we sell every bit of data on you to the highest bidder (we don't call them that, we call them business associates, to skirt the law).

  83. JavaJester
    Alert

    Why Stop with iPhones?

    Now that Apple has shown the world that using technology to surveil and control users is appropriate, why should governments stop with iPhones? There is a whole world of electronic devices waiting to be put into surveillance service. The company that ran the 1984 Super Bowl advertisement has all but invited 1984-style surveillance on our portable telescreens.

  84. Tron Bronze badge

    The return of the Amiga at last, with an OS that is not spying on you.

    Apple have just undermined trust in computing generally and in their own products, specifically.

    This raises the question of what we do when we can no longer trust the OS provider not to auto-scan our files.

    To OS provider, we can add software provider, cloud provider, Webmail company, VPN and other online software service provider. Maybe even firewalls have ears.

    The next popular application may be a sandboxed Works/browser package, but I guess that could be bugged by the OS vendor when it uses the screen or printer.

    There is the option of an offline system. Once a system is set up, you should be able to use it offline, encrypting any data that you then feed into an online system to e-mail. W7 works OK offline. Not sure about the latest versions of Apple and MS.

    As the three main OS providers are American, governments outside Washington have a problem, as the Americans can simply order Apple to do their dirty work in the name of national security. If you have pre-patent designs for something new on your system, will they be auto-scanned? A non-American next generation anything would be a threat to US national security.

    China are going to be knocking on Apple's door real soon with a lengthy wishlist, should they want to continue operating in the Middle Kingdom (whilst mandating Huawei for members of the party).

    This is an absolute train wreck that we did not need on top of Covid and climate change. But perhaps it will stimulate a new round of development as companies offer options that protect against spyware built into the OS, and alternatives. Raspberry Pi? Distributed systems? Fax?

    Of course, if the USG was already doing this, they won't be pleased that Apple has made the whole planet aware of it being an issue.

    1. Charles 9 Silver badge

      Re: The return of the Amiga at last, with an OS that is not spying on you.

      "China are going to be knocking on Apple's door real soon with a lengthy wishlist, should they want to continue operating in the Middle Kingdom (whilst mandating Huawei for members of the party)."

      Maybe that's the reason for all this. China may already be knocking with a list of demands. And unlike last time, Apple's potential counter of packing up and leaving may be accepted because China now has a strong homegrown phone market and may well be willing to go without iPhones in their country. Who's got the most to lose now? China's access to an American icon they can just pillory, or 1 1/2 billion potential customers for Apple?

  85. gdbc

    They're after your memes. The only thing this has to do with paedophilia is the marketing guys getting you and "mum" to accept it. People aren't that stupid, Apple. It's lazy; the first thing anyone wanting more power does is state "It's because of the children!". No doubt Google will follow suit. Soon a "hate" image will be you taking the p*ss out of the "wrong" electoral candidate or suggesting an election was rigged.

    Quick question: Was this announced before or after the US senate passed the "Infrastructure" bill?

    1. Irony Deficient Silver badge

      Was this announced before or after the US senate passed the “Infrastructure” bill?

      Since the US Senate has not yet voted on the infrastructure bill (apart from a vote to invoke cloture to prevent the bill from being filibustered), the possibilities are either that this was announced before the Senate approved the bill, or that this was announced before the Senate rejected the bill.

      What do “our memes” have to do with the Senate vote on the infrastructure bill? Is there some sort of anti-meme legislative proposal buried within it?

  86. msobkow Bronze badge

    AI is not able to do this reliably any more than it can identify individuals accurately.

    The problem is that society and industry have this perverse idea that self-adjusting pattern matchers are "intelligent." They aren't. They're just very, very fast at doing the matches and providing CANDIDATES that need to be REVIEWED by HUMANS.

    And we all know how big the tech industry is on hiring competent people to curate posts and content elsewhere on the 'net.

    I expect there to be PLENTY of false charges, investigations, and MASSIVE lawsuits against Apple over the resulting slander claims.
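    The "candidates, not verdicts" point can be sketched with toy 64-bit perceptual hashes compared by Hamming distance - a common similarity measure for such hashes, though the function names and threshold here are invented for illustration:

```python
# A matcher like this only yields CANDIDATES above a similarity threshold;
# nothing it emits is a verdict. Toy perceptual hashes as 64-bit ints.

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def candidates(photo_hashes: dict, database_hashes: list, max_distance: int = 8):
    """Return (photo_id, db_hash, distance) triples for HUMAN review.
    Near-matches within 'max_distance' bits are queued, never auto-judged."""
    queue = []
    for pid, h in photo_hashes.items():
        for db in database_hashes:
            d = hamming(h, db)
            if d <= max_distance:
                queue.append((pid, db, d))
    return queue
```

    Every tuple in the queue is exactly the thing the comment says must be reviewed by a competent human - and hence exactly where an understaffed moderation pipeline turns near-misses into false accusations.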

  87. R.O.
    Big Brother

    Today and tomorrow and later

    When they say it's about the pedophiles you know right away they are lying and it's really about expanding police state mass surveillance. I guess we should have known Apple's apparent commitment to privacy was just another PR and marketing scam.

    Today it's about paedophiles, tomorrow parking and traffic law enforcement, then how much you love Big Brother.

    It's a continuum.

    1. Anonymous Coward
      Anonymous Coward

      Re: Today and tomorrow and later

      The endgame is clearly to put more old widowed ladies in prison for not paying their TV license / council tax.

  88. Anonymous Coward
    Anonymous Coward

    It’s difficult to object

    If this achieves its objective then what's not to like? But I fear the doom mongers might be on to something, and this could well be the very thin end of the wedge. Other solutions to this growing problem are not forthcoming; further, I suspect the scale of this sickness is beyond our wildest imaginings. What a quandary.

  89. Pen-y-gors Silver badge

    Alternative headline

    "Apple will hold the unencrypted database of photos (really the training data for the neural matching function)"

    Apple now owner of world's largest stash of kiddie-porn! It's for research - honest! says senior exec.

  90. the Jim bloke Silver badge
    Angel

    In an amazing move of social redemption..

    Apple have adopted a position that makes child abusers the good guys...

    no sarcasm... but drowning in irony..

  91. albaleo

    Apple have added a FAQ on their website. I presume this is in response to the various outcries.

    https://www.apple.com/child-safety/pdf/Expanded_Protections_for_Children_Frequently_Asked_Questions.pdf

    1. Nifty Silver badge

      Little changed from the first announcement. The wording of the 2nd paragraph, which refers to all iOS devices, not just family-account-managed ones, says it's an in-phone scan, followed by an upload of suspicious images to iCloud. It does not say that opting the device out of iCloud will defeat this feature.

      The flow diagram that Apple initially published did indicate that it happens regardless of whether iCloud is enabled.
