Apple's bright idea for CSAM scanning could start 'persecution on a global basis' – 90+ civil rights groups

More than ninety human rights groups from around the world have signed a letter condemning Apple's plans to scan devices for child sexual abuse material (CSAM) – and warned Cupertino could usher in "censorship, surveillance and persecution on a global basis." The US-based Center for Democracy and Technology organised the open …

  1. Anonymous Coward
    Mushroom

    Apple has learned a lot from China

    Someone needs to tell Apple that the US is not China.

    And no, I am not in the screeching minority - whatever that is. I do, however, have a brain. And so do many others.

    The idea of a private, constant, secret and for-profit US Department Of Suspicion - which, in my view, is exactly what Apple is trying to become - just does not sit well with a lot of folks, myself included.

    I can't wait for the federal lawsuits that will be triggered by this moronic idea if Apple doesn't backtrack and forget about it.

    1. Scott 26

      Re: Apple has learned a lot from China

      > I can't wait for the federal lawsuits that will be triggered by this moronic idea if Apple doesn't backtrack and forget about it.

      Hopefully before innocent lives are ruined

      1. Doctor Syntax Silver badge

        Re: Apple has learned a lot from China

        The guilty will already have got the message. Don't buy Apple.

        1. Fruit and Nutcase Silver badge
          Big Brother

          Re: Apple has learned a lot from China

          The problem is there are scores of people who've never bought Apple, and they're going to be tarred with the same brush - squeaky-clean Apple and its users on one side, the rest presumed criminals.

          Anyway, no surprises here - it's something that Apple have been cultivating over the years when it comes to movies...

          "Apple won't let bad guys use iPhones in movies, 'Knives Out' director Rian Johnson says"

          "Trying to figure out who the bad guy is in a movie? Look for the character that hasn't touched an iPhone."

          https://www.cnbc.com/2020/02/26/apple-wont-let-bad-guys-use-iphones-in-movies-says-knives-out-director.html

          1. Graham 32

            Re: Apple has learned a lot from China

            Isn't it more a case of "Apple won't pay you for product placement if you let the bad guys use iPhones"?

            It's not like every car manufacturer has to approve the use of their vehicles in films.

            I know the 90s were a different time but I doubt Jaguar approved "For men who'd like hand jobs from beautiful women they hardly know." https://www.youtube.com/watch?v=XzyNPoI17rE

            1. jake Silver badge

              Re: Apple has learned a lot from China

              One wonders what Jaguar thought of 1971's "Harold and Maude" ...

              If any of you kiddie commentards haven't seen it, it's well worth a watch.

          2. anonymous boring coward Silver badge

            Re: Apple has learned a lot from China

            ""Apple won't let bad guys use iPhones in movies, 'Knives Out' director Rian Johnson says""

            Huh? So how would Apple stop that then?

            It's more probable that Apple pays for product placement, but won't pay if iPhones are used by the villains.

            1. Charles 9

              Re: Apple has learned a lot from China

              Trademark infringement suits sound like the most likely route. Remember, any product placement usually has to have the manufacturer's OK before it can appear.

    2. Anonymous Coward
      Anonymous Coward

      Re: Apple has learned a lot from China

      Someone needs to tell the US that the US is not China.

      "The idea of a private, constant, secret and for-profit US Department Of Suspicion"

      How about a not-for-profit group?

    3. elsergiovolador Silver badge

      Re: Apple has learned a lot from China

      Someone needs to tell Apple that the US is not China.

      Well, the US is now run by Democrats and authoritarians from that church look up to China as a model for how the state should run its people.

      1. gandalfcn Silver badge

        Re: Apple has learned a lot from China

        Erm, seems you don't know much about democracy. Also, wasn't Trump the one who was big buddies with Putin, Xi and Kim?

        Presumably you conflate public good and safety and authoritarianism. You need to read your Constitution, as do Trump and the majority of his supporters.

        1. elsergiovolador Silver badge

          Re: Apple has learned a lot from China

          What's up with this binary thinking? Let me break something to you: there are authoritarians in the Republican party too - they, however, prefer a German version of authoritarianism.

          1. Swarthy

            Re: Apple has learned a lot from China

            To be honest, both parties are authoritarian. Even US Libertarians are authoritarian. It's not that the parties have authoritarian elements, or leanings - they are straight-up authoritarian. The main difference is the flavor of authoritarianism: corporate or government.

            There is no non-authoritarian element in US politics; Bernie is as close as we get, and he's not that close.

            1. gandalfcn Silver badge

              Re: Apple has learned a lot from China

              Indeed, but the GOP and its corporate and religious masters are seriously authoritarian nowadays.

    4. Anonymous Coward
      Anonymous Coward

      "Department Of Suspicion"? Apple has learned a lot from The Stasi.

      There. Fixed the Title for you.

    5. Down not across

      Re: Apple has learned a lot from China

      I can't wait for the federal lawsuits that will be triggered by this moronic idea if Apple doesn't backtrack and forget about it.

      Cat's out of the bag. Would you believe Apple if it said it had changed its mind?

      I can see potentially a lot of pressure from LE and/or governments now that Apple has dangled this "backdoor" into the "unbreakable" iDevices.

    6. martyn.hare

      Don’t forget

      Every image which goes into an MS Office document/note/presentation is subject to the same kind of checks, because the image is uploaded to Microsoft’s cloud in order to add alt tags using AI analysis, even if you’re NOT using SharePoint or OneDrive to store the actual document. Microsoft uses PhotoDNA, which relies on a similarly broken algorithm, and it does NOT adequately disclose its use outside of OneDrive, nor does it explain how it works.

      If we decide it’s wrong when Apple does it then it is definitely wrong if Microsoft does it!

      1. Newold

        Re: Don’t forget: a slightly different thing

        Sorry, but Microsoft (like all other cloud providers, Facebook and many others) scans for CSAM *on their servers*.

        Apple wants to scan *on your devices*, plus filter iMessage and have Siri and Search detect "unsafe situations" and searches for CSAM-related topics.

        To me, that's a massive intrusion into privacy and security, which can and most certainly will be abused - and not only in Russia and China.

        (Sorry for errors, I'm not a native English speaker.)

        1. Anonymous Coward
          Anonymous Coward

          Re: Don’t forget: a slightly different thing

          If they scan for it on their servers, aren't they, because of strict liability, also guilty of possession? Can you imagine the shitstorm if every Microsoft employee was added to the sexual offenders list...

          1. elsergiovolador Silver badge

            Re: Don’t forget: a slightly different thing

            If they scan for it on their servers, aren't they, because of strict liability, also guilty of possession?

            Imagine if someone rents a flat and has some drugs hidden in the closet. Should the landlord or agency come over every now and then and do a thorough search? In a way, that is what server-side scanning amounts to.

            I guess users agree to that by accepting the terms of service.

            A personal device, just like your own home, should be off limits unless there is a warrant.

            Some people do mental gymnastics and try to justify it by saying that Apple wants to maintain end-to-end encryption and scanning on the client is the only way to maintain privacy. But that just sounds like Orwellian doublespeak. If they really want to scan through people's stuff and they cannot do it with end-to-end encryption (akin to going into someone's rented flat), then perhaps they should stop offering such a service altogether.

        2. gandalfcn Silver badge

          Re: Don’t forget: a slightly different thing

          "Apple wants to scan *on your devices*," No, In the cloud. i.e. a server.

      2. Doctor Syntax Silver badge

        Re: Don’t forget

        "Every image which goes into an MS Office document/note/presentation is subject to the same kind of checks because the image is uploaded to Microsoft’s cloud"

        Even if the PC is off-line?

        Not that I wouldn't put it past them if they got the chance. As I recall from the early days of W10 their T&Cs were artfully worded so as not to exclude them from almost anything.

        1. elsergiovolador Silver badge

          Re: Don’t forget

          I just filled my OneDrive with garbage to the brim, so it does not sync anymore. The annoying thing is the pop-ups trying to sell me more storage.

          Probably nothing stops them from uploading a document, scanning it, and then deciding it won't fit...

        2. gandalfcn Silver badge

          Re: Don’t forget

          Interestingly, M$ has been doing this for years yet not a tweet from the paranoid Apple haters.

          " the image is uploaded to Microsoft’s cloud" / the image is uploaded to Apple’s cloud"

  2. Cybersaber

    "Now infosec bods are reverse-engineering the technology to see how it really ticks. If it can be shown that the image-matching algorithm can be fooled into flagging up an entirely innocent picture, such as via a hash collision, it will render the content-scanning tech of limited use for its stated purpose."

    This article is slightly behind the Register article earlier today reporting that someone has already done exactly that. It took less than a day after the public part of the algorithm was released (a toy sketch of why such collisions are possible follows below).

    Apple isn't this dumb. This is a 'think of the children' smokescreen to get us onto the slippery slope that countries like the PRC are strong-arming them towards, while trying to save face with this farce of a feature.
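    For anyone curious what a collision actually means, here's a deliberately simplified sketch - a toy "difference hash" in Python, NOT Apple's NeuralHash - showing why perceptual hashes can match two quite different images: only relative brightness is recorded, so anything with the same brightness ordering hashes identically.

    ```python
    # Toy perceptual hash: each bit records whether a pixel is brighter than its
    # right-hand neighbour. This illustrates the general technique, not Apple's
    # NeuralHash, which uses a neural network rather than raw pixel comparisons.

    def dhash(pixels):
        """pixels: 8 rows of 9 greyscale values (0-255); returns a 64-bit integer."""
        bits = 0
        for row in pixels:
            for left, right in zip(row, row[1:]):
                bits = (bits << 1) | (1 if left > right else 0)
        return bits

    def hamming(a, b):
        """Number of differing bits; a small distance is treated as a match."""
        return bin(a ^ b).count("1")

    # Two different "images": a smooth gradient and a much harsher picture, but with
    # the same left-to-right brightness ordering in every row - so identical hashes.
    gradient  = [[252 - 28 * i for i in range(9)] for _ in range(8)]
    lookalike = [[255, 200, 180, 90, 80, 70, 30, 20, 5] for _ in range(8)]

    print(hex(dhash(gradient)))                        # 0xffffffffffffffff
    print(hamming(dhash(gradient), dhash(lookalike)))  # 0 -> a "match", i.e. a collision
    ```

    Real perceptual hashes are far more sophisticated, but they make the same robustness-versus-exactness trade-off, which is why researchers could manufacture NeuralHash collisions so quickly.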

    1. Charles 9

      I found this article in the Washington Post about a parallel project and the finding that the technique is inherently dual-use, like a knife. Meaning it's nigh-impossible to prevent it being abused, especially by a sovereign power. As I've said before, I think the only reason Apple are announcing it now is because they are being pressured into including it: likely by China.

      https://www.washingtonpost.com/opinions/2021/08/19/apple-csam-abuse-encryption-security-privacy-dangerous/

  3. Pascal Monett Silver badge

    Finally

    Seems like the crickets have finally been silenced.

    It's good to hear that there are official entities raising shields against this egregious invasion of privacy.

    Think of the children is a nice excuse - let's not push it too far.

    1. big_D Silver badge

      Re: Finally

      The German Government's Media Agenda committee said Apple should remove it immediately, and:

      “größte Dammbruch für die Vertraulichkeit der Kommunikation, den wir seit der Erfindung des Internets erleben” (the biggest breach of the dam for the confidentiality of communications that we have seen since the invention of the Internet).

      “Every scanned content destroys the trust of users, that their communications are not being monitored. Without trusted communications, the Internet will become the biggest surveillance instrument in history.”

      1. Newold
        Unhappy

        Re: Finally

        Whatever politicians say, it's not worth a penny - they change their minds more often than their underwear.

        In this case I guess that as soon as the "Innenminister" (interior minister), the police and the secret services have talked up the great opportunities to fight terrorism, drug smugglers and other serious crimes (like violation of copyrights), the rest of the "Bundesregierung" (federal government) will accept this tech - of course with "Bauchschmerzen" (misgivings; literally, stomach ache).

        (I'm German, sorry for errors in my English.)

        1. big_D Silver badge

          Re: Finally

          I fear you may be right. But it is at least a good sign at the moment.

          Wir können hoffen… (We can hope… I’ve lived in Germany for 20 years now.)

    2. teknopaul

      Re: Finally

      Re think of the children.

      Humans will review all suspect photos? There'll be a long queue of paedos applying for that job.

      That will happen; whether we get to hear about it is of course up to Apple, who typically insist, when they are dragged through the courts, that the details remain private.

      1. Doctor Syntax Silver badge

        Re: Finally

        AIUI the proposed system is that if there are sufficient hits TPTB are notified. Nobody in Apple gets to look. It avoids the risk of being accused of being in possession. They are passing on suspicion, not knowledge.
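        As a rough sketch of that "sufficient hits" idea (my own simplification in Python, not Apple's actual protocol - it leaves out the cryptography, such as private set intersection and threshold secret sharing, and the threshold value here is made up):

        ```python
        # Simplified threshold-reporting sketch; hash values and threshold are placeholders.
        KNOWN_HASHES = {0x1A2B3C4D, 0x5E6F7081}   # stand-ins for the known-CSAM hash list
        THRESHOLD = 30                            # hypothetical match count before escalation

        def should_escalate(image_hashes, known=KNOWN_HASHES, threshold=THRESHOLD):
            """Count images whose hash is in the known list; escalate only past the threshold."""
            matches = sum(1 for h in image_hashes if h in known)
            return matches >= threshold

        # A library with only a handful of matches (false positives or not) stays silent.
        library = [0x1A2B3C4D] * 5 + list(range(1000))
        print(should_escalate(library))   # False - below the threshold, nobody is notified
        ```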

  4. sev.monster Silver badge
    Childcatcher

    Coming at this from the angle that Apple is pushing, I think everyone with a functioning, modern moral compass can agree that harming innocents—children or otherwise—is a Bad Thing. But I personally do not think it is or should be Apple's responsibility to become the Arbiter of All that is Good. Companies and governments should stay well away from private citizens, even if it means that Bad Things can potentially occur. In addition, I believe the protection of our youth should be the responsibility of the parent(s) and the community around them, not a faceless international megacorp using proven less-than-reliable automated detection methods.

    I also have reservations about the idea that someone dumb enough to store such unmentionable, illegal files on their iCloud would also be the type to actually produce such content. How will it help law enforcement find and capture active producers? Will this technology even help stop the harming of children? Given the sensitive nature of the matter, I doubt we would ever know, should the technology be used for real. And that makes it even less trustworthy.

    1. Charles 9

      "I believe the protection of our youth should be the responsibility of the parent(s) and the community around them..."

      Problem being what happens when parents and others abdicate that responsibility. There has to be some sort of fallback to avoid anarchy...

      1. Anonymous Coward
        Anonymous Coward

        Then those parents will be prosecuted under existing laws and those children will be looked after by existing organisations. Nothing changes there, we have plenty of support in place already.

        1. gandalfcn Silver badge

          "Then those parents will be prosecuted under existing laws". Really? Do you honestly believe that?

      2. Anonymous Coward
        Anonymous Coward

        "Problem being what happens when parents and others abdicate that responsibility."

        Have an upvote for the sentiment. But this has nothing to do with Apple's photo scanning plan.

      3. big_D Silver badge

        Also, in many cases, the parents either turn a blind eye or are themselves involved.

        There was a case about 2 years back, where a couple was pimping out their sub-teen daughter.

        Likewise, a couple of weeks back, a judgement was passed on a mother whose partner had been abusing her boy for years, selling access to him, and filming and uploading the abuse - this is the so-called Schrebergarten scandal in Germany from 2019, where a ring was being run out of a chalet on an allotment. The mother found out her child was being abused, but didn't do anything about it.

        So, leaving it up to parents isn't always a good idea.

        Certainly parents should be responsible for a child's safety and for ensuring it knows the difference between right and wrong, instilling discipline (non-violently), etc. But there are plenty of abusive parents (and spouses, for that matter) out there, so the state and the police have to take action.

        1. gandalfcn Silver badge

          That you were downvoted says rather a lot about the lack of morals of the downvoters.

          1. sev.monster Silver badge

            I downvoted because, while it's terrible that that happened, "think of the children" is not a valid excuse to allow private corporations such power. They already have enough as it is.

            Pray tell, what would Apple's setup here provide? Again, I very much doubt that the perp would have uploaded the content to iCloud or similar, and even if he did, if the hash isn't indexed it won't be detected. In either case, there are other, less incriminating places to store such things that I would think someone daring enough to pimp out their own child would opt to use. And on that note I don't see how Apple's technologies are going to help stop face-to-face pimping of children.

            I can't speak for Germany on what they need to do to help curb such behavior, but it ain't this. And no matter what one does, there will always be those that break the rules and get away with it long enough to commit atrocities. It is a part of life and we should not let emotional reaction to sensational incidents guide us to making poor decisions that affect a large percent of the populace.

    2. Doctor Syntax Silver badge

      Brits of a certain age will remember the Gary Glitter incident. And those with any form of legitimate but confidential information on their PC would immediately have realised where not to take it for repair should that become necessary.

    3. DevOpsTimothyC

      Not an iWhatever user, but doesn't almost every image you take / receive automatically get uploaded to the iCloud account in case you break your iThing?

      1. doublelayer Silver badge

        That's an option, so for lots of people, yes. Another option is to automatically backup the system to iCloud, so that's another way it could happen automatically. Both probably have to be disabled to prevent it.

        1. DevOpsTimothyC

          Is the default option to sync to iCloud?

          1. sev.monster Silver badge

            You are prompted when first setting up your phone. An Apple ID is required to set up an iPhone as far as I remember, and syncing everything to iCloud is provided as a yes/no option. You can toggle it off later.

            Also not an iThings user so I could be wrong.

      2. gandalfcn Silver badge

        No. It is an option.

    4. Richard 12 Silver badge

      It cannot protect anyone at all

      Apple are proposing searching iDevices for known CSAM images.

      Anyone abusing kids to create new images cannot be detected this way, by definition.

      So one fairly obvious consequence is an increase in abuse as people try to create new, undetectable images by abusing more, new victims.

      As a parent, that terrifies me.

      1. sev.monster Silver badge

        Re: It cannot protect anyone at all

        That is actually a really good point. Make the usual stuff harder to find by busting the low-level consumers and possibly distributors, and there will be someone out there driven enough to fill the power vacuum. I wonder if there is any research on that?

  5. aerogems Silver badge
    FAIL

    I absolutely applaud the intent behind what Apple is doing, but even the people who invented this system said that it would be very easy to adapt to less noble goals and shouldn't be used without significant safeguards in place. I'm not convinced, and it seems I'm in good company, that Apple has put those in place. On that score it rather feels like the cattle rustlers have already made off with the cattle, burned the barn down, and then paved over and built a high-rise office park in the barn's place.

    Fail icon because someone already used the mushroom cloud and I was torn between the two.

    1. Doctor Syntax Silver badge

      It's inherently not secure, not because of the implementation, whatever faults it might or might not have, but because the capability itself is neutral and can be turned to any form of surveillance Apple might choose or be pressured to choose.

      1. gandalfcn Silver badge

        It may be, but then the problem would lie with the elected government doing the pressuring, wouldn't it, and not with Apple.

        1. DevOpsTimothyC

          Most of the big names on the internet do some sort of "against our T&Cs" enforcement for content they don't like. Apple are just pushing that onto people's phones/tablets etc.

  6. Peter Prof Fox

    What if say Ford or Toyota...

    What if motor manufacturers built in tracking systems? Great for tracking down speeders (good), but also great for finding out who attended which rave or political meeting. Where data is banked against the time it might be 'needed' by some enforcement agency, there is no opportunity to mount the most obvious defence: "My car may have been there, but so what?" Joe McCarthy lives! According to our records (sorry citizen, you can't get a complete copy of what we (and our partners) hold on you) you're a heterodox recidivist. Your car can be as good as a fingerprint, even if someone else was driving. As AI gets more prevalent (I didn't say more sophisticated) there will be many more false connections, and it will be up to YOU to come up with a defence against a fluffy snowstorm of 'evidence'.

    1. gandalfcn Silver badge

      Re: What if say Ford or Toyota...

      Your paranoia is palpable.

      1. sev.monster Silver badge

        Re: What if say Ford or Toyota...

        Are you even paying attention to the outside world? As we speak your ISP is likely tracking your every page visit and selling that information to advertisers, and depending on your locale that data may be linked to your full name, email address, etc. It's 100% legal in the USA to track complete web history for I believe up to 3 months, ever since Ajit Pai wrecked the place and got rid of Net Neutrality.

        And it is more than trivial to buy a matching record from yet another completely legal information broker, which contains not only your name but any past/married names, all the addresses you've ever lived at (i.e. registered with your government), your family and their information, your pets, your job history, employment status, tax bracket, and who knows what else. I know this for FACT because I used to work for a company that had unlimited access to Acxiom's databases, and it had EVERYONE - and I mean EVERYONE - I ever searched for.

        Your very life is just another record in a database, and that should scare you. Giving companies even more power to track us is not what we need. Of course your government likely has plenty of dirt on you, but at least there's some assurance that they probably don't want to share that trove of data with the likes of say Apple. God help us if there were a leak of that data and it got merged with other databases of information—millions of people would have their entire identity and history up for grabs, from top to bottom.

  7. Anonymous Coward
    Anonymous Coward

    One, very large, rotten apple

    I hope the barrel can be saved

  8. Anonymous Coward
    Anonymous Coward

    Apple Customers.

    From Fashion Victims to Fascist Victims in one easy step.

    1. Lord Elpuss Silver badge

      Re: Apple Customers.

      Here's an idea, f*ck off with your Apple customer bashing. This is abuse of privilege by a large company, nothing more or less.

    2. gandalfcn Silver badge

      Re: Apple Customers.

      The only fascists in the USA are the GOP and their supporters, and there are rather a lot of them here bashing Apple.

  9. gandalfcn Silver badge

    So the situation is: do nothing about the problem of child sexual abuse, because whatever you do will be wrong.

    This sounds just like the 180 year fiasco in Afghanistan.

    1. Anonymous Coward
      Anonymous Coward

      "Do nothing about the problem of child sexual abuse because whatever you do will be wrong"

      It's not a binary choice. There are PLENTY of things we can and do do about this which are not massive invasions of privacy and rights.

      1. gandalfcn Silver badge

        "There are PLENTY of things we can and do do " Really? So why is it such a major problem? A problem that seems not to be going away.

        1. sev.monster Silver badge
          Childcatcher

          ...Because human beings are [/have the capability to be] monsters? I'm sorry to tell you this, since it seems like you didn't know.

    2. doublelayer Silver badge

      "So the situation is. Do nothing about the problem of child sexual abuse because whatever you do will be wrong."

      No, the situation is do something else about the problem of child sexual abuse because what you're currently doing is wrong. False dichotomy rejected.

      1. Charles 9

        Rejection rejected. ANYTHING you attempt to deal with the child abuse problem is going to be (a) impractical or (b) too prone to collateral damage. For cultural and perceptual reasons, if nothing else.

        1. doublelayer Silver badge

          Many would disagree with you. Existing police investigations with limited warranted surveillance have a higher approval rating than this does. Hence, a different method which works and isn't by its nature evil.

        2. Anonymous Coward
          Anonymous Coward

          The problem is simple. People need to stop putting forth proposals to police and monitor the majority of people in order to catch a very small minority of perpetrators.

          As with most social ills, many people take it personally (perhaps a family member was abducted, for one horrid example), and have no issue with tromping over everyone's rights to get revenge and a "magic way" to stop it from happening again.

          There are no "magic solutions" that aren't more reminiscent of 1984 than any free society history has ever dreamed up.

    3. big_D Silver badge

      Other cloud companies have been dealing with this problem for years. They check what is loaded onto their servers, as it is loaded.

      They don't break the sanctity of the local device.

      Apple has just moved the discussion on from "the local device is sacrosanct" to "anyone can search for anything they want on the user's devices" (if they apply enough pressure).

    4. Charles 9

      Sometimes, the cure is worse than the disease. I can think of worse things in the world than this, after all...

    5. Anonymous Coward
      Anonymous Coward

      Yes! Finally someone tells it like it is!

      Apple users are abusing children left right and center. If Apple doesn't spy on them, it's failing in its duty to stop child abuse by its customers.

      Why should Apple even have to tell its customers about the scanning? Since when do you tell criminals they are about to be caught? It should just be free to scan their phones whenever it wants, for whatever it wants, to save the children from abuse.

      It should be free to monitor their cameras, and record their microphones to protect children from being abused by Apple's customers.

      If Apple doesn't search its customers' phones, they will abuse children. Why, they are probably abusing children right now - why the delay!?

      You are the one person here to speak the truth, that Apple customers abuse children.

      It is exactly like the Taliban forcing women to cover up completely lest Apple users get excited and molest children. Or the Tellyban forcing OnlyFans users to cover up, lest Apple users get excited and molest children.

      If Apple don't save the world from Apple users and their molestering, then the USA has failed in Afghanistan.

      1. Anonymous Coward
        Anonymous Coward

        I always thought there was something shady about those silhouette ad characters of theirs.... *badum chunk*

    6. Richard 12 Silver badge

      This is actively encouraging abuse

      If such people think that they are very likely to be caught if they download "old" CSAM, then they will try to get - and create - new material.

      The goal should be to prevent abuse, not to encourage more abuse.

  10. Anonymous Coward
    Anonymous Coward

    Where is the line then?

    In other cultures children may run around in their birthday suit on the beach until approaching puberty.

    At least they used to back in my day.

    My point is that US/UK hysterics about nakedness could result in thousands of completely innocent images being classified as "indecent", or whatever the term is.

    1. AnotherName

      The thing that struck me about this when I first heard of it is that they are scanning the uploads to see if they match against a database of known images. Surely this idea will put more children at risk when offenders have to start making more *new* images to share - ones that aren't currently known to the authorities - to work around this matching process.

  11. I should coco

    Naked babies

    So I am a loving father of two kids, and yes, I have naked pics of them in the bath, in the pool etc. How is this handled? Do I have to submit an application to the government/Apple to say it's OK?

    1. big_D Silver badge

      Re: Naked babies

      As long as you don't send those pictures to an adolescent's device, you are fine.

      The CSAM database is a finite list of known abuse images that have been confiscated from known offenders. The signatures of these images are used to check images on cloud services (or in Apple's case, on the local device, before they are uploaded), so unless you somehow manage to recreate one of those images, you are okay.

      The content warning is between child and parent: if the child receives (or tries to send) photos including nudity, they will be warned, and if they continue to send/view the image, the parent will be informed.
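      For what it's worth, here's a rough sketch of that warning flow as I've described it (my own simplification; the function and its parameters are made up for illustration, and the real feature runs on-device inside Messages for child accounts):

      ```python
      # Simplified sketch of the child/parent warning flow described above - not Apple's code.
      def handle_nude_image(child_proceeds_anyway: bool) -> str:
          """What happens when a child account receives (or tries to send) a nude photo."""
          # Step 1: the image is hidden/blurred and the child is warned first.
          if not child_proceeds_anyway:
              return "child warned; image stays hidden; nobody else is notified"
          # Step 2: only if the child chooses to view/send it anyway is the parent informed.
          return "child warned; image viewed/sent; parent notified"

      print(handle_nude_image(child_proceeds_anyway=False))
      ```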

      1. Doctor Syntax Silver badge

        Re: Naked babies

        "The content warning is between child and parent,"

        The announcement at https://www.apple.com/child-safety/ does indeed mention warning parents but it also says

        CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.

        1. big_D Silver badge

          Re: Naked babies

          Yes, but this only covers known CSAM images, not images they make themselves.

          1. Cybersaber

            Re: Naked babies

            But Apple can change their minds at any time. What if their politics later decide that a dad taking such photos is child abuse?

            Arguments about what a company/government/person claims they won't do today don't hold water. What matters is what a tool can be used for. Minds change. Politics change. What is a crime, changes. History both shows and teaches us this, and failure to learn those lessons will bring tears.

            Read up on how and why the Nobel Peace Prize was created.

            This genie will not get put back in the bottle, especially when there are powerful governments around the world that will very, very much want it to be let out and will use their economic might to resist efforts to try to put the stopper back in.

            1. Lusty

              Re: Naked babies

              Indeed, we're literally now at the "prove you didn't take naked pics of your kids because you're a pervert" stage.

              At the very root of this, we need to ignore the big debate and concentrate on the fact that Apple are illegally scanning our images. There is definitely criminal activity here, but it's not the users, it's Apple.

          2. Anonymous Coward
            Anonymous Coward

            Re: Naked babies

            Not if the algo matches them. Apple and others have switched from straight MD5s to more AI-based fuzzy matches (a toy contrast between the two is sketched at the end of this comment).

            He shouldn't be photographing his kids naked. That CSAM set is marketed as child abuse, but that's the marketing, not the reality.

            It's an *expansive* set of images, viewed as sexual abuse by the *reviewers* that put them in. The *most* images they can include. The *widest* possible definitions.

            If the reviewer is turned on by his naked kids in the bath, then to him that's a child abuse photo, and his kids' photos can end up in that database.

            Imagine a childless, sexless, sad git in a dark office who, for some reason, chose a job where he sees child porn all day. Do you really want him seeing your kid's shit-covered ass on a changing mat, mixed in with the actual child porn he's reviewing? No. So don't take that photo.
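            To make the straight-hash versus fuzzy-match point above concrete, here's a toy contrast (my own illustration, not any vendor's real algorithm): a cryptographic hash changes completely when a single pixel changes, while a fuzzy comparison still calls the two images a match.

            ```python
            import hashlib

            # One tiny "image" and a copy with a single pixel nudged by one brightness level.
            original = bytes(range(64))
            tweaked = bytearray(original)
            tweaked[10] += 1
            tweaked = bytes(tweaked)

            # Exact cryptographic hashing: any change at all gives a completely different digest.
            print(hashlib.md5(original).hexdigest())
            print(hashlib.md5(tweaked).hexdigest())   # totally different -> no match

            # Fuzzy matching (toy version): compare pixel-by-pixel, tolerating small differences.
            def fuzzy_match(a, b, per_pixel_tolerance=4, max_differing_pixels=3):
                differing = sum(1 for x, y in zip(a, b) if abs(x - y) > per_pixel_tolerance)
                return differing <= max_differing_pixels

            print(fuzzy_match(original, tweaked))     # True -> still treated as the "same" image
            ```

            The second kind of matching is what widens the net: it catches recompressed or slightly edited copies, but it also means images that were never in the database can land close enough to score as hits.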

            1. sev.monster Silver badge
              Childcatcher

              Re: Naked babies

              As far as I can deduce hash matching is for "known CSAM", and the AI is only for the child protection scheme where parents are informed if their kids are taking nudes—which is surprisingly common, and something I am in favor of informing parents about (as long as the picture doesn't get back to Apple).

              Do some digging, and you'll find sites like Omegle were apparently once rife with those of questionable age using their iThing to record their things when strangers asked them to. If there were a way to inform parents about that, we could save a lot of kids from future regrets, and hopefully from future contact with potentially dangerous sexperverts asking to come over when mommy and daddy are on vacation...

    2. Doctor Syntax Silver badge

      Re: Naked babies

      Many years ago a TV newsreader became part of the news because the photo-processing company - or maybe just somebody who worked there - informed the police about just such a set of photos. This is Apple trying to reproduce that situation for the digital age.

      1. Anonymous Coward
        Anonymous Coward

        Re: Naked babies

        Has anyone important left Apple recently? A Mr. Ive, perhaps...

    3. Lord Elpuss Silver badge

      Re: Naked babies

      A few years ago a (female) friend of mine sent me a pic of her little ones - naked in the bath. I deleted the pic, then wiped and hard-reset my phone. I then deleted all iCloud backups, because you never know. Ultimately I couldn't deal with the stress that it might have survived all of this and be stored 'somewhere'; I ended up trashing the phone (iPhone SE), deleting my iCloud account and creating a new one.

      Paranoid possibly, overreaction almost certainly; but it needed to be done for my peace of mind otherwise I'd always have that feeling that it was buried somewhere, ready to be flagged and studied by some grubby investigator, right before my life was ruined forever. And this - THIS - is the kind of stress that intrusive, faceless, unchallengeable surveillance can give people. It's vile.

      And yes, I was very angry with the friend that sent me the pic. On your own device, probably ok. On somebody else's? Quite possibly criminal. Why the hell she thought it was ok to send it I will never know, but she learned quickly that it was very very not ok.

      1. Anonymous Coward
        Anonymous Coward

        Re: Naked babies

        I hope you aren't in the U.S. because you've just confessed to a crime for which there is no statute of limitation, and for which strict liability (no intent necessary) is applied. Enjoy prison.

        1. This post has been deleted by its author

        2. Lord Elpuss Silver badge

          Re: Naked babies

          What an utterly retarded viewpoint.

        3. sev.monster Silver badge

          Re: Naked babies

          Care to reference the exact wording? Because as far as my IANAL brain knows, problems come from creation, possession, and transferring to someone else. I vaguely remember a case where a man unknowingly downloaded lots of CP when he visited a shady site, and investigators pulled it off his drive - he was not charged, since it was transferred to him without his express consent or knowledge, and he made no attempt to further transfer it to anyone else. Unfortunately I am not getting any hits, else I would link it.

    4. SImon Hobson Bronze badge
      Stop

      Re: Naked babies

      So I am a loving father of two kids, and yes I have naked pics of them in the bath, in the pool etc

      As will many, probably almost all, parents. It's normal and one could argue that not having such photos would be abnormal.

      BUT in the UK it's illegal - and it's a strict liability offence, meaning that there is no allowance for "but it's just being a normal parent" or any other similar defence. So technically, being a loving and normal parent could end up with you on the sex offenders register, your life turned upside down, your family torn apart, etc, etc. Even the files not being accessible to you, or being files you don't know the existence of, isn't technically a defence.

      Even if at some point the police or CPS decide not to pursue the case, you'll have had all your computers taken for examination (don't worry, you'll get them back ... sometime ... perhaps a few years later ... and maybe even still working), you'll have lost your job because no employer wants to be seen employing a known paedophile, you'll never be able to go anywhere without the neighbours looking at you in a "funny way" because clearly there's no smoke without fire, and the kids will be traumatised by all that's going on. But that's OK, it's all done because "think about the kids".

      And no-one can argue that such a situation is absurd - because to argue against any measure brought in to "think about the kids" must mean that you are thinking about the kids in the wrong way. That's how things like this happen - the wedge goes in, and bit by bit it's driven in harder and harder, and people argue "think about the kids" until it's too late to get the wedge out when they realise what a crazy situation they've created.

      1. Charles 9

        Re: Naked babies

        "That's how things like this happen - the wedge goes in, and bit by bit it's driven in harder and harder, and people argue "think about the kids" until it's too late to get the wedge out when they realise what a crazy situation they've created."

        What if they never realize the situation...or worse, see the burning room and simply go, "This is fine"?

      2. Anonymous Coward
        Anonymous Coward

        Re: Naked babies

        "BUT in the UK it's illegal - and it's a strict liability offence meaning that there is no allowance for "but it's just being a normal parent" or any other similar defence."

        Not even, "It was planted on me"? Because otherwise, what's to stop someone dead-dropping a picture of a naked cherub (innocent enough, but just "guilty" enough, too) to every member of Parliament or something?

        1. stiine Silver badge

          Re: Naked babies

          " what's to stop someone dead-dropping a picture of a naked cherub (innocent enough, but just "guilty" enough, too) to every member of Parliament or something"

          Nothing at all. But forget about Parliament, instead think about 'the executives of your competition'...

          1. jake Silver badge

            Re: Naked babies

            "forget about Parliament, instead think about 'the executives of your competition'..."

            Seems to me that your MPs are executives of your competition.

          2. Charles 9

            Re: Naked babies

            "Nothing at all. But forget about Parliament, instead think about 'the executives of your competition'..."

            Parliament is the competition when you're an anarchist out to prove law is not worth it...

      3. Doctor Syntax Silver badge

        Re: Naked babies

        "As will many, probably almost all, parents. It's normal and one could argue that not having such photos would be abnormal."

        Based on a sample of how many?

        We certainly didn't take any photos of ours like that. I'm not aware of any other parents I know having done so. Maybe it's a generational thing.

        1. anonymous boring coward Silver badge

          Re: Naked babies

          "Based on a sample of how many?

          We certainly didn't take any photos of ours like that. I'm not aware of any other parents I know having done so. Maybe it's a generational thing."

          As I said above: It's a cultural thing. And, obviously, culture can also change with time.

          Certainly no-one cared one bit about kids running around naked on the beach back in the 70s. Or about the parents taking a few snaps when they were playing. Things might have been different in Britain.

        2. sev.monster Silver badge

          Re: Naked babies

          It was certainly very common here in the States. We also used to leave our doors unlocked... Now many of us don't even go outside unless absolutely required and have stay-at-home jobs. How times change.

  12. mark l 2 Silver badge

    Apple still haven't given any good reason why this scanning needs to be done on the device and not on Apple's own servers. If it is only concerned with images uploaded to iCloud, and not with looking for stuff on the device itself, why not just scan the photos that are uploaded to iCloud when they hit the iCloud server?

    1. stiine Silver badge

      They can't scan them on their servers because they're encrypted before being uploaded.

      1. doublelayer Silver badge

        You are incorrect. They are encrypted after being uploaded with Apple's key. Apple can decrypt them and has done so repeatedly to comply with warrants from law enforcement. Apple has not announced any plans to change this. If they do, which again they have not, it would not justify on-device scanning.

  13. Cybersaber

    There's two big angles to this

    One angle is the abuse of this scanning technology for nefarious ends, but it needs to be stressed that whether you're for or against it - this isn't the government. It's a private company:

    A) Usurping policing powers. Private citizens don't get the remit to take it upon themselves to demand access to people's persons and effects to search for and report evidence of crimes. Apple is a private corporation, not a government.

    B) Hacking or extortion (depending on your point of view). The option to 'not update' is not an option at all, because security updates are not unbundled from the feature updates. I can't choose to keep current functionality but keep my device secure. My choice is to either accept this awful practice, or render my phone insecure and unfit for purpose.

    This is a whole different (and more fundamental) problem than whether an _authorized_ entity like a government should or shouldn't do these things on my phone.

    1. stiine Silver badge

      Re: There's two big angles to this

      It's a private company today? What about tomorrow?

      And as for it being 'a private company,' well, you agreed with their ToS, didn't you? Guess what's in the ToS...

      1. Anonymous Coward
        Anonymous Coward

        Re: There's two big angles to this

        In civilised countries those ToS are illegal. The US will have to fend for itself.

    2. Anonymous Coward
      Anonymous Coward

      Re: There's two big angles to this

      In theory wouldn't this be a DMCA violation as well as a GDPR violation? Apple could be bankrupted from the fines...

    3. Doctor Syntax Silver badge

      Re: There's two big angles to this

      "My choice is to either accept this awful practice, or render my phone insecure and unfit for purpose."

      Are you sure those are the only alternatives?

    4. jake Silver badge

      Re: There's two big angles to this

      "My choice is to either accept this awful practice, or render my phone insecure and unfit for purpose."

      IMO, the iPhone is inherently unsecured and unsecurable, and is thus unfit for purpose. So I don't carry one. Simples.

  14. Anonymous Coward
    Anonymous Coward

    Bad Apple, or?

    Could it be Apple was forced to create this mass surveillance back door by Five Eyes and/or maybe other government agencies? And, is now sworn to secrecy by law?

  15. anothercynic Silver badge

    Screeching minority, eh?

    All I'll say is... if no-one stands up for your rights, then you're stuffed.

    You may wish to disagree with the 'screeching minority' all you like, but don't conflate their concerns about individual privacy and the flagged issues with implicit agreement of what the 'thing' is attempting to solve.

    I'd rather continue to be a 'screeching minority' and be proven wrong, than stand on the side of those just standing by saying 'oh, just suck it up already' - until the 'screeching minority' is proven right, when it's too late and everyone is screaming 'how did this happen?!'

    Sorry. Not sorry.

  16. Anonymous Coward
    Anonymous Coward

    There are a lot of companies working towards 1984. They're only 37 years late. Mind you, the character did say he had no idea what year it actually was... :)
