Apple's iPhone computer vision has the potential to preserve privacy but also break it completely

For a company built around helping people communicate, Apple sure has problems talking to folk. It pole-vaulted itself feet first into the minefield of Child Sexual Abuse Material (CSAM), saying that it was going to be checking everybody's images whether they liked it or not. Then, when people took this to mean that it would …

  1. Pascal Monett Silver badge

    It's simpler than that

    Apple is not the Police.

    It has no right, no mandate and no authority to touch my pics - even if I decide to upload them to a cloud service.

    That Apple is giving itself the right to do so should actually be illegal. If Microsoft declared that it had decided that Windows 10 would have a service that analyzes all pics on the local computer there would be outrage and an enormous backlash.

    Apple has done just that and all I hear are crickets.

    Something is very wrong here.

    1. Andre Carneiro

      Re: It's simpler than that

      The way I see it there’s quite a commotion in the Apple ecosystem. Very loud crickets, I’d say…

    2. big_D Silver badge

      Re: It's simpler than that

      Except Microsoft, Google, Facebook, Twitter et al are also doing the same CSAM checks, but only on stuff that is in the cloud - the checks are done on their servers, not on your local device.

      That is a big difference.

      Given that nearly every podcast and tech news service has been attacking this decision all last week, those must be very noisy crickets near you!

      1. Version 1.0 Silver badge

        Re: It's simpler than that

        They all say "We Value Your Privacy"

        $$$$$$$$$$$$$$$$

      2. elsergiovolador Silver badge

        Re: It's simpler than that

        "the checks are done on their servers, not on your local device."

        It's like having a landlord coming over and sweeping your place in case you may be hiding drugs.

        1. sabroni Silver badge

          Re: It's like having a landlord coming over

          No it isn't, read what you quoted:

          "the checks are done on their servers, not on your local device"

          It's like you sending some of your stuff to your landlord's house for storage and him taking a look at what you've sent. Very different to them coming round your gaff and looking at whatever they choose.

          But we're discussing what Google and MS currently do, not what Apple are proposing, which is maybe where you're getting confused?

    3. Charlie Clark Silver badge

      Re: It's simpler than that

      I think you'll find that obscene publication laws have pretty broad scope and lawmakers have made no secret of wanting to co-opt manufacturers. In addition, the terms and conditions for cloud services routinely cover such interventions (DMCA, et al.) and many US companies are subject to summary injunctions from the various US government agencies.

      But there are still several problems: simply possessing the images is not usually considered a crime. In a sense, once the images have been produced the damage (to the child) has been done, and everything after that is copyright. You also, always, run into problems with blanket bans because they tend to criminalise everyone, including those doing legitimate research and even the agencies themselves. I'll leave that for the lawyers to decide.

      The other main problem is that this is obviously only the first step along the AI road: if the AI gets good enough, then surely it will be able to detect the criminal activity of making child porn as it happens… If you think that sounds too good to be true, it probably is.

      1. JDPower666

        Re: It's simpler than that

        Not sure what country you're in where "simply possessing" child sex abuse images isn't a crime.

        1. Charlie Clark Silver badge

          Re: It's simpler than that

          It's largely a matter of definition as to whether a particular image is of child sex abuse. Infamously, a UK judge refused to be drawn on a similar definition of pornography and stated "I know it when I see it". And then you have the whole category of fictional images and, presumably, soon those generated by computers, because condemning these puts us in the realms of thought crimes. If someone thinks or even fantasises about doing something illegal, are they guilty?

          I recently came across an analogue picture I'd taken, several years ago, of a friend's younger children on summer holiday (so they were running around half-naked) and wondered whether such photos could be considered illegal. But then I also remembered that it has long been established practice for photo development labs to report suspect images to the authorities.

      2. DS999 Silver badge

        Re: It's simpler than that

        As the guy above said, in the US at least (can't speak for the laws in other countries, but most civilized countries will be the same) simply possessing child porn images is VERY MUCH illegal on its own! It's crazy to claim no additional harm results from people having and sharing copies of that - if your daughter was kidnapped, forced into sex acts, then later returned to you, would you think no further damage is being done to her by your next door neighbor having a copy of those images?

        Researchers have to jump through a lot of hoops to get access to child porn without running afoul of the law, and I imagine that's all pretty highly regulated. Which is as it should be, you wouldn't want pedophiles to be able to legally get it by claiming to be "researching" the topic.

        What Apple is planning to do is totally different from trying to detect new photos of child porn. I very much doubt it is possible for "AI" to reliably detect naked versus clothed subjects, or whether a subject is involved in any sex acts. Even less possible is reliably telling the difference between, say, a 12 year old who looks older than that age and an 18 year old who looks younger than that age - even more so if the subject's face is not clearly visible.

        So no, while the authorities (and probably a lot of regular people) would love to make it so a phone or every other device with a digital camera would refuse to take a picture or video of "child porn" so no new child porn could be produced, that's not on the horizon. Nor is Apple's claimed ability to detect whether a hash of an image matches the hash of an image of known child porn moving a step closer to that "no new child porn" capability. Unless we get true human equivalent AI we won't see this ability in our lifetime. And if we did, child porn producers would merely use older gear made prior to when such an ability was added, so while it might limit the output somewhat it sure wouldn't stop it.

        1. TheKnowAlotGuy

          Re: It's simpler than that

          One of the problems is that the treatment of pictures of nude or semi-nude children varies greatly from country to country.

          In some countries it's a serious offence, while in others it is natural for parents to snap photos of their small kids playing with water outside in less-than-dressed situations.

          If the algorithms trigger on those occasions, and a human thus ends up looking at the pictures, it is a serious breach of privacy.

          1. rg287 Silver badge

            Re: It's simpler than that

            "One of the problems is that pictures of nude or seminude children is very different from country to country."

            Sally Mann's "Immediate Family" (Wikipedia link) springs to mind.

            Publishing 13 nude photos (of 65 in the collection) of her kids was controversial in 1992. In her case the argument is more over the ethics of publishing photos of her kids in that fashion, and over them growing up with intimate childhood photos in public circulation, in the same way there are concerns about parents oversharing their kids' lives on social media these days.

            But it certainly wasn't pornographic, nor did it depict abuse. It would take a stonkingly sophisticated AI to make such distinctions, though.

            1. Snake Silver badge

              Re: jurisdiction and age

              It is also VERY VERY important to bring the question of "what is a child" into this discussion.

              The U.S. state of North Carolina was last week discussing raising the current legal minimum age of marriage... FROM 14

              https://www.dailymail.co.uk/news/article-9895789/North-Carolina-raise-marriage-age-14-16-bar-16-year-olds-marrying-21-year-olds.html

              So, interestingly, their laws declare a nude image of a 14-year-old to be "pornographic", but having *sex* with that 14-year-old is just fine as long as you can force them/their family into agreeing that there's a marriage associated with it.

              So, as usual for the fundamentalists behind some of these things, a 'convenient' double-standard.

              Age of consent varies depending upon the country and, as shown, even varies depending upon the jurisdiction within that country. Thailand, for example, is notorious for leniency in age of consent, so does Apple have the right to tell the Thai people that Apple will apply *its* social mores to anyone using Apple products within Thailand?

              Yes, it's a sticky can of worms. But they opened it and now they are the ones who have to deal with it.

            2. Anonymous Coward
              Anonymous Coward

              Re: It's simpler than that

              “ it would be a stonkingly sophisticated AI that could make such distinctions”

              Luckily it doesn't have to. The AI in this case simply has to compare it to a database of known abuse images compiled by professionals who vote to agree on illegality. The AI is not a model that identifies CSAM; it simply asks 'is this picture one known to me?' (though not necessarily binary identical, due to compression or cropping).
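              The 'known to me, but not binary identical' matching described above can be sketched with a toy average hash. This is purely illustrative: Apple's actual NeuralHash is a learned, neural-network-derived hash, and the 8x8 scheme, sample images and threshold below are all invented for the example.

```python
# Illustrative sketch only: a toy "average hash" shows how a perceptual
# hash can match near-duplicate images (e.g. after re-encoding) without
# requiring bit-identical files.

def average_hash(pixels):
    """Hash an 8x8 grayscale image: 1 bit per pixel, set if the pixel
    is brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

# A toy 8x8 "image", a slightly brightened copy (as after re-encoding),
# and an unrelated image.
original = [[(r * 8 + c) % 256 for c in range(8)] for r in range(8)]
recompressed = [[min(255, p + 3) for p in row] for row in original]
unrelated = [[(255 - r * c * 4) % 256 for c in range(8)] for r in range(8)]

THRESHOLD = 5  # hypothetical "close enough" cut-off
d_copy = hamming(average_hash(original), average_hash(recompressed))
d_other = hamming(average_hash(original), average_hash(unrelated))
print(d_copy <= THRESHOLD, d_other <= THRESHOLD)  # → True False
```

              The modified copy still matches because its bit pattern relative to the mean is unchanged, while the unrelated image lands far away; real perceptual hashes apply the same idea to far richer features.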

              1. mistersaxon

                Re: It's simpler than that

                ...because let's face it, one very simple way to copy images is to take a photo of them. No USB sticks required, no bluetooth or wifi, heck, you wouldn't even use a phone necessarily. So having on-board scanning for known CSAM is a simple way to stop that happening on phones. I suspect it's an edge case but it will stop it.

                You'd like to think this means it makes it harder for users and distributors of CSAM to use their phones for those tasks and hence will drive down the volume of this material, but I suspect it will only drive up the desire for rooted phones while eating away at the privacy of regular people, for little to no benefit to them OR to law enforcement against CSA. It will, however, make a useful testbed for looking for other kinds of images or media.

      3. big_D Silver badge

        Re: It's simpler than that

        Simple possession of such images is very much a sex crime, at least it was in the UK when I was growing up, and it is here in Germany.

        Take Gary Glitter, he wasn't prosecuted for copyright infringement...

    4. Anonymous Coward
      Anonymous Coward

      Re: It's simpler than that

      To think MS AV scans are not reporting 'suspicious' files back to MS for purposes besides detecting malicious software is naïve.

      1. Dadz

        Re: It's simpler than that

        Indeed, MS reports software used for piracy back to MS.

        Windows Defender even automatically uploads suspicious files to Microsoft.

        Often, AV software uses heuristics to detect suspicious software and helpfully uploads it directly to your AV provider for analysis. If you scan all files, sometimes heuristic strings are found inside non-executables; some virus droppers hide a virus in an attached image using steganography.

        So Windows 10 systems already have all the code in place needed to scan for any kind of content on your PC and upload it automatically, and they already do.

        However, you can permanently disable this feature in Windows by setting a policy. (Otherwise it may turn back on automatically). See

        https://www.tenforums.com/tutorials/5918-how-turn-off-microsoft-defender-antivirus-windows-10-a.html

        You can access the services list on Windows, MacOS and (rooted) Android and stop nearly any service. After working with MacOS and manually installing drivers for network cards through the console, I was surprised to learn that iOS is as locked down as a children's educational tablet. So the real reason that people are angry about this new iOS "feature" is that they know they won't be able to turn it off or control it.

        1. hoola Silver badge

          Re: It's simpler than that

          Based on what you have written here, why just Microsoft?

          Any antivirus vendor has the same capability and in many ways is far better placed than Microsoft. Microsoft may be interested in DRM and piracy of its own software, but so much is activated online and reports back anyway that there is little need for it to look at local files.

          The "modern" cloud-based antivirus solutions such as CrowdStrike can do these sorts of things with even less impact, and in many ways I would trust them even less. So much of what they do is surrounded by mystique and snake oil that there is very little most users can do to figure out what is happening.

    5. gnasher729 Silver badge

      Re: It's simpler than that

      Well, they would most likely have a legal right not to allow you to upload illegal material onto _Apple's_ cloud service.

      1. DS999 Silver badge

        It is probably further complicated

        "not to allow you to upload illegal material onto _Apple's_ cloud service"

        Complicated by the fact that Apple outsources a large portion of iCloud's storage to third party cloud providers - and not just one. So your iCloud bits might end up on an Apple owned server, or an Amazon owned server, or a Microsoft owned server, or one owned by some Chinese company if you live in China, and maybe yet another provider if you live in the EU. It is all encrypted, so those providers don't know what Apple is storing there for its customers (though Chinese law requires Apple to give up the key to their Chinese provider), but if there are laws that require them to take measures to prevent the storage of CSAM on their cloud, they can't do it themselves - I'm guessing such laws exist, or the threat of them does, since the big cloud providers ALL check for CSAM already. They have to have Apple do it for them for iCloud files, since they don't have the key.

        The big cloud providers can easily check (for unencrypted files, or encrypted files for which the password is available to them, at least) since the files are local to them. It wouldn't be so easy for Apple to do it that way: the checks would have to run remotely and generate a lot of traffic between Apple servers and the third party cloud. So having it checked on the phone, as the final step before uploading to iCloud, is also the most efficient alternative to Apple using its own servers for such checks the way the big cloud providers do.

  2. po

    It's Apple treating its customers with contempt. That's pretty much their business model and, let's be honest, most of their customers love it, the dirty little beasts.

    1. oiseau Silver badge
      WTF?

      "It's Apple treating its customers with contempt."

      So what's new?

      It has been going on for decades.

      And (besides Apple's) just whose fault is it?

      Surely not the legions of dickheads with more money than common sense who actually think this is all so cool and convenient.

      Eh?

      Too much for a Monday ...

      O.

      1. This post has been deleted by its author

    2. Lord Elpuss Silver badge

      It's very much not their business model; that's why there's such a flap. Apple as a company is known to be paranoid to the nth degree (where n is a large number) about user privacy, which is THE major reason why I went with them as opposed to the competition which quite happily advertises itself as raping and pillaging your data at will in order to monetise you.

      I believe(d) Apple's vision on user privacy not because they told me, but because their current and previous CEOs (a) were both personally paranoid about privacy, and (b) had been on the sharp end of 'outing' in the media due to privacy breaches; so they had felt the pain.

      This whole CSAM initiative is completely out of character for Apple. It's the start of a VERY slippery slope, and a portent of a very dangerous future. I'm not ditching them yet (mostly because there's nowhere better to go), but it's the first time in 14 years that I've actually considered the thought.

      1. Anonymous Coward
        Anonymous Coward

        ??

        Apple may be paranoid about privacy, but it is their privacy they are paranoid about, not yours.

        Even with tracking turned off, Apple devices have been found 'phoning home' with your location, and sending back your contact data, your website history, photos that were not sent to the cloud, etc.

        Apple staff have breached privacy, copyright and abuse laws by sharing people's private intimate photos a number of times.

        Or when they inadvertently breach privacy by allowing someone to clone a celeb's account (and thus their phone contacts and photos) simply because they rang up and pretended to be that celeb.

        Just because they don't advertise that they are selling your data does not mean your data is safe with them. It also doesn't mean they are not 'selling' your data using more discreet methods, such as when someone buys advertising with Apple and asks to target specific types of people.

        If you want actual privacy get Linux, and set it up with reasonable security. Or install Windows and disable the phone-home components, or set up network monitoring to block any unauthorised connections and ports. There is a reason that Apple locked down the ability to switch those components off in the terminal.

        1. Lord Elpuss Silver badge

          Re: ??

          "Apple may be paranoid about privacy, but it is their privacy they are paranoid about, not yours."

          Apple IS paranoid about user privacy, but don't misunderstand me - it's not for altruistic reasons. Safeguarding user privacy is a major differentiator between Apple and Android, and is the reason Apple are able to charge higher prices. Take that away, and you're left with.... higher prices. That's a way to go out of business real fast.

          "Apple staff have breached privacy...."

          The only companies this doesn't happen to are companies with just 1 employee. And even then only if he's dead. You can't design for perfect, because humans will always be human. You can design to make something the very best it can be, and deal with edge cases appropriately when they arise. Which is exactly what Apple did, and does.

          "If you want actual privacy get Linux, and set it up with reasonable security. Or install windows and disable the phone home components or setup network monitoring to block any unauthorised connections and ports."

          If you think you can beat Apple's security with a homebrew setup, you're either one-in-a-million or very naive. And even if you are that one-in-a-million, the other 999,999 out there aren't. Which means it's not a sustainable business model.

          1. LDS Silver badge

            "is a major differentiator between Apple and Android"

            No, that's only the marketing. The real reason is Apple wants its user data fully under its control. Apple wants everybody else off its platform unless they bring money directly to Apple coffers.

            Anybody who can collect data on iOS/macOS can make money using it directly, without Apple getting any. Privacy for Apple means "our user data is Apple's only, and only Apple can use it to make money". They will still process that data and use it to make money, you will still get targeted advertising and user profiling; it's just that anybody on Apple platforms will need to pay Apple to reach their target audience. And Apple knows most of its users are in segments that appeal to sellers.

            It's not different from what Google does with Android, just that Google is still hesitant to close the platform completely because, ads being its main business, it would undergo antitrust scrutiny much faster than Apple, which could claim it doesn't hold a "dominant position" in that market. Still, Apple knows it can make some billions out of it - and it's strictly closing its platforms to competitors. Apple knows the mobe/pad replacement cycle will become longer soon, and knows it needs new revenue streams; anyway, some billions more are always some billions more.

            Then it just needs to turn the reality distortion field up to eleven and tell users they're not like Google....

            1. Lord Elpuss Silver badge

              Re: "is a major differentiator between Apple and Android"

              ” It's not different from what Google does with Android“

              It’s very different. Apple monetises the ecosystem, Google monetises the users. And the ecosystem, but that’s secondary.

        2. Anonymous Coward
          Anonymous Coward

          Re: If you want actual privacy get Linux,

          That massive wall of text just for: "Can't you wipe it and put a proper OS on there?"

          Linux on mobile is not a viable consumer product.

          But don't let that stop your tired predictable paranoid diatribe.....

          1. Pascal Monett Silver badge
            Thumb Down

            Come again ?

            "Linux on mobile is not a viable consumer product "

            What is Android based on again ?

            Oh right : Linux.

            1. Lord Elpuss Silver badge

              Re: Come again ?

              Android != Linux.

              It depresses me that this needs to be said here.

              1. hoola Silver badge

                Re: Come again ?

                And iOS, remind us of what the underlying system is?

                What Android or iOS are based on is irrelevant to this discussion. Both are in constant contact with the mothership and will become very homesick if you prevent that for long enough.

              2. Pascal Monett Silver badge

                Re: Android != Linux

                From the Wiki article I linked to :

                "Android is a mobile operating system based on a modified version of the Linux kernel and other open source software "

                I didn't say it was Linux, I said it is based on Linux.

                1. Lord Elpuss Silver badge

                  Re: Android != Linux

                  Your post was suggesting that because Android is somewhat based on Linux, it proves the point that Linux is a viable consumer product on mobile. That's like saying that because a Big Mac is somewhat based on a cow, it proves that it's a viable option to fertilise your land and keep your grass trimmed.

                  It doesn't work like that and you know it.

        3. Lord Elpuss Silver badge

          Re: ??

          "Even with tracking turned off, Apple devices have been found 'phoning home' with your location, and sending back your contact data, your website history, photos that were not sent to the cloud, etc."

          Pics or this didn't happen.

  3. elsergiovolador Silver badge

    Pear shaped

    Apple claimed they had no access to the source images. So if there is a match and their staff "investigates" a low-resolution version of the image, how are they going to know whether what they are looking at is actually what should be reported? Would the staff report it "just in case"? Nobody, likely on minimum wage, is going to risk being crucified by the media for having reviewed pictures and not reported them when something bad happened.

    Since they can't see the source images, as they claimed, then nothing stops China from inserting hashes of Winnie the Pooh.

    Now they are going to ID everyone in the photos. It's only a matter of time before cameras are "always-on".

    Is this a Gerald Ratner moment for Apple?

    1. devin3782

      Re: Pear shaped

      Alas, I fear not; the reality distortion field is in full effect. It'll take a whole load of false positives before they wake up.

    2. Anonymous Coward
      Anonymous Coward

      Blackberry moment

      As Snowden put it back in 2013:

      Snowden then moved on to Blackberry Limited, chastising the company for “following an AT&T model” and accusing it of providing access to its clients’ personal messages to the governments of Canada, the United States, and India. “Blackberry will be erased from the pages of history” for its operating procedures, he said. “The customer is not really their customer, the state is their customer.”

      Here Apple have attempted a dishonest walk back, but the problems still remain.

      AI is an imperfect algorithm with lots of false positives. Even if it was only 1% FP, then given the small number of actual target images among the huge number of personal and family photos, statistically ~100% of images reported will be false positives. Even if they only flag when 32 images match, it's still 32 x noise = noise. And of course that 32 will become 1 as soon as it's challenged.
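      The base-rate arithmetic behind that claim can be made concrete. The numbers in this sketch (a 1% false-positive rate, one genuine target image per million scanned) are invented for illustration; the point is only that when true matches are rare, almost every flag is a false alarm.

```python
# Back-of-envelope Bayes: if target images are rare and the test has a
# 1% false positive rate, almost every flagged image is a false positive.
# All numbers here are illustrative assumptions, not Apple's figures.

fp_rate = 0.01      # P(flag | innocent image)
tp_rate = 0.99      # P(flag | target image)
prevalence = 1e-6   # fraction of scanned images that are real targets

flagged_true = tp_rate * prevalence
flagged_false = fp_rate * (1 - prevalence)
share_false = flagged_false / (flagged_true + flagged_false)
print(f"{share_false:.4%} of flagged images are false positives")
# → 99.9901% of flagged images are false positives
```

      Improving the per-image false-positive rate shifts the ratio, but the rare-event structure of the problem stays the same, which is exactly why Apple proposed a multi-image threshold.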

      Showing your review staff only some photos makes the review less accurate; showing them more makes it more invasive. Showing them less detail makes it less accurate; showing them more makes it more invasive. There is no customer-friendly fix for false positives here: you are calling your customers pedos, then asking your staff to secretly spy on those customers to prove it!

      They've realized the issue with various rogue states demanding their image sets be used for political and draconian purposes. Saying "we will only use sets from 2 countries" ignores the reality of that situation. The cat is out of the bag. US and UK, Russia and Belarus, Iran and Pakistan, China and Hong Kong, Saudi Arabia and Afghanistan: these pairs might do a little double act to humour Apple, but I think they will just ignore Apple's condition and still make the demand anyway.

      Apple will comply; there is no legal basis for applying one country's limits to overrule another country's laws.

      It's all too late; once they have implemented it and announced it, they cannot unimplement it, they cannot walk it back.

      FBI got their Apple backdoor, just like RCMP got their blackberry master key.

      Too late to walk it back now.

      https://www.vice.com/en/article/kz9kaa/exclusive-canada-police-obtained-blackberrys-global-decryption-key-how

      1. Lord Elpuss Silver badge

        Re: Blackberry moment

        "even if it was 1% FP"

        It won't be 1%. Or even 0.1%. More likely 0.000000001%, in order to be considered anywhere near "acceptable"* for release.

        * "Acceptable" for Apple may mean 0.000......001% etc. For me, on an issue like this where the consequences of false positives are so severe, the only acceptable FP rate is 0%. No ifs or buts. And since that's not technically possible, they should not be doing it.

        1. Anonymous Coward
          Anonymous Coward

          Re: Blackberry moment

          "More likely 0.000000001%"

          To get that number so low, you'd have to use a huge training set of *distinct* other salt images on that AI.

          You'd have to train it on, say, 1 million CSAM images, plus 1e16 other salt images.

          The result would be an overtuned AI that would endlessly flag false positives and produce false negatives, because it would need feature sets so fine-grained as to be useless noise.

          And the size of the ML model would be untenable.

          So no way; it will be 80-20. Their 32x images suggests they're aiming for 1% of accounts flagged; they will make the mistake of thinking the flagging is independent, and assume it will produce 20%/32 = 0.6%.

          Let's not ask where that 80% of normal customer images comes from, shall we? Because the salt comes from somewhere that corresponds to their customer set!

          So, even if it was only 0.6% of accounts flagged as pedo, those will head off to be vetted by your Indian basement teleworkers. Indian teleworkers, because they're cheaper.

          That teleworker will have signed a contract with an agency that has a contract with Apple, and will pinky-swear not to abuse your data, and Apple trusts that worker and that agency, but not the customer.

          They could have flagged it to you on upload to their servers: "this image violates our terms of service....[appeal][skip]". They could also have used a hash that has essentially zero false positives. But they had to try to make it 'better'.
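          For contrast, the exact-hash alternative mentioned above really does have essentially zero false positives, but any byte-level change (recompression, cropping, a metadata edit) produces a completely different digest, which is why it cannot find near-duplicate copies. A minimal sketch, with invented stand-in 'image' bytes:

```python
# Sketch: cryptographic (exact) hashing never falsely matches distinct
# files, but even a one-byte change defeats it entirely.
import hashlib

image = b"\x89PNG...stand-in bytes for an image..."
recompressed = image + b"\x00"  # simulate any byte-level change

h1 = hashlib.sha256(image).hexdigest()
h2 = hashlib.sha256(recompressed).hexdigest()
print(h1 == h2)  # → False: no match, despite near-identical content
```

          That brittleness against trivial edits is the stated reason systems like this use perceptual rather than cryptographic hashes, at the cost of a non-zero false-positive rate.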

          Dumb. Now they've opened the door to device scanning, false positives and AI models, the outcome is just inevitable.

          In the original press release it talked about text messages too. I see that's fallen off the press release like they're hoping nobody remembers that.

          1. TiredNConfused80

            Re: Blackberry moment

            I think (and I may well be wrong on this) that they are not looking at a photo and making a decision on whether it is child sex abuse or not. What they are doing is generating a signature of the photo in a standardised way and then comparing that signature against signatures already made (in the same industry-standard way) of known sex abuse images. If the signatures match, then that should mean it is the same photo. I don't *think* this is designed to catch new images, just to flag up the presence of old, already known and flagged ones.

    3. Tessier-Ashpool

      Re: Pear shaped

      I can’t be arsed to read all their blurb again in detail. But I seem to recall that the review staff are said to be just there to confirm the accuracy of the reporting process itself. Reading between the lines, I read that as: if we suddenly get a massive uptick of autosnitching, someone with eyes will be there to hit the kill switch.

  4. Zenubi

    - "Apple is happy talking to us about how good its camera app is, we can't tell. And we shouldn't trust it – or anyone in this game – until we do."

    And not even then !

  5. NightFox

    A critical mistake in the third paragraph: "...will scan all those you have sent and will send to *iPhotos*" should be "iCloud Photo Library".

    iPhoto was replaced by Photos many years ago, but more significantly that is/was just a local app, it's only if you're using the iCloud-based service that Apple will scan those photos AFAIK.

    1. Electronics'R'Us Silver badge
      Megaphone

      Capability

      "it's only if you're using the iCloud-based service that Apple will scan those photos AFAIK."

      That is how it has been presented; the problem is that the capability to do far more exists simply because it is a local app that (as far as I can tell) the user has no control over.

      The discussion around privacy and capability needs to take place because this capability is, for some three letter agencies and their equivalents around the world, a wet dream come true.

      1. NightFox

        Re: Capability

        A lot of people seem to see this as the emergence of a new capability that could now be open to abuse. However, that capability has existed for years already; it's just that Apple are now talking about implementing it. Its potential for abuse is no greater than it was previously.

        I'm not saying there's nothing to be worried about here - just that the genie's already out of the bottle, and whichever way Apple goes with CSAM scanning doesn't really change that. Concerns that an oppressive regime or whoever could persuade Apple to scan content for something else were just as valid 5 years ago - all this does is serve as a reminder of what technology is capable of.

        1. ThatOne Silver badge

          Re: Capability

          > that capability has existed for years already

          So what? That doesn't excuse implementing it. The capability to shoot anyone you don't like has existed since the invention of gunpowder, yet this is still frowned upon...

          My point is that "because we can" isn't an excuse. The question to ask oneself at any moment is "should we?".

          1. NightFox

            Re: Capability

            My point wasn't that Apple should or shouldn't be doing that. It was more that people are seeing this could be a step to something more sinister, missing the point that the 'something more sinister' never required this step to happen first.

            1. ThatOne Silver badge
              Happy

              Re: Capability

              > My point wasn't that Apple should or shouldn't be doing that

              Yes, that was mine... Sorry if my post sounded like I was trying to invalidate your statement; I wasn't, your point is indeed valid. I was just using it to jump to what I think is very important: that some things should remain undone. We don't need more vectors for surveillance.

            2. doublelayer Silver badge

              Re: Capability

              "missing the point that the 'something more sinister' never required this step to happen first."

              It did, though. They could have implemented it at any time, but for the sinister consequences to follow, they needed to actually do so. Before, they merely had the option to take that required step and then become sinister. Now they have taken the step. Yes, they could have implemented this in secret years ago, but the fact remains that they did not.

        2. elsergiovolador Silver badge

          Re: Capability

          You are missing the point. The issue is not the capability, but doing it without a warrant and at scale. This is where they crossed the line.

      2. mbdrake

        Re: Capability

        As more devices gain dedicated machine learning hardware, what's to say that these kinds of techniques aren't going to be present in other brands' devices?

        1. big_D Silver badge

          Re: Capability

          It isn't just ML - you can do this without it, it just takes more processing power and is harder with manipulated images. The original CSAM database has been in use for around a decade now, but restricted to online services until now.

          If governments can point to Apple doing CSAM checking (with or without the additional ML bit to improve detection of manipulated images), it makes it easier to force it on other manufacturers as well.

          Maybe the year of Linux isn't all that far away, after all...

          1. ThatOne Silver badge

            Re: Capability

            Don't focus on CSAM, that's literally just the old "somebody think of the children" foot in the door.

            Much like the article's author, I don't think they will catch a single paedo with it, but by using that consensus, difficult-to-question subject they will have advertised that they can do silent surveillance. Now governments worldwide will apply with their own, more realistic, use cases.

            1. TRT Silver badge

              Re: Capability

              Possibly, possibly. But when various governments around the world start playing the 'Think of the Children' card as part of their electioneering, and then, once in power, kick those lobbying TechGiants right in the 'nads by NOT accepting their claims of technical infeasibility in ensuring that they are not a vehicle for the distribution of whatever kind of immorality is the flavour of those four years... Apple can say "We're doing everything feasible and reasonable to ensure that the spread of kiddie pr0n / donkey pr0n etc is not something we are facilitating"... and STILL be able to say their messaging is e2e encrypted, backups are securely stored etc etc...

              They effectively dodge the feds' thumbscrews on encryption etc by putting an AI-fed agent into every device on the inside of the user encryption wall.

              I mean, they're getting nail gunned by the press now, BUT (and this is where I'm probably being somewhat naive and too optimistic), if they can present a mechanism that satisfies the feds and gets them to back down from pressuring the law makers into weakening encryption... then that's a win, no?

              1. doublelayer Silver badge

                Re: Capability

                "BUT (and this is where I'm probably being somewhat naive and too optimistic), if they can present a mechanism that satisfies the feds and gets them to back down from pressuring the law makers into weakening encryption... then that's a win, no?"

                I'm afraid your adjectives are quite correct in this case. If they put a spy on the endpoint, then the debate over encryption could get dropped. But that's because they won and we lost. If they can force all users of encryption to turn over the cleartext so they don't ever have to decrypt, then the result is the same: repressive countries do whatever they want, criminals have an attack vector to get the data, all of the reasons we want encryption are neatly circumvented. True, that wouldn't necessarily apply to everything, and a few people who already know why and how to encrypt could use open source software to do so, but if that ends up happening, they just start the encryption legislation again. If we lose 95% of our goal, that's a pretty clear loss already, especially as nothing in it prevents them from later taking the remainder.

                1. TRT Silver badge

                  Re: Capability

                  "that's because they won and we lost. "

                  Oh yeah. I knew there had to be a catch somewhere.

                2. Charles 9 Silver badge

                  Re: Capability

                  IOW, they've figured out the best way to beat encryption: attack the human interface, the one point "outside the envelope" where the juicy stuff MUST be decrypted.

                  1. hoola Silver badge

                    Re: Capability

                    And that is the crux of all these services. At the point a human needs to interface with the data (images, words or sounds) it has to be in a human-usable format, and that means decrypted. Just like all these video conferencing tools that have to support telephone dial-in: at the point the data stream needs to talk to the telephone system it has to be decrypted, there is no other way. Now it may be encrypted again if IP telephony or mobiles are being used, but at some point it has to be in the clear.

        2. ThatOne Silver badge
          Big Brother

          Re: Capability

          They obviously will, you just won't necessarily know about it.

          And this will certainly spread to all kinds of consumer devices, because there are more and more governments out there who want to make sure you don't watch/do/say forbidden things. With the worldwide rise of fundamentalisms both religious and political, there is big money to be made creating all those little personal snitches the victims will pay for. Many companies will go "if we don't do it somebody else will" and soon your own computer and TV will check if you're an obedient, god-fearing model citizen, or a perverted criminal to be removed.

          1. Someone Else Silver badge

            Re: Capability

            And this will certainly spread to all kinds of consumer devices, because there are more and more governments out there who want to make sure you don't watch/do/say forbidden things.

            Like ours.

            And it doesn't matter what country you're reading this post from, it's still ours.

  6. Chris G Silver badge

    "It's on device, therefore it's private.

    Roll up! Roll up!

    Getcher bridges, NFTs and sausages onna stick right here!

    Everything is guaranteed.....

    1. Anonymous Coward
      Anonymous Coward

      Re: "It's on device, therefore it's private.

      The only thing private here is the CSAM hashset, where is it?

      It's a set of hash numbers, it is *not* child abuse images: a number like 0x11827392674652927 that represents some hash function of an image. So there's no reason not to put it out for review.

      With the hash set, we could hash images and check them against the database to see if they're in the database, to identify false positives in that set and make the set better.

      e.g. if a hash matches a tiny tove image, then it's not child abuse. If it matches a medical image it's not abuse; if it matches a clothed image it's not sexual; if it matches a Sears catalogue, it's not abuse.
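      As a rough illustration of the audit being proposed, here's a hypothetical sketch. Ordinary SHA-256 stands in for the real perceptual hash, and every byte string and hash value below is invented:

```python
# Hypothetical sketch of auditing a published hash set: hash a
# known-innocent image and see whether it wrongly appears in the set.
# SHA-256 is a stand-in for the real perceptual hash; the byte
# strings are made-up placeholders, not real data.
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for the database's image hash function."""
    return hashlib.sha256(image_bytes).hexdigest()

# Pretend this is the published hash set (invented contents).
published_hashset = {image_hash(b"some-listed-image-bytes")}

def is_flagged(image_bytes: bytes) -> bool:
    """Would this image match the database?"""
    return image_hash(image_bytes) in published_hashset

print(is_flagged(b"innocent-medical-image"))   # False: not in the set
print(is_flagged(b"some-listed-image-bytes"))  # True: a match that could be reviewed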

      I'm guessing the set comes with a contract, and the contract stops you publishing its false positives. Is that correct? But why would you do that?

      Given the expansive way it's been compiled, without antagonistic checks, it will be full of garbage by now. They'll have shoved everything in there unopposed - "for the children" is practically a super-power to them - and they'll shove it in knowing it will never be challenged or reviewed.

      Apple should also release their AI model, and we can datamine images that get flagged by Apple's AI among our family photos and be warned of what they're doing there.

      1. Anonymous Coward
        Anonymous Coward

        Re: "It's on device, therefore it's private.

        Well, there is no way you could check the hashes without breaking the law (e.g. if you Googled them or searched them on peer-to-peer), as you'd then be in possession of the material. Unless of course you hashed legally held material and looked for collisions, I guess, but I suspect this might be a fool's errand.

        Furthermore, Apple is using a non-standard algorithm to hash the images, very similar to Microsoft's PhotoDNA technology by the sounds of things: essentially measuring edges and boundaries of objects to create a 'fingerprint'. The closer the fingerprints, the more similar the images. Effectively the only people who can equate the hash to the actual image are ProjectVIC/NCMEC, who have the database of images and the hash list. The reason Apple will have to encrypt it is that a hash of an image is considered personally identifiable information (in the UK at least). One assumes Apple will make reports to NCMEC, who will check the original image before dissemination to law enforcement.

        Remember, this process of scanning material and comparing it against known hashes is standard industry-wide. The bit which appears new is the on-device scanning of one's iCloud 'outbox'. The truth is, though, this isn't new; most apps hash a file on the device before uploading it. Apple cannot hash in the cloud as the material is encrypted by the time it gets to them. The fact is, Apple is hosting vast quantities of illegal material which it knows nothing about. Paedophiles using Apple's iCloud sharing functionality is unfortunately common; you can understand why Apple might be uncomfortable with this.

        As someone who works in this field, I can assure you that the database Apple have sourced the data from is about as good as it can get. It is routinely used to prosecute people to a standard of 'beyond all reasonable doubt' in the UK and abroad. However it's not perfect and does contain 'subjective' and erroneously graded material; when it comes to prosecution, humans still check. To make arrests or get search warrants, law enforcement need to have solid evidence. This is why Apple have set a threshold - you have to have numerous matching images before you're even flagged. The margin cases should be filtered out. It will still be like shooting fish in a barrel though.

        The database itself is actually compiled by a series of 'voting'. One person cannot decide if material to be stored in the database is illegal or not - several law enforcement or NGO personnel have to cast votes before material is confirmed.

        Apple haven't said which categories of CSAM will be flagged or what the flagging threshold is so it'll be interesting to see when the results start coming through to us.

        You've made an error regarding clothed images. In the UK at least, certain clothed images are capable of being deemed as CSAM.
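        The effect of that match threshold can be sketched with toy numbers. Note the per-image false-match rate and the threshold below are invented for illustration; Apple has not published its actual parameters:

```python
import math

def p_at_least(n_images: int, per_image_fp: float, threshold: int) -> float:
    """Probability that at least `threshold` of `n_images` innocent
    images each independently false-match (binomial tail, computed
    via the complement so the sum stays short)."""
    below = sum(
        math.comb(n_images, k)
        * per_image_fp**k
        * (1 - per_image_fp)**(n_images - k)
        for k in range(threshold)
    )
    return 1 - below

# One-in-a-million false matches per image, over a 10,000-photo library:
single = p_at_least(10_000, 1e-6, 1)  # roughly a 1% chance of a single false match
multi = p_at_least(10_000, 1e-6, 5)   # vanishingly small with a threshold of 5

print(single, multi)
```

        A single-match trigger would snare an appreciable number of innocent users at scale; requiring several independent matches drives the false-report rate towards zero, which is presumably the "margin cases filtered out" effect described above.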

      2. TRT Silver badge

        Re: "It's on device, therefore it's private.

        Presumably because one could reverse engineer a nonce image which could then be used to flood the system and render it useless. Unless there's some kind of double hash thing going on. I don't know. But there are enough 'privacy warriors' around who would join in I think to defeat this.

  7. Omnipresent

    What has apple NOT seen?

    Ever thought about how apple has seen most of the world have sex? LOL. You put your phone down next to the bed when you do. This world is TERRIFYING.

    1. elsergiovolador Silver badge

      Re: What has apple NOT seen?

      Just wait when they develop AI consent detection as a justification for always-on microphone.

      1. TRT Silver badge

        Re: What has apple NOT seen?

        Now THAT'S a patentable idea.

    2. big_D Silver badge
      Black Helicopters

      Re: What has apple NOT seen?

      No phones in the bedroom here...

      1. elsergiovolador Silver badge

        Re: What has apple NOT seen?

        Wait until it's mandatory to have a phone with you at all times, like a tag. When the phone detects being left alone, it will automatically call your handler...

        1. Someone Else Silver badge

          Re: What has apple NOT seen?

          Wait until it's mandatory to have a phone with you at all times, like a tag.

          George O. would have never, even in his wildest dystopian nightmare, ever thought that the proles would be convinced to spend their own hard-earned money to purchase their own telescreens.

          1. LDS Silver badge

            Re: What has apple NOT seen?

            Actually the proles didn't need telescreens - they were kept under control in a different way. Internet porn, social media and their fake celebrities are actually the closest thing to "1984"'s ways of controlling the proles, and there were whole Party departments to create such material.

            But of course there are those "dangerous people" who can't be controlled that way, whom you have to track and frame. Orwell was a socialist, so he thought telescreens should be State-mandated, as the USSR would have done. Authoritarian capitalism instead ensures you will buy them yourself - after all, business is business...

            1. ThatOne Silver badge
              Devil

              Re: What has apple NOT seen?

              State-mandated but paid by the user is the way to go, combining the advantages of both: Making sure everybody got one, and also making a nice benefit selling them.

          2. Zippy´s Sausage Factory
            Meh

            Re: What has apple NOT seen?

            I'm sure I've read somewhere that he decided to keep as close to the present as possible, in order to make it more believable to the reader at the time. Hence the telescreens were manned by people at the other end, there was no science fiction other than telescreens having built in cameras. I'm afraid it's a long time ago though so I can't swear as to where or even if my memory isn't playing tricks on me.

          3. doublelayer Silver badge

            Re: What has apple NOT seen?

            "George O. would have never, even in his wildest dystopian nightmare, ever thought that the proles would be convinced to spend their own hard-earned money to purchase their own telescreens."

            No, he did. In one scene, a person says that no telescreen was installed because it didn't seem worth the expense. Now it turns out that guy was lying, but it does imply that the telescreens were purchased.

      2. TRT Silver badge

        Re: What has apple NOT seen?

        Safe sex nowadays seems to mean videotaping the whole shebang so that one can later rely on it in court.

  8. Eclectic Man Silver badge

    Lovers or dolphins?

    https://www.reddit.com/r/illusionporn/comments/1qtpyl/what_do_you_see_in_the_picture_two_lovers_most/

    What would Apple's AI make of that image, I wonder?

    OK so nothing to do with child abuse, but there are many images which could be interpreted in several ways, and I wonder how Apple's AI would cope, and how 'sympathetic' the forces of law and order might be on receiving an alert?

    1. Anonymous Coward
      Anonymous Coward

      Re: Lovers or dolphins?

      You give AI more credit than is due.

      This is how AI, that was trained on dogs, sees a supermarket:

      https://www.youtube.com/watch?v=DgPaCWJL7XI

      There are no dogs in that supermarket, all the dogs it sees are false positives.

      It sees what it's trained to see. How many false positives their AI will generate depends on the training set it was trained on. My guess is they used 80-20 - 80% customer photos to 20% their child porn training set - which means it expects to see 20% of images as child porn.

      Which would result in a massive amount of false positives.

      1. Yet Another Anonymous coward Silver badge

        Re: Lovers or dolphins?

        Don't worry, this system hasn't been trained on any child porn images. You think the police/feds/TLAs care about child porn?

        It is however excellent at detecting photos of police (US edition) or Arabic writing (Euro edition) or HK flags (Chinese edition)

      2. Hero Protagonist
        FAIL

        Re: Lovers or dolphins?

        “This is how AI, that was trained on dogs, sees a supermarket”

        You have completely misconstrued what that video is. In the comments, the creator says he used DeepDream to create it. The Wikipedia entry for DeepDream states:

        “The software is designed to detect faces and other patterns in images, with the aim of automatically classifying images. However, once trained, the network can also be run in reverse, being asked to adjust the original image slightly so that a given output neuron (e.g. the one for faces or certain animals) yields a higher confidence score. This can be used for visualizations to understand the emergent structure of the neural network better, and is the basis for the DeepDream concept….After enough reiterations, even imagery initially devoid of the sought features will be adjusted enough that a form of pareidolia results, by which psychedelic and surreal images are generated algorithmically.”

        So it’s not how the AI “sees” the original image; all those “false positives” are intentionally generated by iteratively modifying the source image. It’s more like a video effect of “make everything in this image look like a dog.”

    2. big_D Silver badge

      Re: Lovers or dolphins?

      Nothing, as it isn't in the CSAM database...

    3. Someone Else Silver badge

      Re: Lovers or dolphins?

      Yeah, and what color would Apple think that dress is?

    4. Anonymous Coward
      Anonymous Coward

      Re: Lovers or dolphins?

      Just to be clear - the AI in this case ('neural hashing') is about comparing images stored on devices with near-identical *known* abuse images; it is not about 'interpreting' new images to assess whether they are CSAM. It's a slight step above binary hashing (MD5) as it allows images changed by compression or aspect ratio to still match.

      They are using a different kind of AI to identify dick pics/nudes on kids phones but that is different to the CSAM detection system.
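      To illustrate the difference from MD5-style matching, here is a toy average-hash sketch - a deliberately simplified stand-in, not Apple's NeuralHash algorithm, with invented 4x4 'images':

```python
# Toy perceptual hash ("aHash"-style), illustrating why a slightly
# re-compressed image can still match while an MD5 of the bytes would not.
# This is NOT Apple's NeuralHash; it is a simplified stand-in.

def ahash(pixels):
    """Hash a greyscale image (list of rows of 0-255 ints): each bit
    records whether a pixel is above the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

def near_match(h1, h2, max_distance=2):
    """Perceptual hashes 'match' if they are close, not identical."""
    return hamming(h1, h2) <= max_distance

original = [[10, 10, 200, 200],
            [10, 10, 200, 200],
            [200, 200, 10, 10],
            [200, 200, 10, 10]]

# Same picture after lossy re-compression: pixel values drift slightly.
recompressed = [[12, 9, 198, 203],
                [11, 13, 201, 197],
                [199, 202, 8, 12],
                [203, 196, 11, 9]]

# A genuinely different picture.
other = [[200, 10, 200, 10],
         [10, 200, 10, 200],
         [200, 10, 200, 10],
         [10, 200, 10, 200]]

print(near_match(ahash(original), ahash(recompressed)))  # True
print(near_match(ahash(original), ahash(other)))         # False
```

      A byte-level hash like MD5 would change completely after re-compression; a fingerprint built from the image's structure survives it, which is the whole point of this class of algorithm.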

      As to law enforcement receiving an alert, it's not quite how you imagine. Apple compile a CyberTip report to NCMEC (a charity in the US). The report contains enough information to prove an offence has occurred (e.g. an audit log of the detection, copy of the hash, perhaps a copy of the image itself) and information to identify the user e.g. usernames, IP addresses, emails etc. NCMEC assess the report and confirm the image is illegal. Only then is it disseminated to the law enforcement agency where they believe the user lives. The agency then check the images and information themselves... Only then does it go to an officer, who guess what? Checks...

      Often the tip offs are just that - enough to get the cops through the door. If someone is a paedophile, examination of their devices will quickly provide ample evidence of possession and/or distribution of CSAM.

      1. BloggsyMaloan

        Re: Lovers or dolphins?

        >Just to be clear - the AI in this case... is about comparing images stored on devices

        Just to be clear, the principle in this case is about Apple analysing an iPhone's content, which most users assume to be private, without explicit, informed consent, thus setting the scene for further abuse of privacy.

        1. Anonymous Coward
          Anonymous Coward

          Re: Lovers or dolphins?

          To imply Apple are analysing all of the images on your phone is disingenuous. It is analysing the images you intend to upload to their servers. If you don’t upload them, they don’t get scanned

          1. BloggsyMaloan

            Re: Lovers or dolphins?

            >To imply [sic] Apple are analysing all of the images on your phone is

            >disingenuous. It is analysing the images you intend to upload to their

            >servers. If you don’t upload them, they don’t get scanned

            From the same announcement...

            "The Messages app will add new tools to warn children and their parents when receiving or sending sexually explicit photos."

            https://www.apple.com/child-safety/

            Such images would appear unlikely to be intended for uploading to Apple's servers but, rather, for (what was once thought of as) private use. It might be surmised that such images might be identified by scanning all sent or received images, i.e. a different, additional, type (*) of scanning from that involving matching hashes against a database.

            (*) Which suggests that Apple is scanning images originating from non-Apple users, as well as those from its own disciples.

            Who owns the phone? Who decides what software it runs?

            1. Anonymous Coward
              Anonymous Coward

              Re: Lovers or dolphins?

              Ah, that’s a different technology and solution Apple has confusingly launched at the same time. My comments refer to their known (previously identified) CSAM iCloud upload detection solution. The one that informs the authorities if you attempt to upload too many known CSAM images to your iCloud account.

              My understanding is that the AI for detecting dick pics and nudes is for child accounts and does not report back to Apple, but I agree that for it to work it must be scanning everything unless you disable iMessages and other supported apps.

  9. mark l 2 Silver badge

    Since this CSAM scanning only applies (at present) to the US, what happens if someone buys an iPhone in the UK/EU and then uses it in the US? Will the scanning not take place, or will any iPhone, as soon as it connects to a US network, download the updated version of the Photos app?

  10. Andre Carneiro

    I only wish there was a custom ROM for mobile devices built with actual, proper privacy in mind.

    I’ve had iPhones for a long time but this one is just too much.

    1. Anonymous Coward
      Anonymous Coward

      "I’ve had iPhones for a long time but this one is just too much."

      wait for Apple's next possible advertising pitches:

      "You didn't buy an iPhone? Are you a pedo?"

      "Prove you're not a pedo, buy an iPhone!"

      "Android users are most likely pedos. Be vigilant, report them, and buy yourself an iPhone to show you're not one!"

      1. Anonymous Coward
        Anonymous Coward

        I, for one, welcome our Fruity Overlord Protectors...

        Do you know how frickin' hard it is to get decent dark room chemistry at a reasonable price now that everyone and their mother has gone digital?

        It'll be great once there's a rising demand for this stuff. The pr0n merchants always set the pace.

    2. Graham 32

      There are a few. And those I've seen recommend, without irony, you buy a Pixel phone to install it on.

  11. LDS Silver badge

    "if Apple were to put out [...] its capabilities, intentions and safeguards

    No matter how the technology and the safeguards inside Apple work, this is a violation of basic rights, and no private entity can have such powers, because they can evidently be easily abused. We've seen states abusing such powers already; imagine what a private entity accountable to no one but its shareholders could do.

    BTW, will all devices fall under this control, or will there be a group of people "more equal than others" that is exempt? I mean, politicians, Apple executives, their friends... would Epstein have had the same iPhone as the plebs?

    1. Anonymous Coward
      Anonymous Coward

      Re: "if Apple were to put out [...] its capabilities, intentions and safeguards

      Is it that unreasonable that Apple don't wish to host CSAM on their servers? If you don't use their cloud services, your material isn't scanned.

      1. LDS Silver badge

        "If you don't use their cloud services, you material isn't scanned"

        Now - maybe. What ensures it won't be soon? Apple's promises? From a private entity whose mission is to make shareholders happier, and whose management is accountable only to them?

        What independent authority checks Apple is not abusing the system, for example to obtain other kind of information? Who ensures all users are treated equally - and evidences are not destroyed or used to blackmail someone?

        There are sound reasons why such powers are reserved to States and their law enforcement agencies under judicial oversight, and must never be assigned to private companies.

        Eisenhower warned against the "Military–industrial complex" - now it's time to think about the power some IT companies achieved, and his warning is valid for this new complex.

        1. Anonymous Coward
          Anonymous Coward

          Re: "If you don't use their cloud services, you material isn't scanned"

          I guess for the same reason they don’t simply switch the mic on 24/7 and stream the audio to themselves… They would get caught doing it eventually and would lose their business.

          We trust that companies aren't doing bad stuff to us when they don't say anything, even though they might be. The moment one comes out very publicly and discloses a very limited amount of on-device processing, which numerous other vendors are already doing, we suddenly distrust them?

          My take on this is Apple are simply the nightclub bouncer… you can do what you like on your phone but the minute you try to bring stuff through our doors, we’re going to enforce a condition of entry, in this case checking you’re not uploading CSAM to us.

  12. xyz123

    Apple made a promise not to obey GOVERNMENT requests for data.

    Internally they created their own NGO/shill company and can freely pass data to whichever government pays them. The UK and other governments are considering dropping iPhones as government phones, which can hold passwords and very high-security information.

    This way the NGO gives data to the government, Apple has "technically" kept its promise, but gives over whatever the government wants.

    They also admit internally that child porn is step 1. next will be ALL images, then text documents, emails, logins and passwords, keylogging and office documents etc. Tim Cook calls it "the slow push"...basically introduce "think of the kids" option to get people used to the idea, then secretly and slowly expand the reach.

  13. Nifty Silver badge

    "Cupertino can see things you people wouldn't believe"

    If Apple phones could talk... Oh, they can.

    "I've seen things... seen things you little people wouldn't believe. Attack ships on fire off the shoulder of Orion bright as magnesium... I rode on the back decks of a blinker and watched C-beams glitter in the dark near the Tannhäuser Gate. All those moments... they'll be gone."

    (From the original script, before Hauer's rewrite)

  14. Tron Bronze badge

    All your base are belong to Apple.

    At this rate the Chinese government will order all of their citizens to own an iPhone.

  15. Cat Empire

    Canaries & Coal Mines

    It's interesting that no one has considered this may be Apple's way of releasing the canary without being charged behind closed doors in a FISA court.

    If you were required to implement "something" to provide the backdoor demanded by the enemy that is the state, and to remain silent as to the real motivation or face retribution/consequences, what would you do? The most obvious move is to use their own argument, "Think of the children", to justify it.

    Apple's favourite testing ground for new technology adoption is Australia, why only the USA for this one?

    This action has the distinct aroma of the gutter slime found in the Land of No Secrets Anymore where even the shadows lie.

    Sorry, it doesn't pass the stench test for me.

  16. FlippingGerman

    iPhone

    I was leaning very heavily towards replacing my current phone with an iPhone. I'm now leaning so far the other way that I have a concussion.

    This all makes me very sad. Just when I was starting to trust that actually, maybe Tim Apple really cared, bam, only joking, we're gonna check through your private stuff to make sure you're not a paedo.

    1. BloggsyMaloan

      Re: iPhone

      >Just when I was starting to trust that actually, maybe Tim Apple really cared

      Profit is Good. Capitalism is God.

      I name you as a True Believer, a born-again consumer of Apple Pie and claim my prize.

    2. jtaylor Bronze badge

      Re: iPhone

      There are better reasons to buy an iPhone. There are better reasons to not buy an iPhone.

  17. Anonymous Coward
    Anonymous Coward

    Side benefit to Apple: Android users are...

    I wonder if Apple realizes (or planned this) that if they got people to accept this, they'd eventually be able to smear Android users as "they've got something to hide", or just "Android users are pedophiles", and it's too fragmented for Android to follow suit. More market share for Apple.

  18. Anonymous Coward
    Anonymous Coward

    China crisis

    How will this work in places like China, Saudi and Russia, if they know an Apple phone can detect undesirable images? Will they decide on different criteria for undesirable images?

    1. Charles 9 Silver badge

      Re: China crisis

      I suspect this is the reason Apple is implementing it. Especially in China, where they could lose access to a huge and growing market (and cheap manufacturing) if they don't.

  19. BloggsyMaloan

    Privacy

    You're thinking it wrong.

  20. gnasher729 Silver badge

    What Apple should do

    First some principles: Apple should be allowed to not let you upload illegal images onto their cloud servers. And Apple should be allowed to help you keep images that you don't want to see and that could get you into legal trouble away from your phone. So here's my suggestion:

    1. When your phone tries to upload an image to Apple's cloud servers, Apple should be allowed to reject it without telling anyone about it - except probably marking the image on your phone so you know it is not uploaded and therefore not backed up. That's completely legit if it filters out illegal material. There should probably be a button that lets you say "no, this is not illegal, have a human look at it". Which you don't have to press, obviously.

    2. Everything else is opt-in with no way for anyone to check if you are opted in. Something in settings, where you can opt-in or opt-out without anything being displayed. If you can't remember whether you opted in or not, you opt-in or opt-out again.

    3. If you are opted in, images that you attempt to download might be rejected. Apple makes sure that legally you didn't download the image, and that legally you didn't attempt to download the image. Probably a status 403 (Forbidden - not 401, which means unauthenticated). And Apple doesn't tell anybody about it. So if you go on a regular legal porn site and somehow the wrong kind of images got mixed in, you are not in any legal danger.

    I think that would be quite an acceptable solution.
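    [Ed: the opt-in download filter proposed above can be sketched in a few lines. This is purely illustrative - the function name, the plain-set "blocklist" and the integer hashes are assumptions for the sketch, not anything Apple has described.]

    ```python
    # Sketch of the proposed opt-in download filter.
    # 403 ("Forbidden") fits a request that is understood but refused;
    # 401 means "not authenticated", which is a different situation.
    FORBIDDEN = 403
    OK = 200

    def filter_download(opted_in: bool, image_hash: int, blocklist: set) -> int:
        """Return an HTTP-style status for a requested image download."""
        if opted_in and image_hash in blocklist:
            return FORBIDDEN   # rejected quietly; nobody is notified
        return OK              # download proceeds normally

    print(filter_download(True, 0xBAD, {0xBAD}))   # 403 - opted in, matched
    print(filter_download(False, 0xBAD, {0xBAD}))  # 200 - not opted in
    ```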

  21. drankinatty

    Don't know what all the hubbub is about, my iPhone 4 still works fine, iCloud what?

    Chuckling, not that I have anything I would care about others seeing on my phone (pictures of pin headers soldered to MPU9250 chips are pretty boring), but sometimes it is comforting not to have new tech. Nothing new runs on the old iPhone (it still works fine as a telephone, the calendar interfaces with my groupware backend, mail works, etc.), but since it has nothing to do with iCloud, there is no chance any on-device or off-device scanning will be taking place.

    On the serious side, and despite the potential back-dooring of end-to-end encryption as a side-effect: if you don't control where your data resides, you cannot expect that other eyes (or AIs) won't be scanning it for some purpose, whether for targeted advertising or for sending to the Feds. How many times do we have to hear about the misuse of our information before we wise up and prevent it from getting out of our control in the first place?

    There are still open questions on the legal side about what constitutes reasonable e-search and seizure, and about attorney-client privilege being vitiated by disclosure (just for having your data hosted elsewhere). There is medical record privacy on the medical side, and despite all the assurances of absolute protection and privacy, countless patient records have been divulged.

    Why should we believe the assurances this time? What could possibly go wrong with a new layer of AI added to cloud-based services that tells you up front it will be mining your data? Sure, I trust them when they say there will be no abuse or misuse - NOT...

    1. gnasher729 Silver badge

      Re: Don't know what all the hubbub is about, my iPhone 4 still works fine, iCloud what?

      Just saying: There's no backdoor. Everything is done through the front door. Your phone obviously needs to be able to access your photos, or you couldn't see them. If you upload photos to Apple's cloud server, then normally the photos would be encrypted and then sent. What's planned is that the photos are checked, and then it's encrypted and sent and Apple remembers that there's a naughty photo. No backdoor for Apple, but front door. No access for anyone else.
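      [Ed: the check-then-encrypt-then-upload flow described above can be sketched as follows. Apple's real system uses NeuralHash and private set intersection; here a toy "average hash" and a plain set of hashes stand in for both - every name in the sketch is an assumption, not Apple's API.]

      ```python
      # Illustrative client-side check before upload. A toy perceptual
      # hash stands in for NeuralHash; a plain set stands in for the
      # (cryptographically blinded) CSAM hash database.

      def average_hash(pixels):
          """Toy perceptual hash: one bit per pixel, thresholded
          against the mean. `pixels` is a flat list of greyscale
          values (0-255)."""
          mean = sum(pixels) / len(pixels)
          bits = 0
          for p in pixels:
              bits = (bits << 1) | (1 if p >= mean else 0)
          return bits

      def check_before_upload(pixels, blocklist):
          """Hash the photo *before* it is encrypted and uploaded.
          The upload proceeds either way; the match result is what
          would be remembered server-side in the scheme described
          above."""
          h = average_hash(pixels)
          return {"hash": h, "matched": h in blocklist}

      # Usage: a flat grey image vs. a half-dark one on the blocklist.
      flat = [128] * 64
      half = [0] * 32 + [255] * 32
      blocklist = {average_hash(half)}
      print(check_before_upload(flat, blocklist)["matched"])  # False
      print(check_before_upload(half, blocklist)["matched"])  # True
      ```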

  22. Brandino

    Ok

    I don't have an issue with that... they can even take a selfie of me while taking a dump... AND if child safety is at risk, why not protect the kids...

    1. Anonymous Coward
      Anonymous Coward

      Re: Ok

      It is strange to have so little self-worth and yet still have splashed out cash to become an iPhone owner.

      And of course we want to protect the kids. We want to protect their privacy from scum like Apple.

    2. Warm Braw Silver badge

      Re: Ok

      I don't have an issue with that.

      I'm sure an opt-in mechanism can be provided for you.

  23. Anonymous Coward
    Anonymous Coward

    Scope creep

    Phase 1 - We're scanning your images to protect children from abuse

    Phase 2 - We're removing all the files that we think might be used to identify you, to protect your privacy.

    Phase 3 - We're removing all content that Apple didn't sell you, to protect our profits.
