If this doesn't make people stop using iPhones... then what?
Apple is about to start scanning iPhone users' devices for banned content, professor warns
Apple is about to announce a new technology for scanning individual users' iPhones for banned content. While it will be billed as a tool for detecting child abuse imagery, details entering the public domain suggest its potential for misuse is vast. The neural network-based tool will scan individual users' iDevices for child …
COMMENTS
-
-
-
Friday 6th August 2021 03:29 GMT Blackjack
Apple: We care about your privacy; that's why we made Facebook and all those people whose revenue depends on tracking you angry.
Also
Apple: We will scan your photos, but it is only to look for banned content, like images of people abusing children. Do not worry, there is absolutely no way your photos that do not contain any banned content will be leaked and used against you.
-
Friday 6th August 2021 14:24 GMT Zippy´s Sausage Factory
And, of course, don't worry. There's no way we'll give in to the Chinese government and let them control the list of banned images. Or the Saudi one. And even if we did, we won't just immediately hand over your name, phone number and current GPS location. At least, not until the new laws they're already sharpening get passed...
-
Friday 6th August 2021 18:31 GMT David 132
Given that the CCP have passed laws that basically mean criticizing them, anywhere in the world, is an offence, better never ever have photos on your phone of Tiananmen Square Tank Guy, or the Tibetan national flag, or Uighur cultural imagery, or the flag of the independent nation of Taiwan.
Or hope that those at Apple responsible for this get beaten with the Cluestick.
-
Friday 6th August 2021 23:36 GMT John Brown (no body)
"Or hope that those at Apple responsible for this get beaten with the Cluestick."
I wonder if Apple Legal are aware that Apple have the ability to scan data on a user's phone? I'm sure the various law enforcement agencies who spent large sums of money having suspects' phones "cracked", because Apple said it was not possible for them to do it, will all be mighty pleased to hear the news.
-
-
-
Friday 6th August 2021 17:11 GMT Jimmy2Cows
And of course, we'll never get it wrong, and send your details to law enforcement without checking, since we're not allowed to check as that could risk viewing illegal images.
But don't worry, the police will definitely verify what's found really is illegal before smashing your door in at 4am and terrifying your children. Because somebody needs to think of the children.
-
-
-
Friday 6th August 2021 05:23 GMT Anonymous Coward
Actually it's a cunning Apple marketing ploy ...
... If you stop using your Apple phone now then you'll be put on a Paedo Terrorist Watch list.
... If you're using a competitors phone that doesn't implement this technology then you'll be put on a Paedo Terrorist Watch list.
... so you better sign up now with Apple to show that you're 'Clean'.
... The Watchlist may then later be extended to include anyone who expresses any opinion that AI could possibly construe as opposition to The Party in power.
Nothing to Heil, nothing to Fear.
Now just waiting for Priti Patel to jump onto this bandwagon.
-
Friday 6th August 2021 11:12 GMT Anonymous Coward
Re: Actually it's a cunning Apple marketing ploy ...
and Nicola Sturgeon, who will demand all "adult material" is included, what with the SNP policy of banning what THEY deem "pornography" (strong public opposition and the attendant loss of votes being the only barrier to them at the moment, though they are committed to "altering public attitudes to make such material socially unacceptable"). For those who doubt, read up on Equally Safe... a policy with laudable aims on the surface (making women and girls as safe as men in day-to-day life; however, no word of "free", and there's the rub...) but with some seriously Orwellian concepts tucked away... outlawing "pornography" (as women who participate "harm other women indirectly") and "exploitative sexual practices" (again, what THEY deem exploitative).
No wonder they are so touchy about comparisons to dystopian regimes and a certain German government... bit too close to home for comfort, likely...
-
Friday 6th August 2021 14:14 GMT Bbuckley
Re: Actually it's a cunning Apple marketing ploy ...
And once they march through the 'obvious criminal' material, next will be identifying those who do not agree with 'diversity' (as defined by them), and so on and so forth until we really do get to the Orwellian future that this inevitably leads us to. Or. We (the people who object) take up arms and destroy them and free ourselves. World War III anyone?
-
-
Saturday 7th August 2021 20:40 GMT Fruit and Nutcase
Re: Actually it's a cunning Apple marketing ploy ...
"UK government spends more than £163,000 on union flags in two years
Purchases have increased across departments, revealing embrace of the flag under Boris Johnson"
"Let them
eat cake[wave Union Flags]" says World King Borisps
@katrinab - I see you've been downvoted by the resident Boris fan club
-
Tuesday 10th August 2021 02:21 GMT Claverhouse
Re: Actually it's a cunning Apple marketing ploy ...
Or in England, not having sufficient numbers of Union Flags in the background of every photo ...
Unlike, say, America, very few English households display or even have a Union Flag.
I have one somewhere, maybe rolled up in a box in store, but it's a small one, probably left over from a Boy Scout village hall, or a small boat in the 1930s to 50s. Not that I would fly it anyway, as a Jacobite.
The American version of the Boy Scouts are actually charged with respectfully disposing of people's Old Glory via the Flag Code, with no desecration. Just one of the many reasons we consider Americans to be nuts.
-
-
-
Saturday 7th August 2021 09:43 GMT Anonymous Coward
Re: Actually it's a cunning Apple marketing ploy ...
As safe as men?
We need to implement laws that shave an average of 5 years off women's lives and increase their risk of suicide and heart attack, then.
We also need to ensure that women have more industrial accidents too.
Only then will women be as safe as men.
-
-
-
-
Friday 6th August 2021 06:50 GMT 45RPM
As I understand it, this technology has been developed because Apple has built its service in such a way that it can’t scan in the cloud so it has to scan on device. Everyone else, Google included, scans in the Cloud. One way or another, you’ll get scanned - it’s just a question of where.
In the case of Apple’s system, violating images can be decrypted by law enforcement - not all images.
And yes, I can see how this might be expanded to include other ‘crimes’ (inverted commas because the definition of what is criminal depends on the state - in one country, for example, it might be criminal to blaspheme but not to stone someone and vice versa). I hope that Apple will be restrictive about what it scans for - terrorism, paedophilia and that’s about it, but pragmatically speaking, with governments clamouring for a back door into our devices this seems like a sensible middle ground. Let them see what is pertinent to the case, and nothing else.
-
-
Friday 6th August 2021 08:40 GMT 45RPM
Re: re: this seems like a sensible middle ground.
A third party isn’t scanning your device though. The AI on the device is scanning your device. It’s not leaving your device at all unless certain criteria are met.
The way government legislation is going worldwide a tech company has several choices in order to ensure compliance…
1) build in a back door which allows law enforcement agencies (and anyone else who gets the keys on the dark net) to scrobble whatever they like from your device.
2) store everything on the cloud and scan it there (this is the most common choice that big tech makes, especially since they’re already scanning for the purposes of advertising)
3) scan on device and only provide keys to law enforcement for data which is illegal (this is the route that Apple has taken)
4) give up, and only make dumb devices with minimal functionality.
Honestly, it’s Hobson’s choice. I don’t like any option, but option 3 seems to be the least worst option. My principal concern with it, as I said previously, is what constitutes ‘illegality’ - but that concern applies to all the other options except 4.
Genuine question. Leaving aside any issues of platform partisanship, if you were a big tech making the revolutionary new SabroniPhone, and assuming that you didn’t want to get legislated out of business, which option would you choose - and why?
-
-
Saturday 7th August 2021 17:52 GMT katrinab
Re: re: this seems like a sensible middle ground.
Picking up on this, anecdotal evidence seems to suggest that children from Thailand and other countries in the region are more likely to be victims of this particular type of child abuse. Children elsewhere do get abused, but it appears they are less likely to be photographed while being abused.
Does the "AI" decide that any child who looks Thai must be a victim of child abuse, and therefore report any Thai family taking perfectly normal photos of their children doing perfectly normal things that children do?
-
-
Friday 6th August 2021 10:08 GMT Anonymous Coward
Re: re: this seems like a sensible middle ground.
What other crimes might Apple users have committed?
Why stop at calling your customers pedos, why not also call them terrorists? Perhaps they have Jihadist images? Perhaps 9/11 images in a folder labelled 'favorites'?!
Perhaps they have animal abuse images? Maybe you could save a cute puppy from its horrible abuser.
Why stop at text scanning of the messages they send, why not also scan their emails? Not just the attachments; you're going to scan the message text, so why not email text too? All those forbidden crime code words these cosa-nostras might use.
Why stop at a US-provided test model, why not also scan against a Chinese one, provided by China? For images that are illegal in China? Tiananmen Square tank boy?
Or a Russian set provided by Putin? Images of opposition propaganda and other illegal images?
Perhaps images of Mohammed? In muslim countries, so Apple users can be stoned to death.
Isn't that also a crime your customers might be committing in those countries? Shouldn't those users be stoned to death?
Are they drug dealers? Maybe scan their messages and images for drugs and drug related paraphernalia?
Is there anything suspicious about where they go and who they meet, maybe pass their GPS track for approval too?
I mean these Apple users, they's such scum, that Apple needs to protect the world from them, maybe we just stop and search everyone with an iPhone. Just in case.... oh, right, that's what they're doing here.
-
Friday 6th August 2021 11:17 GMT CountCadaver
Re: re: this seems like a sensible middle ground.
Or LGBTQIA folks in many non western countries
Images of Pride flags, support for trans people
People belonging to the "wrong faith"
People who have committed apostasy
People who are members of the "wrong party"
People who oppose / support independence
People who criticise "the leader"
It's a very, very steep and extremely slippery slope towards the horrors of Airstrip One... that, or the UK as envisioned in Futuretrack 5 (Robert Westall, young adult before "young adult" was a genre), a very underrated near-future dystopia where citizens are constantly "psy scanned" for impulses such as violence, anger, suicide etc and lobotomised...
-
Wednesday 11th August 2021 22:58 GMT Fruit and Nutcase
Re: re: this seems like a sensible middle ground.
For images that are illegal in China?
https://duckduckgo.com/?q=winnie+the+pooh&iax=images&ia=images
-
-
Friday 6th August 2021 10:23 GMT confused and dazed
Re: re: this seems like a sensible middle ground.
A thoughtful response. But I think Apple are big enough to just say no.
They will not scrutinise what is on your phone (whatever the method). Have the debate with legislatures out in the open.
Today it's pedos and terrorists, tomorrow it's activists.
-
Sunday 8th August 2021 19:56 GMT Scott 26
Re: re: this seems like a sensible middle ground.
>A thoughtful response. But I think Apple are big enough to just say no.
Except the Chinese ICT market is big (I mean really big). And Apple ultimately wants a slice of that pie. And if the CCP says "you want to operate in our country, then here's some images we want you to scan for"... I bet Apple bends over.
-
-
Friday 6th August 2021 12:30 GMT tip pc
Re: re: this seems like a sensible middle ground.
I’d rather they be honest and state exactly why they are doing this, not try and dress it up as a positive.
If there is a directive every vendor must comply with then they should just come out and say so.
To comply with xyz we have developed blah blah blah.
It’s the underhandedness that I can’t stomach.
Be honest and let us trust.
In 2 months' time this will be largely forgotten and no one will care; people will be scrambling for the next super iThing with off-the-charts neural cognition etc etc.
I might buy up vintage 2020 tech and hope the new oppression software won't run on it.
-
-
-
Monday 9th August 2021 07:47 GMT ForthIsNotDead
Re: Linux phone, here I come...
@cyberdemon - yes - I suspect the powers that be will be making justifications for such tooling any time now!
If Apple are planning to roll this out on their phones, then it just makes sense that they will also roll it out on their laptop platforms. Microsoft will surely follow on Windows. I mean, THINK OF THE CHILDREN!
-
-
-
-
Friday 6th August 2021 12:48 GMT FIA
Re: re: this seems like a sensible middle ground.
A third party isn’t scanning your device though. The AI on the device is scanning your device. It’s not leaving your device at all unless certain criteria are met.
A third party designed the software, initiates the scan, decides the criteria and can then decrypt what they find if they so choose. That sounds like a third party scanning the device to me.
What about option number 5....
5) Tell your customers, shout about it loudly, so public opinion can rally and tell their legislators what they think?
I understand the situation Apple is in, but we as a society don't have to accept this kind of nonsense.
There are many examples of overly broad early-2000s 'terror' laws since being used on regular folk, and plenty of others of overly broad AI being used as a mallet (Amazon firing workers by AI, for example...).
-
-
Monday 9th August 2021 16:01 GMT Cav
Re: re: this seems like a sensible middle ground.
That is flawed logic. Do you seriously think people will still turn up with shoe bombs knowing that they will now be checked? That bizarre logic disregards the fact that the remediation for a problem actually works.
It's like arguing that anti-virus software is unnecessary because you haven't been infected with a virus, because of the anti-virus... Malware changed precisely because anti-virus software was developed.
The same applies to terrorists. Why would you stick to a method for which you know there is now a counter-measure?
-
-
-
Friday 6th August 2021 15:26 GMT Geez Money
Re: re: this seems like a sensible middle ground.
"A third party isn’t scanning your device though. The AI on the device is scanning your device. It’s not leaving your device at all unless certain criteria are met."
Give me a concrete and enforceable (from my end) guarantee that the AI will never make a mistake and send a non-abusive image off the phone and that the technology will never be used for anything else and maybe we can have this conversation.
-
Friday 6th August 2021 19:43 GMT MrDamage
Re: re: this seems like a sensible middle ground.
Fuck off with your apologist bullshit. Magical thinking that only the good guys will have the decryption keys, and no-one else will figure it out, or there won't be a mail leak by a disenchanted ex-employee?
Remember Microsoft's Secure Boot fiasco in 2016?
Or how about when The Shadow Brokers auctioned off NSA tools and exploits?
This shit don't work, and never will.
-
Tuesday 10th August 2021 11:21 GMT YARR
Re: re: this seems like a sensible middle ground.
The way government legislation is going worldwide
Governments in authoritarian countries should have no influence over our data privacy in non-authoritarian countries. If the authoritarian countries insist on violating data privacy it should be done via an API that surrenders access to a separate government-mandated software application that does the analysis/spying function. By default, with no government-mandated software installed the API should be inactive and respect the user's data privacy.
Given most western nations now have GDPR-like rules governing access to personal data, this processing should not happen without user consent. There should be a clear opt-in rather than a mandatory surrender of consent buried in the legal terms of service.
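To make the proposed opt-in concrete, here is a minimal sketch in Swift of such a consent gate. Everything in it is hypothetical (the type names, the gate, the scanner callback): it illustrates the consent model being proposed, not anything Apple has shipped.
```swift
import Foundation

// Hypothetical consent states. The default is "not asked", so the
// scanning API is inert unless the user explicitly opts in, rather
// than consent being buried in the terms of service.
enum ScanConsent {
    case notAsked, declined, granted
}

struct ScanGate {
    var consent: ScanConsent = .notAsked  // inactive by default

    // Returns nil when consent is absent: the scan simply never runs.
    func scanIfPermitted(_ image: Data, using scanner: (Data) -> Bool) -> Bool? {
        guard case .granted = consent else { return nil }
        return scanner(image)
    }
}
```
The load-bearing detail is the default value: with `.notAsked` the API does nothing at all, which is the clear opt-in the GDPR framing above calls for.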
-
Tuesday 10th August 2021 11:46 GMT Anonymous Coward
Re: re: this seems like a sensible middle ground.
"Given most western nations now have GDPR-like rules governing access to personal data, this processing should not happen without user consent. There should be a clear opt-in rather than a mandatory surrender of consent buried in the legal terms of service."
But most of them have outs for when the governments themselves want access. After all, he who sets the rules...
-
-
-
Friday 6th August 2021 19:20 GMT martyn.hare
Re: re: this seems like a sensible middle ground.
Please, just read the damned technical documents.
Their system doesn't scan photos which are only stored on your device, because it depends upon uploading the on-device hash AND the encrypted payload to iCloud Photos for any actual matching to occur, which is then all completed server-side. Your device also can't know if any given image is known CSAM or not without uploading all of the data.
The technical paper says it only matches against the known CSAM database, and it is purely partial matching based on hashing, not machine learning like people have implied. It's Microsoft PhotoDNA, but refactored to securely do half of the task on-device. It can't match new images which haven't first been added to that known CSAM database, just like PhotoDNA can't. But it can technically have false positives. To avoid that being a major issue, they implemented a threshold, so one has to have more than one positive known-CSAM match for anything to matter. To further ensure occasional false positives don't colour any reputations, all devices produce synthetic false matches deliberately, such that pretty much everyone shows up as potentially having some CSAM on their devices at random at some point. The synthetic false matches do not provide valid data in their payloads and can't be used as a way to unlock access to real images, but they cause enough flagging to occur to protect everybody against false accusations based upon numbers of matches.
The design of this new system is clearly built with the idea of later deploying iCloud end-to-end encryption. Otherwise, they’d just be scanning photos entirely server-side after they’ve been uploaded to iCloud like the rest of the industry is already doing and they wouldn’t be making an announcement about it like it’s a big achievement. If they’re not going to implement full end-to-end encryption of the actual photos then this would be a pointless circlejerk as well as an unnecessary hit to PR since I bet most lay people don’t even know PhotoDNA is a thing.
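For readers who would rather see the shape of the scheme than read the PDF, here is a heavily simplified sketch in Swift of hash matching with a threshold and synthetic matches, as described above. It is illustrative only: SHA-256 stands in for Apple's NeuralHash (a perceptual hash that survives resizing and re-encoding, which SHA-256 does not), the database and threshold values are invented, and in the real protocol the match outcome is cryptographically hidden from the device itself rather than sitting in a plain counter.
```swift
import Foundation
import CryptoKit

// Stand-in for a perceptual hash; used only to keep the sketch runnable.
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Hypothetical database of known-image hashes shipped to the device.
let knownHashes: Set<String> = []  // placeholder; the real entries are blinded

let matchThreshold = 10            // invented value for illustration
var matchCount = 0

// Called as each image is queued for upload. Returns true only once the
// account has crossed the threshold and review could begin.
func screenBeforeUpload(_ imageData: Data) -> Bool {
    if knownHashes.contains(imageHash(imageData)) {
        matchCount += 1
    }
    // Synthetic matches per the technical summary: rare dummy positives
    // whose payloads decrypt to nothing, injected so that a small match
    // count can never single anyone out.
    if Int.random(in: 0..<100_000) == 0 {
        matchCount += 1
    }
    return matchCount >= matchThreshold
}
```
Even in toy form the two load-bearing ideas are visible: a single hit does nothing on its own, and because dummy hits exist, any count below the threshold is deliberately meaningless.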
-
-
-
Friday 6th August 2021 12:46 GMT General Purpose
not scanning your device, exactly
What Apple's currently saying is "Before an image is stored in iCloud Photos, an on-device matching process is performed for that image" (https://www.apple.com/child-safety/). That excludes scanning across the phone's library of photos, or across the iCloud Photos library, or the iPhone's iCloud Backup.
The technical summary linked at the end goes into more detail. https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
As it uploads an image, the device runs the matching process and creates a "voucher" for it. The voucher includes the match outcome and an encrypted "visual derivative". When the number of vouchers indicating matches reaches some threshold, the "visual derivatives" in those particular vouchers are decrypted and a process of manual review and action begins.
Whether the whole idea of checking your photos is good or bad, they've clearly put a lot of thought into avoiding scanning entire libraries. At this stage, anyway.
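As a rough picture of that flow, here is a sketch in Swift. The names, fields and threshold are invented, and the real design uses threshold secret sharing, so the server genuinely cannot read the match outcome or decrypt any derivative below the threshold; a plain struct cannot capture that property, only the shape of the data.
```swift
import Foundation

// Hypothetical shape of a safety voucher: one per uploaded image,
// carrying the match outcome and an encrypted low-resolution
// "visual derivative" of the image.
struct SafetyVoucher {
    let imageID: UUID
    let matchedKnownImage: Bool          // opaque to the server in reality
    let encryptedVisualDerivative: Data  // undecryptable below threshold
}

let reviewThreshold = 10  // invented; the real value was not public at announcement

// Server-side gate: only when an account holds enough matching vouchers
// can those vouchers' derivatives be decrypted and manual review begin.
func accountCrossesThreshold(_ vouchers: [SafetyVoucher]) -> Bool {
    vouchers.filter(\.matchedKnownImage).count >= reviewThreshold
}
```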
-
Friday 6th August 2021 13:41 GMT Anonymous Coward
Re: not scanning your device, exactly
So it falsely accuses customers of being pedos PROBABLY, MAYBE, PERHAPS, but it's OK, because the ones that flag more strongly go to a subjective review by a person you don't know, who will then make assumptions about the rest.
So while a man holding a little boy's penis in a photo will be labelled a pedo, a doctor holding a pediatric patient's genitalia prior to circumcision is not, because 'Apple magic cool something or other'*.
* An actual real-world example there. He's an excellent doctor by the way, not a pedo; or maybe he is a pedo, I don't know just from the photos of kids he takes as a doctor doing a before shot.
"Whether the whole idea of checking your photos is good or bad, they've clearly put a lot of thought into avoiding scanning entire libraries.At this stage, anyway."
So its OK, because they only look at SOME of your photos to make the decision as to whether you're a pedo, not ALL of your photos... at least at that stage. And they did think about looking at ALL your photos, so they clearly understand there's a problem here, so that makes it ok right?
Cool, I'm totally OK, with people I don't know calling me a pedo in secret meetings based on limited info.
TAKE MY MONEY! You had me at "VOUCHER"!
Seriously, the world needs to be protected from me, and even though I'm not a pedo, if Apple says I am, then they must be right, and the world needs to be protected by putting me in some sort of magsafe stylish Apple shackles. Because after all, isn't that why you buy a phone? So that people can secretly look at your photos and make accusations against you behind your back? And if Apple make an accusation, who am I to say Apple is wrong and I am right? Did I make a stylish phone? No, I did not! Am I the third largest manufacturer in the world in the smartphone segment? No, I am not! I think that's conclusive then, I must be wrong. Slap those magsafe cuffs on, and take me away!
-
Friday 6th August 2021 16:55 GMT Cybersaber
Re: not scanning your device, exactly
"* An actual real world example there, he's an excellent doctor by the way, not a pedo, or maybe he is a pedo I don't know just from the photo's of kids he takes as a doctor doing a before shot."
My son was circumcised shortly after birth, and there were no pictures taken. Maybe your doctor is related to Joey Tribbiani's tailor?
-
-
Monday 9th August 2021 15:01 GMT Cybersaber
Re: My son was circumcised shortly after birth, and there were no pictures taken.
I did question the need to do that to my boy. I was told by my family doctor that circumcision was for health benefits, and the benefits were listed to me. Now, regardless of whether the current teachings of the medical community still reflect that, I was acting under the advice of my medical doctor at that time.
A century from now, people may very well demonize you for an action you took because it was the best informed decision you could make at the time - just like every generation since the dawn of recorded history has done to societies predating their own.
If you went to your doctor and asked him what you should or shouldn't do in response to COVID-19, are you wise to follow his advice, especially after receiving a convincing explanation of why it makes sense from someone who was just barely a legal adult (as I was when my son was born)?
Think of that before you demonize the rational, informed decisions and the people that made them based on what was known at the time.
-
Monday 9th August 2021 16:44 GMT tip pc
Re: My son was circumcised shortly after birth, and there were no pictures taken.
"I was told by my family doctor that circumcision was for health benefits, and the benefits were listed to me. Now, regardless of whether the current teachings of the medical community still reflect that, I was acting under the advice of my medical doctor at that time."
Trust me, I'm a qualified expert.
If you're wealthy you get different, much more informed answers from someone who has actually considered the circumstances, understands all the prospective treatments and weighs those up against the facts at hand.
For us plebs, you'll get whatever dictate from on high they are trying to push. If it's your choice they will lay it on thick and every "expert" will state the same thing.
Turn 40, take statins. If your weight is more than some chart states then you're obese and will die early.
They won't take into consideration whether you're active or not.
Walking 40 miles a week and cycling 40 miles a week, plus 10,000 steps a day, does not count; they treat you like a 40-stone sedentary bedridden person.
-
-
-
-
Friday 6th August 2021 17:59 GMT tip pc
Re: not scanning your device, exactly
"As it uploads an image, the device runs the matching process and creates a "voucher" for it. The voucher includes the match outcome and an encrypted "visual derivative". When the number of vouchers indicating matches reaches some threshold, the "visual derivatives" in those particular vouchers are decrypted and a process of manual review and action begins."
was just about to post much the same
the crucial bit is that these new capabilities will be in a future iOS 15 / macOS 12 update.
so potentially not invoked for now.
The trick will be to stay on iOS 14 / macOS 11.
Apple recently made a change so that older versions of code will still receive updates, and it's not mandatory to go to the latest major version if you don't want to.
I'm running the next version betas on my MacBook Pro, phone & iPad.
I'll need to research and see if they are safe or whether I need to downgrade to iOS 14 / macOS 11.
-
Friday 6th August 2021 18:25 GMT Nifty
Re: not scanning your device, exactly
"before an image is stored in iCloud Photos, an on-device matching process is performed for that image"
Are you a WhatsApp user with an iPhone? If so, from time to time you may receive photos from anyone able to look you up in WhatsApp. Those photos are stored on your camera roll right now. Go and have a look. In the default configuration your camera roll is synced to iCloud for the last 100 or so images. (You can disable this feature in WhatsApp > Settings > Chats).
Now have another think about the implications.
-
Friday 6th August 2021 18:43 GMT Anonymous Coward
Re: not scanning your device, exactly
Thanks for the link, it's handy to be able to discuss this in terms of what they are actually proposing, as opposed to all the arm-waving conjecture.
That said, based on the outline I am not too enthused about the process. ML just can't deliver on this, and it's a black box whose creators can't even explain how and why it thought it found a match. In addition, human review is both taxing for the humans and prone to human failures.
This system, even before its scope starts to creep, will unintentionally destroy people's lives, while still failing to catch and prevent all incidents of actual child abuse. The only question is scale. And on the first day, a politician will tearfully state: if it saves even one child....
-
-
-
Friday 6th August 2021 12:39 GMT FIA
As I understand it, this technology has been developed because Apple has built its service in such a way that it can’t scan in the cloud so it has to scan on device. Everyone else, Google included, scans in the Cloud. One way or another, you’ll get scanned - it’s just a question of where.
As noted in the article, Apple does scan what you upload to iCloud.
But this is a world apart from device scanning.
If I store some stuff in your garage, you have every right to know what's in there, or tell me to jeff off. But that doesn't give you the right to come round to my house and root through my loft.
I really hope this dies quickly, I don't like Android. <sigh> Back to Nokia then.... I suspect the 3310 still has 50% charge anyway...
-
Monday 9th August 2021 01:35 GMT Anonymous Coward
Illegal search and seizure
Interesting that law enforcement could not do what Apple is doing without a warrant. Perhaps this is another application of outsourcing.
"It is a cardinal rule that, in seizing goods and articles, law enforcement agents must secure and use search warrants whenever reasonably practicable. . . . This rule rests upon the desirability of having magistrates rather than police officers determine when searches and seizures are permissible and what limitations should be placed upon such activities. Trupiano v. United States, 334 U.S. 699, 705 (1948), quoted with approval in Chimel v. California, 395 U.S. 752, 758 (1969)."
-
-
-
Friday 6th August 2021 13:13 GMT FIA
Re: If this doesn't make people stop using iPhones... then what?
I disagree, these things are on a scale.
I know Google scan my email, but in return I get Gmail. I understand this and made the choice, and it's a choice I'm happy with.
Other people aren't and choose not to use it. That's fine.
My phone, however, is my digital safe. It contains a record of my life to a greater or lesser extent. It allows me to communicate with my bank, it contains all my communication applications and a good few months of my recent photos. It also contains the 2FA apps I use.
Retrospectively, I do not want to give Apple the right to rifle through all that. (I don't use iCloud.) That is not a choice I've been informed about.
-
-
Saturday 7th August 2021 09:38 GMT Anonymous Coward
If anything it might cause more people to use iPhones.
"I can't be guilty m'Lord, I use an iPhone, ask them if they found anything on my phone".
If you're a perv, just keep a clean iPhone and have Apple as part of your defense.
Apple is about to start defending more deviants than it catches.
-
-
Thursday 5th August 2021 22:19 GMT doublelayer
Two possible approaches
There are two methods this could take:
1. A model is created from the photos on Apple's end and the phone uploads its pictures to a server at Apple to do the comparison. This involves a mandatory leak of data which a user can't disable and, as Apple doesn't own the devices themselves, is currently illegal.
2. A model is created by Apple and sent to user devices, which scans the pictures onboard and sends the result to Apple. This is more likely to be legally viable, but it is going to cause a lot of problems as the processors in a mobile device are a lot weaker than a server and most models for picture comparison are likely to be large. There will at a minimum be complaints about the network bandwidth and CPU time needed to run this check, especially as I assume the model will get run every time a user takes new pictures and whenever new source material is added causing a model update. In addition, they are either going to have a lot of false positives or false negatives, meaning they'll need some method of determining whether an image is a false positive. Automatic uploads are still legally questionable, so this might result in a lot of suspicious reports which can't be verified. With the alternative that it is mostly useless though, I don't know whether they will accept a high false negative rate.
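Since the whole difference between the two approaches is where the bytes travel, a schematic sketch in Swift may help (every name here is hypothetical; it illustrates the two data flows described above, not any actual implementation):
```swift
import Foundation

// Approach 1: the raw image leaves the device and the comparison runs
// on the server, beyond the user's control.
func serverSideCheck(_ imageData: Data, upload: (Data) -> Void) {
    upload(imageData)  // the full image crosses the network
}

// Approach 2: the model or hash database ships to the device and only
// a verdict leaves it; the cost is on-device CPU, storage, bandwidth
// for model updates, and the false-positive handling described above.
func onDeviceCheck(_ imageData: Data,
                   matcher: (Data) -> Bool,
                   report: (Bool) -> Void) {
    report(matcher(imageData))  // only the outcome is sent
}
```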
-
Thursday 5th August 2021 22:40 GMT elsergiovolador
Re: Two possible approaches
They are just going to change the license so that somewhere in the small print you'll consent to that.
If you decline, then your phone will stop working.
They are getting away with not allowing other app stores to run on iPhone, I can't see why they wouldn't be able to do this.
They'll give a wink wink to the three-letter agencies about them being able to scan phones to their hearts' content, and it will be more legal than scrambled eggs on a Sunday morning.
-
Friday 6th August 2021 13:37 GMT General Purpose
Re: Two possible approaches
According to Apple, it's not your #1 and only partly like your #2. Yes, they send a database of hashes to your phone, but (they say), they don't scan the phone, they test an image as it's uploaded to iCloud Photos. In terms of bandwidth and CPU time, that's feasible. They say they'll start a manual review of an account when some threshold number of matching images is reached, not on each individual match.
-
Friday 6th August 2021 16:19 GMT doublelayer
Re: Two possible approaches
Yeah, that's number 2 exactly and the provisos still hold. In order to do the scanning, they will need to send each phone a copy of the model built from a big database. That's going to be a large file. Running it takes time. Not to mention that, although they're not scanning everything yet, there's little doubt that someone will find out that models get updated and they will need to use their new model to recheck the old pictures. Furthermore, there are people who don't upload photos to iCloud. I am one of those, mostly because I don't take many photos, but also because I have only the free storage and I don't want it filled with random pictures taken for temporary reasons. The scanning as specified wouldn't scan mine at all, so they're almost certainly going to change it so it does.
-
-
-
Thursday 5th August 2021 22:26 GMT Peter Prof Fox
A stalking-horse for copyright protection
I have a copy (of uncertain provenance) of that iconic 1977 session by the Five-aside-archers with Beetle Wulvis on bass guitar. Sony Music just happen to have signed a hoover-up contract with somebody who claims to have some derived title to the tracks. In England they might sue me under copyright law, but if it wasn't currently for sale they can't show a loss, so that's two fingers to them. (They have to demonstrate a loss.) But with other (c) enforcement regimes there are other consequences. Furthermore, having and playing (in England) is neither a crime nor a civil tort unless they can prove I'm not entitled. (For example, my copy came from Big Joey Frobisher himself.) But in this new world, "YOU HAVE COPYRIGHTED MATERIAL ON YOUR DEVICE! GO TO JAIL. DO NOT PASS GO." is the default position these mega corporations expect us to accept.
-
Friday 6th August 2021 17:08 GMT MachDiamond
Re: A stalking-horse for copyright protection
"(They have to demonstrate a loss.)"
No, they don't. If the Copyright was registered in a timely manner, they can get statutory damages which are penalties defined by law, not commerce.
I have images that are not up for license. If somebody stole one of my backup drives and published one or more of those images, they could be liable for up to US$150,000 for each infringement. They are all registered and I choose to not offer them for various reasons. I used to do much more journalistic work and I have images that are rather ugly for one reason or another. One of the rights included with Copyright is the power to say "no".
-
Monday 9th August 2021 08:51 GMT Anonymous Coward
Re: copyright protection..this is how it plays in the real world
Got some good lawyers and the very deep pockets to pay for them? Otherwise your "copyright" is basically worthless. Even then you better hope you get a judge who knows even the basics of copyright law.
I knew a starving artist type whose iconic image was blatantly ripped off by a software company. Those of you of a certain vintage will know both the image and the software. The artist was so tech-illiterate that he only heard about the rip-off by accident. He approached the software company asking for an exceptionally modest license fee considering how much money they had made (many millions), but despite parading their ultra-progressive politics in public they told him to get lost. A friend of the artist, a famous musician all of you would have heard of, was so disgusted by this shabby treatment of his friend that he put up the money for lawyers to sue.
So the starving artist had deep pocket support and very famous backers with access to serious legal firepower. Plus a witness willing to swear under oath that they had heard first hand one of the founders of the software company tell the story of how they came to use the starving artists image for the software. It was done in the full knowledge of where the image came from and who owned it and they never had any intention of paying any license fees for it.
Sound like a slam dunk? Nope, the first hearing was in front of a judge who knew so little about copyright law that he threw out the case because the work had not been registered with the Copyright Office. Even though the very experienced copyright lawyer working for the starving artist pointed out that that was not how copyright law works.
The case would have been refiled, and everyone was telling the starving artist it should be, but the starving artist, who was genuinely one of the nicest people you could possibly meet, decided that life was too short, and anyway, the fact that so many people had rallied around to support him was consolation enough. He was really touched by the fact that so many had gone out of their way to help him.
And the happy ending? Of course not. The founders of the software company despite promising the staff they would be going public so the employees would get a nice bonus for all their years of hard work sold the company and kept all the money for themselves. One founder used their new found (serious) wealth to bankroll one of the most high profile ultra left political organizations of the last few decades. Every time I see that organizations name bracketed with some demands for equity, rights, against "oppression" etc etc, I think of where that organizations money originally came from and all the people that were screwed over to make all that money. There was a very large number of them.
Including that starving artist.
-
Friday 6th August 2021 21:47 GMT Long John Silver
Re: A stalking-horse for copyright protection
That was my first thought too upon reading the article.
The proposal as stated needs examining in the light of cynicism and pragmatism.
I don't doubt most (near all) executives of companies tempted by this technology support curbing, by feasible means, the spread of illegal pornographic images. Doubtless as private citizens they would report suspicious materials via existing channels. That can be deemed a moral imperative.
There is no moral, legal, or business obligation to set their companies up for systematically monitoring 'content' passing through their hands. There would be absolutely no commercial sense. If by chance they fall upon it then citizen obligations arise.
Introducing the technology for the purpose stated in the article is most unlikely to be an efficient or effective use of resources to tackle the underlying problem, which is the creation of recent (potentially source-traceable) images, with the prospect of the perpetrator manufacturing more through continuing abuse.
Other readers have made a plausible argument for why using databases of images, and/or an AI capable of differentiating hitherto unknown images depicting harm from an innocent photo, would raise immense ethical and legal problems merely through the fact of the screening process taking place. Add to that differing jurisdictional criteria around the world for distinguishing abuse from art (age being a factor too), and the only safe ground upon which screening could operate would be that of images so dreadful that agreement by legislatures is near certain. The last option may be justifiable, but why oblige business to seek needles in haystacks? After all, one suspects the most prolific abusers are phone- and Internet-savvy and therefore not prone to stuffing their material onto clouds. A similar point arises regarding the other weasel concept justifying surveillance: terrorism.
Tracing and curbing copyright infringement makes more sense. Even companies not in the business of creating or distributing copyrighted 'content' could be induced to take part in screening customers' data. Copyright rentiers would pay a fee for the service.
The shaky ground upon which copyright rests has been revealed by the Internet to all who care to look. Purveyors of film and music are fighting a rearguard action. They are becoming ever more desperate. Unofficially streamed sport (consequent upon rampant price-gouging by official outlets) has joined the centre of attention. On the bright side, academic publishers have lost the battle but most don't yet know it.
For some time I have thought the final battle will take place on the turf of proprietary operating systems. Here and elsewhere I have mooted that Microsoft will discover a lucrative market by offering Windows, at least the household versions, as Internet access guardians. 'Consumer' versions are pretty much battened down regarding scope for user modification and bypassing some features. For instance, updates, especially those purporting to correct security flaws, can, at most, briefly be delayed. 'Windows Defender' is almost mandatory and it should be easy to make it so.
'Defender' represents an engine already in place to serve copyright rentiers, to make surveillance by advertisers with their trackers and by other agencies unavoidable, and to snoop for illegal 'content'. 'Security updates' will be the means to refresh hash databases and so forth. Returning data to Microsoft home poses little challenge because Windows is doing so all the time already. Given low-cost high-bandwidth connections and agile processors in devices, the ordinary user will notice no difference. Those needing to push devices to their limits already have, or will, migrate to open source operating systems where users have complete control over configuration. 'Gamers' will stay with Windows only because Linux versions of popular games are sparse. Perhaps they will remain scarce when 'games houses' grasp the great potential for Windows to control DRM more than now.
'Defender', upon recognising files containing unlicensed 'content', could have several options: disable playback/viewing, erase the file, and call home with details. 'Defender' could also render darknets inoperable; either prevent Windows from running on them or, with greater subtlety, make access slow or by other means unreliable.
Introduced stepwise and without public fanfare, these measures need not be noticed by most users. The MSM, in thrall to governments and copyright moguls, is unlikely to raise a fuss. Muttering could be stifled by appeal to the welfare of children (not just regarding abuse) and to fear of the ubiquitous terrorists lurking behind trees.
That, I suggest, will be the final battlefield and I cannot predict the victor. One outcome will be surveillance societies with restricted freedoms and human culture all but dead; the enthusiasm with which people have adopted the chaotic Covid narrative makes this plausible. The other will be abandonment of rentier economics. This will release considerable opportunities for nations to use more productively the resource currently channelled, usually overseas, to parasites rather than genuinely creative people. Release of restriction on 'derivation' should lead to cultural renaissance: hitherto unparalleled innovation and reduction of grossly inequitable wealth distribution.
-----
Released under the Creative Commons Attribution 4.0 international licence.
-
-
Thursday 5th August 2021 22:36 GMT Ian Mason
So Apple have solved the problem of what to do with that pile of cash?
That'll be handing it over to all the people who sue them for this. If the police in most civilised countries need a warrant to search your possessions for unlawful material, what authority do Apple claim for this gross abuse of civil liberties? What theory of law do they have that they think they have carte blanche to start searching through people's phones?
What do their marketing department think about all that money they've wasted touting Apple's privacy credentials, now that another part of Apple has just completely trashed those credentials overnight?
Really Apple? With all your shouting about privacy I really expected better from you.
Anyone who says "Think of the kiddies" and there's no doubt going to be some here: It's always used as the excuse for inserting the thin end of the wedge, and then at the first excuse whacking the other end with a bloody great mallet. A recitation of the evils that follow bending the rules of civil hygiene for some "special case" ought not to be necessary. But for those thinking "Well, it's only going to affect paedophiles" anyone with one jot of sense knows that if this is permitted then there will be another "good and worthy" case permitted, then another less worthy until it trickles down to the point where your phone's camera will feed the onboard AI, which will note the double yellow line you just parked on and immediately debit your bank account the parking fine and put points on your digital driving license.
-
Friday 6th August 2021 11:22 GMT CountCadaver
Re: So Apple have solved the problem of what to do with that pile of cash?
Govts in most "civilised countries" will simply pass very broadly worded "child protection" legislation that explicitly allows this and creates severe legal penalties for attempting to block or bypass it (the same way refusing a breath test in many places gets you in front of a judge), along with the implication that you are a child molester.... Nothing like public pressure to get the proles to comply, for fear of ostracisation, violence, vigilante murder etc....
-
Friday 6th August 2021 14:01 GMT GruntyMcPugh
Re: So Apple have solved the problem of what to do with that pile of cash?
Privacy is a fundamental human right. At Apple, it's also one of our core values. Your devices are important to so many parts of your life. ... We design Apple products to protect your privacy and give you control over your information.
Privacy - Apple (UK): https://www.apple.com/privacy
I guess there's something in the small print about Apple Privacy being slightly different from how we understand privacy.
-
Friday 6th August 2021 17:54 GMT DevOpsTimothyC
Re: So Apple have solved the problem of what to do with that pile of cash?
give you control over your information
Your information, not your content :(
Looks like Apple is the first big company to openly state it's starting down the thought police route. While the other companies might monitor and profile you, I wasn't aware they also informed the authorities if they didn't like something you did (unless it was to steal their IP).
-
-
-
Thursday 5th August 2021 23:09 GMT David 132
Re: Won't Someone Think of the Children?
"We're only using it to scan for images of kiddie porn. Are you on the side of the kiddie fiddlers?"
"We're only using it to scan for images glorifying terrorism. Are you a terrorist?"
"We're only using it to scan for images of serious crimes. Why do you sympathize with murderers?"
"We're only using it to scan for images of banned ideologies. What, are you a neo-Nazi?"
"We're only using it to scan for images of racist or TERF nature. Such thought is wrong and is banned."
"We're only using it to scan for images showing support for Antifa."
"We're only using it here in China to scan for images showing you support the Hong Kong democracy movement."
"We're only using it here in Spain to scan for Catalan separatist imagery..."
I won't stoop to repeating the old Pastor Martin Niemöller quote cliché, but there's a definite slippery slope argument to be made here.
-
Friday 6th August 2021 10:50 GMT Anonymous Coward
Re: Won't Someone Think of the Children?
For those who think this will only affect paedophiles, I suggest they read up on what happened to Julia Somerville at https://en.wikipedia.org/wiki/Julia_Somerville#Allegations in the pre-iPhone age. If I were you I would make sure that you never take a photo of your kids on your iPhone after Apple introduce this, unless you want to risk the knock on the door in the middle of the night from Constable Plod.
-
Friday 6th August 2021 11:23 GMT Anonymous Coward
Re: Won't Someone Think of the Children?
From what I understand... at the moment the tech shall see if you have any of the 'known bad' images... the ones that depict TRUE CHILD ABUSE...
But... if it is based on a Neural Net... learning that naked children are a sign of abuse... then there shall be a problem.
-
-
-
-
Friday 6th August 2021 09:42 GMT jdiebdhidbsusbvwbsidnsoskebid
Re: Don't use your iPhone in church
Surely you won't actually get prosecuted in cases like that: As soon as the photos are shown in a court to be innocent, the case will be dropped.
But what worries me is that the route to that absolution will be horrible for anyone falsely accused. Police will be knocking on your door at strange times, your and your family's computers will be seized, your employer won't trust you anymore, and when your identity inevitably gets leaked, the local paediatrician* haters will be smashing your windows every night for months while the slow gears of the law grind away.
For things like this, I'm normally of the "if you have nothing to hide you have nothing to fear" mindset. But as soon as I read that Apple will be using AI to find incriminating evidence, I worry. Unless they have real humans very early on in the image identification process, the false positives that the AI will inevitably throw up could cause a lot of hassle - the sort of hassle that can never be undone.
*https://www.google.com/amp/s/amp.theguardian.com/uk/2000/aug/30/childprotection.society
-
Friday 6th August 2021 11:14 GMT Lil Endian
"the local paediatrician"
Yes, that has been at the forefront of my mind too.
Social perception when accusations of paedophilia have been made erroneously does not easily reverse, if at all. When idiot zealots can't even read...
It's life destroying for those involved.
Of course, there will always be cases of genuine concern which are shown as innocuous, in all areas of law. But when the case is child abuse related people assume "no smoke without fire" indefinitely. A tough situation indeed.
-
Friday 6th August 2021 12:07 GMT Anonymous Coward
Re: "the local paediatrician"
> Social perception when accusations of paedophilia have been made erroneously does not easily reverse, if at all. When idiot zealots can't even read...
> It's life destroying for those involved.
My wife is a teacher - the last thing I want to have to deal with is a false alert from some dubious piece of software that's been "verified" by a half-wit whose only qualification for the job of staring at other people's pictures all day is that he is too stupid to get any other job.
-
-
Friday 6th August 2021 11:27 GMT CountCadaver
Re: Don't use your iPhone in church
HAHAHAHAHAHAHAHAHAHAHAHAHA
You seriously believe that?
Given how even in the UK judges have an eye on politics and err on what they deem "the safe side" (i.e. someone urinating in the street was put on the sex offenders register... gross, but hardly rape), notice how insidiously worded "sex offender" has become, encompassing quite a range of stuff that a large percentage of the populace wouldn't deem "sex offences", and no one able to challenge it for fear of being classed as a "pedo apologist" or a "peeeeedo" themselves.
Giving people an unrestricted right to vote was a piece of stupidity.... look where it's gotten us.......
Something to be said for those who make everyone else's life hell being "reminded of their place"
-
Friday 6th August 2021 18:07 GMT tiggity
Re: Don't use your iPhone in church
And given the massive closure of public toilets in the UK over the last few decades, urinating in public is often a necessity (with age you find micturition more frequent, and unless you massively dehydrate yourself in advance (not a good idea) a long walk as a senior citizen often needs a "loo break", often alfresco if there are no facilities around, e.g. in the countryside).
-
-
Friday 6th August 2021 17:17 GMT MachDiamond
Re: Don't use your iPhone in church
"Surely you won't actually get prosecuted in cases like that: As soon as the photos are shown in a court to be innocent, the case will be dropped."
If you wind up in court as a defendant charged with possession of child pr0n, your life is over. That arrest will show up in the Big Data files that large companies use to track people working for them and job applicants. The file that says all charges were dismissed even before trial doesn't sit side by side with the arrest notice. These Big Data companies are also not subject to the same sort of laws that credit reporting agencies are. There are many more of them and you may not have heard of most of them. They have no legal requirement to share the information they show for you with you. They have no legal requirement to purge any information that may be untrue. All you will notice is that you are passed over for promotion over and over or you don't hear from companies you have applied to. You'd have to be very lucky to have somebody tell you that there is derogatory information about you that earned you a down check. At least at that point you will know your life is screwed and it's time to go into business for yourself.
-
Thursday 5th August 2021 22:41 GMT billdehaan
People went to digital photography to get AWAY from this
Back in the 1990s and early 2000s, there was a "think of the children" panic in Canada, and crusaders went on the tear to get the police and government to "do something" to stop it.
In the middle of this climate, I know of three cases where people ended up getting visited by police investigating them for alleged child pornography.
One case was a Japanese anime, as in, a cartoon, with no actual humans being filmed, let alone children.
The other two were the result of photo development. Those old enough to remember actual film cameras know that unless you had a darkroom, chemicals, and skill, you needed to go to a photo developer to convert your raw film into actual snapshots. Camera stores did it, of course, as well as specialty outlets like Fotomat, but one of the most common photo development places was, oddly enough, the pharmacy. And it was pharmacies that called the cops on two people getting their photos developed.
The first case showed the shocking picture of a nude 5-year-old boy with his pants down, on the sidewalk, with a scantily clad 3-year-old girl next to him. In other words, a 3-year-old girl had snuck up on her big brother and pulled down his swimsuit. Mom happened to be taking pictures of her kids in the pool, and couldn't resist getting a snap of her kids pranking each other.
The second case was similar, with a grown woman in a bathtub with a 2-year-old boy, who decided to make an obscene gesture to shock his mommy just as Daddy walked in. In other words, a typical "Jim, get in here and see what your son is doing" family moment.
Fortunately, in both cases, the police officers were parents themselves and not idiots, and when they visited the families and saw that the kids photographed were the children of the photographers, they realized that the photo developers had completely overreacted. But as you can imagine, those families stopped sending their film out to be developed, and went to digital photography.
Now, you don't even have to drop your film off to have busybodies report you to the cops, your camera vendor will do it as soon as you take your picture.
There's no way that this won't be abused, both by companies, and governments.
-
Thursday 5th August 2021 23:20 GMT JimboSmith
Re: People went to digital photography to get AWAY from this
Not forgetting Julia Somerville and her ordeal. https://en.wikipedia.org/wiki/Julia_Somerville#Allegations
-
Friday 6th August 2021 06:57 GMT MrBanana
Re: People went to digital photography to get AWAY from this
Back in those days, while on holiday in Sicily, in a very cold hotel bedroom, I had to go to the loo during the night. I don't wear pyjamas so made a quick dash. My wife snapped me, "in motion", on the way back. Boots processed the film, and placed a "this picture is underexposed" sticker over the offending area. Just to stress, it was January, very cold, marble flooring, so it was an unusually small sticker. Otherwise, a larger, overexposed sticker would have been more appropriate.
-
-
Friday 6th August 2021 14:16 GMT GruntyMcPugh
Re: People went to digital photography to get AWAY from this
In the mid eighties I had a Saturday job at Woolworths, and we did photo processing. My boss had a knack for spotting nervous customers, so he would make a little mark on the bag their film went in. When the pictures came back, he'd check for skintones. He wasn't often wrong. Apart from his wrongness in doing what he did, of course.
-
Friday 6th August 2021 17:47 GMT Arkeo
Re: People went to digital photography to get AWAY from this
That's why I rarely use my cheap-a$$ Droid for photos--got a Nikon DSLR for that thankyouverymuch. I started to fear something like this would happen when Google nuked Picasa (local) in favour of GPhotos (online). My photo library (mostly sea- and city- and land-scapes anyway, occasional portraits) is safely offline. And why my ancient Picasa 3.9 installer is safely stored in every (offline, of course) backup I manually do. And I'm on Win11, but still using it. Call me old-school...
-
-
-
-
-
-
-
Friday 6th August 2021 14:29 GMT Anonymous Coward
Re: Real cameras
Why don't they turn on the camera and remotely take a look at what suspected pedos are up to? Perhaps they're molesting kids WHILE Apple is busy going through their review process or paperwork or something.
At least turn on the mic remotely and have a listen to save some kids from those pedo Apple customers!
I really don't get what the problem is: if you're not a pedo, hold up the phone and wave it around wherever you are, so an Apple employee can confirm "not molesting" to his own satisfaction. Perhaps that Apple employee will be as professional as gandalfcn.
-
-
-
Friday 6th August 2021 11:32 GMT CountCadaver
Re: People went to digital photography to get AWAY from this
The UK said anime would land you on the sex offenders register as a "pseudo-photograph of child abuse imagery / indecent image", along with "making child abuse imagery" (the courts ruled that a computer downloading a file counts as "making", but the public reads it as "taking images of children being abused"). And the prudes keep pushing the term "kiddie porn" to make a link between pornography and child abuse so they can outlaw pornography....
We are headed rapidly into a fascist dystopia....worse than the Inquisition....
-
Friday 6th August 2021 16:15 GMT MalIlluminated
Re: People went to digital photography to get AWAY from this
An important lesson from those days was “always invite the girl at the photo counter to your parties.”
Today, digital cameras exist, as do multiple means of online and offline storage. I can’t see where this invasion of my privacy will actually solve the stated problem.
It seems the real problem is that people can’t be trusted not to do atrocious things. I’m thinking of torture and blowing up people in other countries because you don’t like them. So I think when the government is ready to allow me to root around in their phones and cloud storage for immoral behavior, perhaps a compromise could be reached.
-
-
Thursday 5th August 2021 22:45 GMT Anonymous Coward
It doesn't surprise me
Apple has a history of kowtowing to various country's demands for suppression of or access to data.
This seems like an attempt to avoid a US mandated backdoor with a "limited" front door.
I wonder if the initial research was done to help China monitor their iPhone users.
-
-
Friday 6th August 2021 14:29 GMT Cuddles
Re: It doesn't surprise me
Money. Never make the mistake of thinking Apple, or any other large corporation, has actual principles. They never cared about your privacy, they simply calculated that claiming so was a good PR point to compare themselves to competitors (ie. mainly Google). As long as they thought it would be more profitable to make a show about privacy than to join in the spying themselves, that's what they did. Now, they think there is more money to be made by making a show that they're thinking of the children than there is standing up for privacy. Nothing changed other than the details of a cost/benefit analysis by the bean counters.
-
-
-
-
Friday 6th August 2021 11:29 GMT karlkarl
Citation? No need. You can easily run the experiment yourself. Just try to copy a bunch of cracked games onto your desktop and observe the results.
You will experience something similar to:
https://www.reddit.com/r/Piracy/comments/9xuzp5/windows_keeps_deleting_crack_files/
https://www.bleepingcomputer.com/forums/t/752884/windows-defender-results-are-very-confusing-when-detecting-cracked-software/
https://www.reddit.com/r/TPPcrack/comments/3lgvyj/windows_defender_just_keeps_fucking_up_the_crack/
Then try to copy and execute a known piece of malware or a trojan. Very likely you will observe nothing at all (or at least not until files start to go missing and adware appears) ;)
-
-
-
-
-
Friday 6th August 2021 07:48 GMT Cereberus
Semantics
Apple will not hold an unencrypted copy of the database :)
Apple will have the ability to remove files from the database and decrypt them, but the database itself will remain encrypted.
"Only to the usual Apple haters and the terminally paranoid"
Does terminally paranoid mean you aren't paranoid enough? After all you were paranoid and thought everyone was out to get you. You were right but didn't take enough precautions and they got you.
-
Friday 6th August 2021 10:32 GMT Splurg The Barbarian
I would be very, VERY surprised if it is actually a database of photographs/images. It will more than likely be a database of known hashes of offending images, which will be added to by law enforcement agencies as they examine hardware and identify these images.
Same is done in UK with our version.
-
-
-
Friday 6th August 2021 13:02 GMT Splurg The Barbarian
I certainly do not support Apple, far from it. I have many, many criticisms of Apple and the "cult". But anyone believing that Apple have been given a database of indecent images depicting the sexual abuse of children needs to give their head a wobble. The authorities, certainly in the UK, where I have professional experience of the processes, have a national database of known images in which the hashes are stored, NOT the images. This is what images are compared against and flagged from.
MS do this in their cloud offerings and have done so for at least a decade. The issue for me is the fact that it appears from the statement, that this will be done on the DEVICE, to images set for upload to iCloud. Also once the precedent is set for doing this on an individual's device what is then next?
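For readers unfamiliar with hash-list matching as described above, here is a minimal sketch. Plain SHA-256 stands in for the perceptual hashes (PhotoDNA and the like) that real systems use, and the digest shown is made up to stand in for a law-enforcement database entry:

```python
import hashlib
from pathlib import Path

# Hypothetical hex digest standing in for a law-enforcement hash list;
# real systems use perceptual hashes, not plain SHA-256, so this is
# only a structural illustration of "compare hashes, never hold images".
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def flag_matches(photo_dir: Path) -> list[Path]:
    """Return the photos whose digests appear in the known-hash set."""
    return [p for p in photo_dir.glob("*.jpg") if sha256_of(p) in KNOWN_HASHES]

if __name__ == "__main__":
    for hit in flag_matches(Path("photos")):
        print(f"match against known-hash database: {hit}")
```

Note that a cryptographic hash like SHA-256 changes completely if an image is so much as re-saved, which is exactly why production systems use perceptual hashes instead; a sketch of that idea appears further down the thread.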
-
Friday 6th August 2021 16:29 GMT doublelayer
You can't train an AI on hashes to detect new offending material, which is what they said they wanted to do. I don't think they're going to do anything other than create that model from the data, but if they're doing it at all, they'll need a method of running it on the real pictures. They could easily develop this on a database which they don't hold and from which they can't extract the images without sending out an alarm, so it doesn't mean they're storing it themselves or in perpetuity.
-
-
-
-
Friday 6th August 2021 13:05 GMT Ian Mason
You can't train an AI on hashes, it has to have the original images.
In the UK at least, as originally put into law, mere possession with no regard to the intent of possessing such images is a criminal offence.
This originally led to a regime of selective prosecution just to work around the sheer stupidity that the police were committing criminal offences by retaining the same as evidence. I believe that particular stupidity has been legislated away, but mere possession is still strictly illegal for individuals/companies whether they know they are in possession or not, and whether they are in possession for what anybody would see as a legitimate purpose (e.g. to create hashes, preserve evidence to hand to the police etc.). Witness the senior Met. police officer (Ch Supt Novlett Robyn Williams) who was prosecuted for possession when she claimed not to even know that someone had sent her the material.
-
-
-
Friday 6th August 2021 00:13 GMT Chris Gray 1
Bandwidth!
I don't own any Apple devices (surprisingly, I'm OK with the walled garden, but I can't afford them, and since I run Linux, getting images off an iDevice is poorly supported...), and it now looks like I never will.
My current cell-phone is an 8-year old Samsung S4, and my data plan is tiny by most standards. Any attempt by Google to do this sort of thing will result in me going back to a "feature phone", for purely financial reasons.
-
-
Friday 6th August 2021 08:33 GMT Anonymous Coward
Re: A new iPhone meaning for "jailbreak
It's a good idea, in principle, just like nuclear energy - can be a force for good but some people will find ways to make it otherwise.
My initial concern, even before it gets deliberately misused, and like a few others have commented, is how pictures of your own children/grandchildren will be treated. Like the ones of them playing in the bath, or toddlers on the beach, for example. Someone has already referenced the Julia Somerville incident; if it's all automated, such cases will escalate before common sense gets a chance.
-
Friday 6th August 2021 09:46 GMT gandalfcn
Re: A new iPhone meaning for "jailbreak
"Apple’s tool, called neuralMatch, will scan images before they are uploaded to the company’s iCloud Photos online storage"
"comparing them against a database of known child abuse imagery. If a strong enough match is flagged, then Apple staff will be able to manually review the reported images, and, if child abuse is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children (NCMEC) notified.
Since the tool only looks for images that are already in NCMEC’s database, parents taking photos of a child in the bath, for example, apparently need not worry. But researchers worry the matching tool – which does not “see” images, just mathematical fingerprints that represent them – could be put to different purposes."
-
Friday 6th August 2021 10:40 GMT Anonymous Coward
Re: A new iPhone meaning for "jailbreak
Problem is that will weaken Apple's 'sorry, we have no way to break in to the perp's phone' excuse to block unlocking/accessing data, as they have in the past.
If Apple staff can review images, then why can't <TLA> have access to everything at the drop of a court order?
-
Friday 6th August 2021 10:44 GMT Anonymous Coward
Re: A new iPhone meaning for "jailbreak
No, it looks for an AI model to *approximate* that image set. So yeah, naked kids in the bath are likely a high false positive.
I don't know how it works from there, you get flagged as a pedo or something? An officer visits and demands access to your phone on the basis that you're a pedo and he has probable cause provided by Apple. They'll presumably pore over any images and text and emails and browser history and porn surfing and so on to assess how much of a pedo you are? Or perhaps you're some other criminal?
I don't know if Apple will automatically pull your data from your phone for them, so this search might happen remotely without your knowledge. If you don't see it, it doesn't count, right?
But that's OK, they're protecting your kids from you, and you did mention how you like to take naked photos of your kids in the bath, under the nom de plume 'Gandalf', which I have to say is awfully suspicious!
-
-
-
-
Friday 6th August 2021 01:57 GMT uncle grumpy
After fighting off Ring and Alexa, my hardware has risen against me. Will there be any place to migrate to? Of course not, the dominoes will fall in rapid succession. Bill Barr must be jumping for joy. Imagine the zeal the next regime will use, although I'm sure the current regime is supportive as well. I'm so pissed at the betrayal of these weasels I can barely see straight.
-
Friday 6th August 2021 02:03 GMT Clausewitz 4.0
Solution
Just like what Signal Private Messenger did with "aesthetic" messages for Israeli-Cellebrite (this chapter isn't finished, expect more news), the solution here is also simple.
Phone-Devs can embed hundreds of digitally-created-naked-children-fake-photos (not real ones) into files not viewed by the user, including fake geotags like for example FBI offices, Apple offices, or even the Pentagon.
I do not endorse adult games with children, but this tech must go.
-
Friday 6th August 2021 09:37 GMT Anonymous Coward
Re: digitally-created-naked-children-fake-photos (not real ones)
there are steps to make this illegal too. I mean, chopping a digitally created man's head off, stamping on it, turning it into pulp (possibly to the sound of digital onlookers applauding, etc.) - perfectly legal. Though you have to pay to play the... 'game' (yeah, let's call it a 'game', cause it's a game, advertised as a game, eh). Raping digital children, probably not quite illegal, but... Interesting, and not a one-sided argument.
-
-
Friday 6th August 2021 04:08 GMT Pirate Dave
"I don't know exactly what the neural network does"
Those are very, very important to machine-learning for the T-800 class.
So, err, Apple made a huge stink in the media for a few months about refusing to unlock a phone for the FBI (or was it the CIA/NSA?), part of which, if memory serves, they claimed was because of "customer privacy" and end-user "trust" of Apple. But then they take it upon themselves to scan everything on EVERYONE'S iPhone in the search for kiddie-porn? So if they find such pics, they won't alert the authorities, right? Because "customer privacy" and "trust" and all that. Right?
I guess next year, they can focus on searching for pics of animals being abused. The year after, they can look for wives/girlfriends being abused. Followed by searches of pics of supporters at unpopular political party conventions. Damn, if they'd just turn on the mic as well and record every sound, they'd add a whole new dimension to their search endeavors, and would win great favor with the Party.
Apple has gotten half of the population voluntarily addicted to the greatest societal-suppression device ever invented. If someone checked, I bet Stalin's corpse has got a full woody right now.
-
-
Friday 6th August 2021 13:10 GMT Splurg The Barbarian
Yup. I worked as a Forensic Computer Analyst for a police force, and anyone using the phrase "child pornography" was always corrected. The phrase was never used in the department by us.
Pornography is legal and used by people for titillation. These are "indecent images depicting the sexual abuse of children". I always feel the term pornography diminishes a little the watching, collection, creation of these images.
-
Sunday 8th August 2021 12:01 GMT Anonymous Coward
That's the problem with images whose sexual nature depends on the arousal of the viewer.
In your heads you're professionals, trained and infallible and not mapping your sexuality onto your judgement. Yet I bet lots of images you class as kiddie porn will not involve actual sex, and the sexual component is in your (the viewer's) head. In effect your sexuality is filling in the blanks to turn it into a crime.
Can you sell your claim? Well you don't need to, because having classed the images as kiddie porn, those cannot be viewed by the general public to pass judgement on your claim. This is why an accusation does the damage here.
"Pornography is legal and used by people for titillation."
Bestial porn? Let's call it "animal abuse" shall we? I'm sure it will be added to the image set at some point. Best to do the marketing now.
Obviously if you're an iPhone user this is a real danger. You need to view your photos through the eyes of the perverted childless Apple employee who might be viewing them, the officer keen to keep this suspicionless search going, and the AI that's been designed to return false positives.
-
-
-
-
Friday 6th August 2021 05:13 GMT DS999
I'll bet this has to do with Section 230
The gist seems to be that it only scans photos when they are uploaded to iCloud (but does the scanning on your phone, before uploading). If you don't use iCloud, no scanning. That makes this pretty clearly targeted at Apple's (and their iCloud partners') potential liability if Section 230 changes are coming - which is likely in some form given that both parties in the US want to see reform. I wonder if they got a heads up from legislators about the content of a forthcoming bill?
From what I saw in other articles, it will keep a count of photos it thinks are potentially child abuse related and only have a human review them if there are a sufficient number. So a few false positives of your kid in the bath hopefully won't be a problem, and in any case Apple is only able to look at what is on iCloud, not what is on your phone itself (well, theoretically they COULD look at what's on your phone, the same way that Microsoft could access everything on your PC since they control the software, but that's not how this appears to work - it relies on Apple's existing ability to decrypt iCloud backups).
-
Friday 6th August 2021 09:49 GMT gandalfcn
Re: I'll bet this has to do with Section 230
Correct "Apple’s tool, called neuralMatch, will scan images before they are uploaded to the company’s iCloud Photos online storage, comparing them against a database of known child abuse imagery."
but the Apple haters and terminally paranoid just knee-jerked rather than using their brains.
-
Friday 6th August 2021 10:34 GMT Androgynous Cupboard
Re: I'll bet this has to do with Section 230
Yes I was wondering that too. Apple will know that no matter how well intentioned this is, it's not the kind of stuff people want running on their phone - everyone else's is fine, of course. So I'd assumed this was more about their liability for storing the content. Suspicion largely confirmed then.
-
-
Friday 6th August 2021 11:28 GMT Lil Endian
Re: I'll bet this has to do with Section 230
Instead of scanning images in the cloud, the system performs on-device matching...
"On-device" is mentioned a dozen times in Apple's own statement.
You clearly have not done any research.
-
-
-
-
Friday 6th August 2021 17:02 GMT DS999
Re: I'll bet this has to do with Section 230
Section 230 is a lot broader than that, and would leave a lot more liability gaps if it was fully repealed as it is the only thing that protects anything done "on a computer". The phone company is not liable if I call up the president and make a death threat, due to a separate law passed many decades ago, before computers even existed. There would be no equivalent law protecting Google if I used Gmail to email him such a threat if section 230 was repealed. Whether the action is public or not has nothing to do with it.
Transmitting child porn across state lines is against federal law. So without some type of liability shield for the carrier (i.e. like the laws holding Fedex and other delivery services not responsible for unknowingly delivering child porn polaroids from one of their customers to another) they would be in violation of the law.
-
-
-
Friday 6th August 2021 05:22 GMT ThunderCougarFalconBird
This can create an unwanted precedent. It is possible to push files to someone's device due to the web caching function all browsers utilize. There's an HTML technique that allows a website to pre-load images in preparation to display in a later page. Or not at all. I was able to push images to people's computers just by getting them to go to a survey page I set up. While they were filling out the survey, I was dumping hundreds of questionable images from the site "Stile project" on to their computers. Then I asked them if they had any NSFW images on their computers, they adamantly said no. I then went to the cache and pulled out all the images I loaded onto their computers.
If you have someone you want to get in trouble, then you can do the same thing with this silly Apple scan. The bad part is that with my method, any human looking at where the image is located (the web cache) would be acutely aware that this was pushed to the device without the user's knowledge... but a machine has no such consideration. Machines just do; they don't think. This can be a real problem.
-
Friday 6th August 2021 10:04 GMT Jason Bloomberg
The bad part is that with my method, any human looking at where the image is located (the web cache) would be acutely aware that this was pushed to the device without the user's knowledge
That's where I hide my dodgy stuff
Just kidding, but any prosecutor worth their salt will be arguing that's exactly what defendants do in order to gain plausible deniability, hoping to keep their guilt from being proved beyond reasonable doubt.
If you are using an 'I didn't put it there' defence in court then it obviously hasn't convinced prosecutors and there's no guarantee it will convince a judge or jury.
-
Friday 6th August 2021 11:42 GMT Irongut
> any human looking at where the image is located (the web cache) would be acutely aware that this was pushed to the device without the user's knowledge...
Any human finding a dodgy image in the web cache should realise that it was cached by the browser while the user was intentionally looking at dodgy websites. Oh dear your hiding place actually incriminates you more.
-
-
-
-
Monday 9th August 2021 10:48 GMT Draco
Re: I'm sure this builds on Apple's robust and secure Face ID tech ...
Here is the "mystery" link without being shortened:
https://www.thesun.co.uk/news/5182512/chinese-users-claim-iphonex-face-recognition-cant-tell-them-apart/
---
You can check a bit.ly url by appending a + to it. This causes bit.ly to show you the original URL and the date it was created.
Mystery URL:
https://bitly.com/3fxRLsr
Mystery URL revealed:
https://bitly.com/3fxRLsr+
But ... don't take my word for it. Create a short URL at bit.ly, copy it, paste it into the address bar, append a +, press enter and see bit.ly reveal the original address and creation date of the short address.
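The same check can be done programmatically, without ever visiting the target page, by reading the shortener's redirect header instead of following it. A standard-library sketch:

```python
import urllib.error
import urllib.request

class _NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # refuse to follow; the 301 then surfaces as an HTTPError

def resolve_short_url(short_url: str) -> str:
    """Return where a shortener points by reading the Location header
    of its redirect response, without loading the destination page."""
    opener = urllib.request.build_opener(_NoRedirect)
    req = urllib.request.Request(short_url, method="HEAD")
    try:
        opener.open(req)
    except urllib.error.HTTPError as e:
        return e.headers.get("Location", "")
    return short_url  # no redirect came back: already a full URL

# e.g. resolve_short_url("https://bit.ly/3fxRLsr") returns the Sun URL above
```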
-
Tuesday 10th August 2021 12:04 GMT Anonymous Coward
Re: I'm sure this builds on Apple's robust and secure Face ID tech ...
Thanks for that, but, ugh, so it was a link to The Sun, that's dodgy content even more vile than I had thought of! ;-)
(Actually, just reading the full link text says enough about what the article is about, that we wouldn't have to sully ourselves by actually following the link, thankfully.)
-
Tuesday 10th August 2021 22:13 GMT dave 76
Re: I'm sure this builds on Apple's robust and secure Face ID tech ...
You can check a bit.ly url by appending a + to it. This causes bit.ly to show you the original URL and the date it was created.
--------------------------------------------------------------------------
It's not worth the effort, I just ignore all bit.ly links and never follow them. If it is important, send me the full link so that I can at least visually verify that it is going to the right site.
-
-
-
-
Friday 6th August 2021 06:44 GMT Dinanziame
How are they training their model??
We know how the hash technique works, and that doesn't require Apple holding a training dataset.
Considering even humans have been known to disagree on what was objectionable or not, I wouldn't trust an ML model to do the job... Especially when false positives have such consequences.
-
Friday 6th August 2021 09:50 GMT gandalfcn
Re: How are they training their model??
"Apple’s tool, called neuralMatch, will scan images before they are uploaded to the company’s iCloud Photos online storage, comparing them against a database of known child abuse imagery. If a strong enough match is flagged, then Apple staff will be able to manually review the reported images, and, if child abuse is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children (NCMEC) notified."
-
Friday 6th August 2021 12:43 GMT Anonymous Coward
Re: How are they training their model??
So Apple staff will trawl through your private photos, violating your privacy, because their software made a false accusation against you to justify it. Potentially false-flagging you as a pedo in their opinion, a life-changingly disastrous consequence.
And when they fail to catch the first pedo with their review? The review will be the first thing ditched. All false positive flags will be passed, 'just in case', and all such customers will be labelled as pedos sans review.
"National Center for Missing and Exploited Children"
But these are preexisting images according to you, not missing kids. You keep changing your justification.
Customers know they're not pedos; they don't know if Apple's AI correctly identifies that, or if Apple's staff will falsely identify a person as underage, or label an image as abuse just to be on the safe side, for Apple corporate policy reasons.
Why would they trust Apple, if Apple does not trust them?
Why would they trust you? You seem to keep changing your claim, and if you do reviews of images, you might be equally as prone to flights of fancy to justify a position you incorrectly took sans evidence.
-
-
-
-
-
Friday 6th August 2021 10:43 GMT Androgynous Cupboard
Re: Be afraid...
The Saudis chopped up Khashoggi, a Washington Post journalist, in their embassy and got in a world of pain for it(*). Seems like a lot of work to me - why not just use NSO's Pegasus to get access to their phone, upload some kiddie porn to it and let the Feds sort him out?
(*) OK, not as much as they should have.
-
-
-
Friday 6th August 2021 07:30 GMT TVC
No doubt systems have improved but..
Quite a few years ago my corporate system was able to scan graphics in email looking for pornographic images, forwarding any it found to my IT team. Apart from the tit and bum photos there were loads of innocent false-alarm pictures of kids in baths etc., taken by parents.
It will only take one false alarm to screw up someone's life or career.
-
Friday 6th August 2021 09:40 GMT Anonymous Coward
Re: It will only take one false alarm to screw up someone's life or career.
It only takes one ALLEGATION on social media to screw up someone's life or career.
...
arguably, I would say: it serves him / her right (all of you, really), for being on 'social media' in the first place, but then, some jobs make it virtually mandatory to be there...
-
Friday 6th August 2021 10:00 GMT Anonymous Coward
Borders & Employers
When entering some countries, a lack of any social media profile is a red flag, and means a long wait and a search.
And without it you won't even get shortlisted, let alone interviewed (although I also know of a successful candidate being excluded when the Chair of the place saw a couple of night-out pics from when they were much younger and decided "that won't do").
Call me Marvin, and yes the diodes do ache.
(black helicopter)
-
Friday 6th August 2021 13:23 GMT Anonymous Coward
Re: It will only take one false alarm to screw up someone's life or career.
"arguably, I would say: it serves him / her right (all of you, really), for being on 'social media' in the first place, but then, some jobs make it virtually mandatory to be there..."
And if you want to see benefit payments keep coming, you *have* to use it as one of your sources for finding work.
-
-
-
Friday 6th August 2021 07:40 GMT fpx
Nothing to Worry About
There is only a low probability of a false positive.
After the SWAT team breaks down your door at 4 am and confiscates all your PCs and phones and other electronics, it will only take them a few months to scan it all. Then you will only have to answer a few curious questions about "can you explain *this* and *that* on your hard drive", even though this and that have nothing to do with the original find.
No problem, that will all clear up after only a few years. You will be unable to work without your gear, and everybody around you will be very suspicious, but that is a small price to pay for society as a whole.
Low probability times a few billion users? Meh.
-
Monday 9th August 2021 02:20 GMT the Jim bloke
Re: Nothing to Worry About
Lets not forget malicious SWATting, where some dipshit makes false 911 calls and armed SWAT teams descend on the victim.
As mentioned earlier, incriminating images can be pushed to a target phone.
The only way this technology would be acceptable is if we could trust both those implementing it and those using it - and that just isn't going to happen.
-
Monday 9th August 2021 11:02 GMT Anonymous Coward
Re: Nothing to Worry About
"After the SWAT team breaks down your door at 4 am and confiscates all your PCs and phones and other electronics, it will only take them a few months to scan it."
At a previous company, we reported receiving dodgy images to our publicly-published email accounts. The police arrived and seized the hard disks from the email server and the computers of the staff with those accounts. They told us we'd get the disks back when they finished investigating.
That was in 2005. We are still waiting.
More worrying on a personal level: in the UK, if the suspect arrested on suspicion of this sort of offence has children, they are not allowed to remain in their own home while the investigation takes place. That puts an extra twist of the knife on delayed conclusion/justice.
-
-
Friday 6th August 2021 10:50 GMT Splurg The Barbarian
Re: Question
No comment on the rights or wrongs of this announcement, but in answer to your question: very easily. The system from the announcement isn't using AI to scan photographs, it is comparing against a national database of known indecent images of children. This has been created over years by uploading hash values of images found by human examiners. The idea behind it is that it limits the amount of exposure examiners have to indecent images depicting the sexual abuse of children, as they only have to deal with previously unseen images or edited versions of previously known images.
This is how it works in the UK, with the UK's image database.
-
-
Friday 6th August 2021 08:20 GMT chivo243
I just scanned my phone
75% of photos on my phone are of switch panels, cables in ceilings, broken connectors etc., about 15% is vacation photos. I saw one with my son and a friend's daughter in the swimming pool. Should I be worried? Should I offload that pic to another non-Apple storage?
What a slippery slope. I'm all for protecting children, but where is the better way?
-
Friday 6th August 2021 08:43 GMT tip pc
Re: I just scanned my phone
“ 75% of photos on my phone are of switch panels, cables in ceilings, broken connectors etc”
We are both obviously deviants!!
Can you imagine the interrogation by someone who doesn’t understand?
I’ve also studied chemistry & physics; someone brought up on MacGyver would have me locked up for a very long time.
I also take photos of wiring harnesses and the insides of machines etc. before disassembly, to ensure they go back the same way. I’ve seen house wiring with connections going to the wrong colours; someone sometime checked and corrected a mistake but left the wrong colour sheath. A quick photo ensures the correct sheath can be put on the correct wire when the time comes to reconnect!
-
Friday 6th August 2021 10:55 GMT Splurg The Barbarian
Re: I just scanned my phone
No you shouldn't, with regard to that specific question. Unless the image matches a hash value of a known image that is stored in the US image database, it won't be flagged up under any circumstances.
Microsoft do, and have done, this on their cloud storage systems for at least a decade, but it isn't really shouted about.
Whether the system will stay at that, and whether any on-device processing happens regardless of the fact that it is only images due to be sent to iCloud, are the main questions I have.
-
-
Friday 6th August 2021 08:22 GMT aerogems
I applaud the effort
The intent is a good one on this, assuming it's not just some fevered rantings of a conspiracy nutter, but once you open that door it will only be a matter of time before it is applied to other things. I also have a pretty big problem with the idea of the assumption of guilt this implies, and the fact that there is surveillance being done on people without any sort of judicial oversight... you know, those pesky warrant things that require showing probable cause, and sets strict limits on what the police can search for. Much easier to just assume everyone is a potential kiddie porn consumer/distributor and go on endless fishing expeditions until you finally find something.
Mushroom cloud because I figure that is about how well this will go over with people if the idea isn't murdered in the womb.
-
Friday 6th August 2021 08:35 GMT tip pc
Everyone is now under suspicion
This is a shock.
For a company who have been holding the candle on privacy for so long, if they now will treat everyone like a potential suspect I no longer want to be funding them.
I also thought iCloud was end to end encrypted. I’m shocked it’s not.
I’m looking at redoing my home server stuff and was looking at the Mac mini. I might still look at it if I can run something other than OSX on it.
-
Friday 6th August 2021 19:01 GMT chivo243
Re: Everyone is now under suspicion
I don't think running macOS will be the problem, don't connect it with your AppleID, never, ever do that... It's that the iPhone is tied to the AppleID.
I am starting to feel as if I've been maneuvered into a safe place that might not be so safe from another perspective.
-
Friday 6th August 2021 21:53 GMT Arkeo
Re: Everyone is now under suspicion
Your reasoning is sound, but practically impossible for the average Joe/Jane, and frankly a pain in the neck even for a skilled user on Droid: you can make a useless, or basically fake, Gmail account just to activate the phone and use, say, mailbox.org or whatever for real email. But then Gmaps (basically the only G-feature I use) would still link your movements to your Gmail account and therefore to your phone number and IMEI...
So we'd still be fsck'd, wouldn't we?
If I'm wrong please correct me...
Cheers
-
-
Friday 6th August 2021 08:45 GMT Pascal Monett
"scanning individual users' iPhones"
I'm sorry, on what authority ?
Has Apple been integrated into a special Police branch ?
What right does Apple have to scan individual users' private property and report the results ?
Another case of a tech giant making social and police decisions on its own, without any mandate to do so.
I was never interested in Apple gear.
Now Apple is on my blacklist.
-
Friday 6th August 2021 14:09 GMT SImon Hobson
Re: "scanning individual users' iPhones"
They just have to put it in the "no-one has a couple of days spare to read it all" agreement you have to sign before any modern stuff works, and it becomes legal - as in "we can do it, we asked for permission (on page 273 of 425 pages) and you said yes".
https://www.onelegal.com/blog/fantastic-clauses-hidden-in-contracts-and-eulas/
-
-
Friday 6th August 2021 08:53 GMT tip pc
iCloud private relay? Can we trust anything fruity at all?
What now for iCloud private relay?
What about iMessage?
What about Face ID?
The Photos app has been scanning faces in photos for years; will it upload those recognition hashes to the NSA now?
Will I get my door smashed in because an algorithm got it wrong & the humans in the mix needed to make their targets?
Do I have to now roll my own everything to ensure I retain privacy?
I don’t use TOR as I see that as joining in with unsavoury types, but this move is pushing more people in that direction whether they like it or not.
Unintended consequences, a little like banning the sale of cigarettes: normal people who never touched illegal stuff will suddenly be more likely to interact with drug dealers to keep their habit going.
-
Friday 6th August 2021 08:57 GMT mark l 2
Typical 'won't someone think of the children' response to give yourself permission to search through tens of millions of innocent users' photos to find a handful of law-breaking people. Of course if you object to it you are siding with the pedos. Yet no doubt those who did use their iPhone to store illegal images will now stop using an iPhone and switch to Android, since they know the scanning is occurring.
It reminds me of back in the pre digital camera days where people would get the plod knocking on their door after the photo processing company reported the photos of their kids naked in the bath to the police as kiddypron.
This is just another way of showing that despite you spending a grand on your new iPhone, it's NOT your phone: it belongs to Apple and they can decide what you do with it.
-
Friday 6th August 2021 09:03 GMT Anonymous Coward
I'm going to repeat a comment I found in the Washington Post that pretty much says it all:
"The first problem with what Apple proposes is that it cannot be performed legally in Europe, and, in some parts of the world, accessing someone's content without their explicit permission even carries sentences that come with a mandatory stretch in jail - it can only ever be done by local police, controlled by an investigating judge.
The second problem is that it puts Apple in jeopardy when they miss something and can thus get sued by the victim for not acting.
The third and most important problem is that it creates a backdoor for abuse by other entities, almost as bad as the idiotic idea to weaken encryption that shows up every seven years or so. Until now, Apple was seen as the safest device to use in an online age full of hackers and ransomware criminals, and that status took years of doing the right thing to achieve.
Apple just undid all of that with a single announcement that practically every lawyer with an ounce of common sense would have warned them against.
Don't be the police. That's what the police are for."
Isn't this called "pulling a Ratners"? If not, it should be. It's moronic.
-
Friday 6th August 2021 09:43 GMT Anonymous Coward
re. accessing someone's content without their explicit permission even carries sentences
but this would be with their EXPLICIT consent, first time they open their new iphone and click on that big, green, juicy button that says: 'AGREE!'* (yesyesyesgimmegimmeshinyshinynownowNOW!!!!)
*to everything
-
-
Friday 6th August 2021 09:10 GMT Anonymous Coward
That's instant jail for whoever tries that in Switzerland
As far as I know, accessing someone's content without their explicit approval carries not just a fine, but a mandatory jail sentence there.
The only time you get to access someone's content as a provider is under court order, and even then the extract goes to a very small set of police people and an investigating judge, who then assess if there is a crime in progress.
Sure, they can try this on Americans, because at Federal level they have so many laws breaking privacy that there is probably a fully legal path to do so, but in the GDPR zone I can't see this one flying either.
-
Friday 6th August 2021 14:14 GMT SImon Hobson
Re: That's instant jail for whoever tries that in Switzerland
But didn't you read that gazillion-page-long licence agreement before clicking "I've read and accept it"? Somewhere it'll ask for permission, and you'll have explicitly given them permission to do this. So potentially completely legal under GDPR.
I say "potentially" because GDPR also prohibits burying stuff like this in long agreements, and also prohibits making such acceptance a requirement where it's not actually required for the product or service to work. Look up how long Max Schrems has been going at FaecesBorg for - and that's probably how long you can wait for any practical enforcement action.
-
Friday 6th August 2021 18:21 GMT Fred Flintstone
Re: That's instant jail for whoever tries that in Switzerland
So, time, once again, for that excellent Freefall cartoon.
Enjoy.
-
Friday 6th August 2021 18:31 GMT Anonymous Coward
Re: That's instant jail for whoever tries that in Switzerland
.. and that's not even mentioning that newer "legitimate interest" permission BS which can only have come about by some serious bribing, er, lobbying. It basically doubles the amount of shit you have to opt out of to ensure you have at least a legal basis to go after them, with bucketloads of deceptive design and deliberately misleading labelling to make sure you then still choose to allow it all.
I'm generally against violence, but I've arrived at the point where I'm convinced that fines no longer have any impact, and percussive education may have to be made mandatory to stop the tide.
In this context, a certain car brand which ends in "edes" will get it in the neck soon. I've been trying to unsubscribe from their systems since 2019 and complaints have not helped, so now I'm about to have some fun with them at European level for multiple violations.
I'm through with being nice or gentle, that has yielded zero results.
-
-
-
Friday 6th August 2021 09:11 GMT bellcore
Phantom enemy
This is just like the battle against E2EE, it's always "For Ze Kinder", yet the reality is that it's hardly ever used in child abuse cases. In Germany, child abuse cases account for less than 1% of monitoring orders. It's used for drug offenses. They don't care about child abuse, they care about their authority.
https://tutanota.com/blog/posts/why-eprivacy-derogation-bad-idea/
-
Friday 6th August 2021 09:13 GMT Lil Endian
Exfiltration of Imagery?
So, Mr & Mrs A have legitimate and legal imagery of their children (eg. at play in the bath).
The only difference (arguably) between this and illegal content is in the motivation in taking, and usage of, the images.
To all intents and purposes this would be a positive hit for the ML system. A human will be required to assess the difference moving forwards. So now the images Mr & Mrs A assumed were private have been observed by someone they really didn't want to see them. Without a warrant? With what amount of training?
Does the imagery then get uploaded to the ML system in some way to improve future operations? Well, that'd be a bit illegal.
We all agree no connected system is unhackable. How long before that cache is exfiltrated maliciously? If indeed Mr & Mrs A's imagery was stored, it's now in the wild and they'd have every right to burn the morons that facilitated that. (I know that's a big "if".)
The difference between the context of use with a given image (family pic vs child abuse) in some cases is purely in the eye of the beholder. Certainly not for an ML system to differentiate, and I really, really would not want to be a human viewing images to distinguish the difference.
-
Friday 6th August 2021 10:32 GMT Anonymous Coward
Re: Exfiltration of Imagery?
Maybe always ensure your kids are fully dressed in your iPhone photos, just in case.
I'm sure the nice officers poring over your private family photos flagged by this AI have no malice. They are the thin blue line between Good and Black, Wright and Wong.
But just in case.
@Hackable... well NSO (an Israeli military-intelligence-derived hacking group) hack iPhones with their Pegasus software, so now they can also get journalists and politicians arrested on Apple autopilot. But that tool is only used with Israeli government approval, so you're safe.... you're not one of these 'pro-Palestine' people, right? Good.
-
-
Friday 6th August 2021 09:18 GMT Anonymous Coward
I already know of a deliberate miscarriage of justice
- Physical security guy helps a female escapee of a Middle East family settle in the UK
- A few weeks later, a break in in the office. Nothing is missing
- A few days afterwards, police get a tip on child pornography
- Office is raided, office Mac (used by everyone) is taken
- Technical "expert" (outsourced contractor who has only ever touched Windows) finds a pic (yes, one) in an iTunes backup
- Police only see statistics, so the chap gets convicted for child porn. From a tech perspective there was so much reasonable doubt it should not have even made it to court (I reviewed the files and am about to hand this off to some human rights people who may be able to act).
He lost his livelihood, and could not even see his own kid without being accompanied as a result of a revenge action, eagerly assisted by the local constabulary who could not spell IT without having to look it up.
Apple is about to offer mechanisms to make that a lot easier. Well done.
-
Friday 6th August 2021 09:30 GMT TRT
Shock horror... they are already doing something like this and have been for years.
Every now and again my iPhone pops up a little message that I have a new memory (let's get this straight right here and now - I loathe this "feature" and am a little bit disgusted that it's not something I can turn off - I live in hope that they'll give it a toggle switch).
The bloody thing has been through all my photos whilst I'm not using it and has labelled all the photos of my cat (this HAS to affect battery life - there are many thousands of these!), decided that I went on holiday with the kids during these dates and collated all those together, and recognised that a big bunch of pics was taken at work on the same day and thought it must be an important event (network cabinet inspection prior to a tidying up session).
The next step on this path is somewhat creepy, however... identifying potential kiddy pr0n, hashing it, and then sending that hash to be... what? compared to a hash database of known imagery? Given to the Feds along with my phone number?
-
Friday 6th August 2021 09:43 GMT Anonymous Coward
Re: Shock horror... they are already doing something like this and have been for years.
Yes, I fully agree with you: I do not appreciate it analysing my pictures either, and there's no way to kill that off despite that being in principle a privacy problem. The MacOS photos application also does this.
-
-
Friday 6th August 2021 09:42 GMT cupplesey
Apple's ad campaign: 'What happens on your iPhone stays on your iPhone'..... so Apple lied then?
What happens when they inevitably get it wrong? Can you sue them for libel or false accusations? Of course I don't condone illegal content, but it's the thin end of the wedge for big brother/NSA-level control and monitoring.
Doesn't this also violate the US constitution? What about other countries' citizens, are they also being watched but not informed, just like with the NSA?
-
Friday 6th August 2021 09:45 GMT gandalfcn
"Apple’s tool, called neuralMatch, will scan images before they are uploaded to the company’s iCloud Photos online storage"
Or
"The neural network-based tool will scan individual users' iDevices for child sexual abuse material (CSAM), respected cryptography professor Matthew Green told The Register today."
-
-
Saturday 7th August 2021 18:14 GMT gandalfcn
"What’s the difference?" So you are saying the cloud is the same as a personal device, correct?
"The bad bit is someone has determined that all your photos and documents need to be checked for CSAM, because YOU are likely to have that stuff in YOUR collection."
Do you upload everything to the cloud? I don't, because I don't give a stuff.
Did you bother to read what the process actually is? Obviously not.
What is extremely sad is that all the self-proclaimed IT experts here don't seem to understand the difference. The same with a few other tech things. They ignore facts and abuse anyone outside their bigoted orthodoxy.
-
Sunday 8th August 2021 21:49 GMT tip pc
“Do you upload everything to the cloud? I don't, because I don't give a stuff.
Did you bother to read what the process actually is? Obviously not.”
I had enabled iCloud photos on all my devices.
I have 21 years of digital photos, ~200GB; my partner has ~700GB. I have local backups, but iCloud made it easy to have a cloud backup and also meant every photo was available even on devices without the storage space.
iCloud photo sync also ensured photos appear on my Mac without having to actually sync the phone.
Needing 2TB of iCloud ensures all phones, tablets & Macs are fully backed up in iCloud too.
I trusted apple with my privacy and felt my data was safe with them.
Not anymore
-
-
-
-
-
Friday 6th August 2021 10:57 GMT Anonymous Coward
On the one hand you accuse people of disliking this because they are "Apple haters", then you claim it is or will be done by everyone else (are they also Apple haters?). In other comments you talk about your love of taking photos of naked kids in the bath, and your confidence in the AI's ability to not flag you as a pedo.
You're really all over the shop here.
I get you want to deflect this, but you clearly don't understand what AI is or how people don't want AI flagging them as pedos. Even Apple loving customers.
-
Friday 6th August 2021 09:55 GMT This is not a drill
Remember PHORM
Phorm was being touted by BT, TalkTalk, etc as a way of protecting users from nasties on the Internet.
It was absolutely not about monitoring what everybody was doing so that you could sell the data and 'tailor' a user's internet experience based on whoever was paying the most to push their products.
Apple won't be happy until they can control everything you can do and see on your iCrap device. I've never owned an Apple product in my life, never will, and the work iTurd forced on me is only used to read work emails, nothing personal.
And yes I know that Google, Facebook and the telcos can and do monitor everything, but at least they don't pretend that it's for your benefit.
-
Friday 6th August 2021 10:10 GMT Sgt_Oddball
How the hell...
Does this AI figure out the user's 'intent'? I mean, yes, some content is obviously vile and should be treated as such, but what of a user taking a photo on a beach where a naked toddler, refusing to be restrained by a 'bathing suit', happens to be running through the background? Or if the kids are being cute in the bath so you take a family pic? What of a group of children dancing in the back of a camper van at a communal meet-up, when one of them decides his clothes aren't for him (I dare not guess the reasons why)?
As always with these things I suspect nuance will be lost (probably because the devs are looking for it on a map of France) </sarcasm>
-
-
Friday 6th August 2021 10:48 GMT TRT
Re: How the hell...
My understanding from the much more in-depth Reg article is that they're using the AI-like technology that already goes through photos on your phone or on your MacOS device (OK, the ones in Photos anyway - there's no law that says you HAVE to store those files in Photos.app) to spot potentially dodgy images, and create a hashed version in some form that still allows comparison after minor edits are done, then send the hash off to Apple for checking against a database of known dodgy images that are in circulation, and then... what? If it finds a match, shop you to the feds? Grab all your phone history and contacts and the people you've circulated the image to or received the images from, and thus profile a paedophile ring? Those details aren't clear!
They could of course do the same for any criminal activity... it's just easier to justify piloting it with kiddie porn. Photos of stolen cars circulated amongst gang members looking to offload a hot motor? Farm equipment is a hot crime at the moment. Sexual abuse of adults? Revenge Porn? Reconnaissance photos of banks, jewellers, wealthy domiciles, industrial premises with valuable IP sent around gang members? Beatings given to transgressors of Gang Rules? I believe these are often shared.
I'm not saying if it's right or wrong, it just appears to me to be the thin end of a wedge.
-
-
Friday 6th August 2021 13:14 GMT TRT
Re: How the hell...
No. Not necessarily. I meant what I said.
1) There's no need. Photos works with or without an iCloud account.
2) The phone "backup" to iCloud is supposed to be encrypted with a device or account specific key
3) Who provides the processing power to analyse all of this stuff? Why not distributed computing? Though there are better tasks I could think of for a semi-asleep iPhone to be working on - how many millions of iPhones are there on the planet now?
4) The iPhone itself is signed in and active and the data are "unlocked" when the phone is on, supposedly "secured at rest", so you can't just nab the flash storage or an image of it and then use it on another device with a different CPU, etc. They've got to justify all that "oh, only we can repair it with our parts for your safety" crap.
-
-
-
Friday 6th August 2021 14:03 GMT TRT
Re: Who has the right to check on your device without a warrant.
I wondered about that bit... I mean the technology to do this I think is already embedded into Photos, and that can work either with or without iCloud, so WHY do Apple specify that it's the photos destined for iCloud that are scanned (on device I hasten to add!) and given a metadata ticket? Is it perhaps that they operate under the flag of "it's actually heading out to OUR infrastructure, and so we are obliged to protect ourselves within reason from accusations of being a haven for illegal content"? It's not that it's checking YOUR device, per se, it's checking the data heading FROM your device TO their device.
Oh, and I know that Apple have separate processes that do the actual uploading and downloading between Photos and iCloud - at least on MacOS... the background task is forever going wrong on my laptop - the fans will come on hurricane force during the night sometimes when it gets its NICs in a twist.
Hm... I'm sure the lawyers have checked on this. Apple have quite a few of those, I hear.
-
Friday 6th August 2021 19:06 GMT Lil Endian
Re: Who has the right to check on your device without a warrant.
Good take on it TRT.
So rather than a misplaced attempt at "civil duty" it's misplaced self-preservation, as Apple wouldn't be liable (if at all) until the illegal content hit their platforms. If they are indeed launching a pre-emptive strike (on a yet-to-happen transfer) they're well overstepping their authority. It's tantamount to having a law officer observing your every move, including in your home, "just in case". So yep, Apple are trying to be a corporate version of a police state.
-
Friday 6th August 2021 19:46 GMT TRT
Re: Who has the right to check on your device without a warrant.
Well.. what they describe as happening is not so much that they prevent dubious content from hitting their server at all, but that they flag (because it's not a perfect matching process by a long chalk) that certain individual items MAY be dubious; and if any one user/device accumulates enough flags, then the balloon goes up, and they go into responsible self-preservation mode and say "Hey, coppers! We think this person might be putting us in a bad legal place... so can you take this further, please? Have a look into it?" Then their big, red, shiny, corporate bottoms are covered.
-
-
-
-
-
-
-
Friday 6th August 2021 18:24 GMT tip pc
Re: Trust
I've been buying Apple crap as my main stuff since 1991. I've trusted them for 3 decades, sending tens of thousands their way in the process.
They had my trust.
They don't anymore.
I don't know who to trust now.
It does look like an update to the forthcoming OSes is needed for these "features" to work.
So for now I need to roll back to the current releases and not update.
-
-
-
Friday 6th August 2021 11:56 GMT Lil Endian
Follow Your Own Criticism
[This is largely a copy of my post on page 2, to which you [gandalfcn] have not responded.]
Instead of scanning images in the cloud, the system performs on-device matching...
"On-device" is mentioned a dozen times in Apple's own statement.
You clearly have not done any research.
Why do you persist in erroneously attempting to correct posters?
-
Friday 6th August 2021 13:33 GMT TRT
Re: Follow Your Own Criticism
Ah... yes, thanks for the link. Interesting... though I'm curious as to how they know that they can combine thresholding with CSAM ticketing and put a figure on "false positives" like "one in one trillion chance per year of incorrectly flagging a given account".
Almost sounds like they've been trying it out with test data taken from something like memes circulating via WhatsApp (auto-add images to Photos), or Photos shared publicly on Facebook or something.
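TRT's question about where a "one in one trillion per year" figure could come from has straightforward (if idealised) arithmetic behind it: assuming each image independently false-matches with some probability, the chance an account ever crosses a review threshold is a binomial tail. All numbers below are invented for illustration; Apple published none of the inputs:

```python
from math import comb

def flag_probability(p: float, n: int, t: int) -> float:
    """P(at least t false matches among n independent images):
    the binomial tail, sum over k >= t of C(n,k) * p^k * (1-p)^(n-k).
    Terms shrink so fast that summing a couple hundred is plenty."""
    return sum(
        comb(n, k) * p**k * (1 - p) ** (n - k)
        for k in range(t, min(n, t + 200) + 1)
    )

# Invented inputs: per-image false-match rate of one in a million,
# 10,000 photos uploaded a year, review threshold of 10 matches.
print(flag_probability(1e-6, 10_000, 10))  # ~3e-27, i.e. effectively never
```

The headline number is only as good as the independence assumption and the per-image rate fed into it, which is presumably what TRT is driving at.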
-
-
Friday 6th August 2021 19:49 GMT TRT
Re: Follow Your Own Criticism
Yeah, but something about the way you put it made me actually want to read it whereas I was put off earlier by the potential of being faced with the usual reams and reams of Apple legalese and technobabble, but it was actually pitched just right - very understandable by the average reader.
-
-
-
Saturday 7th August 2021 18:15 GMT gandalfcn
Re: Follow Your Own Criticism
I didn't respond because I didn't see it. OK. You also didn't seem to have actually read and understood what you cited.
"The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.
Next, iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos."
Entirely separate things.
You're welcome.
-
Sunday 8th August 2021 16:01 GMT Lil Endian
Re: Follow Your Own Criticism
I didn't respond because I didn't see it.
Understandable. Thanks for responding.
1. You've been claiming and reclaiming that others are wrong because they're conflating cloud & device, restating "...in the cloud...". I've pointed out that on-device is correct.
2. You've not refuted my statement. You've pulled a different quote for your own purpose. Which fails as it still does not qualify your earlier false corrections of others' statements re: on-device.
You're welcome.
Your attempt at patronising me gains my sympathy.
-
-
-
-
Friday 6th August 2021 11:13 GMT Tessier-Ashpool
Appeasement
There are two types of people who complain about encryption: those who think of the children, and those who think of the terrorists.
I imagine Apple is doing this to appease dumb legislators who want an end to encryption. But that would only appease half the complainants at most. Scanning for terrorist content would be irresistible to governments with this kind of technology in place. That would be next on the list for sure.
A very slippery slope.
-
Friday 6th August 2021 12:53 GMT Anonymous Coward
Dream on......
@Tessier-Ashpool
Quote: "....who want an end to encryption....."
*
So....books like Bruce Schneier's "Applied Cryptography" need to be banned, and all the (thousands of) copies burned in the public square!!!
*
Banned books!.......book burning!........encryption is out of the control of politicians, civil servants, and assorted (private) control freaks.
*
......dream on.......the encryption genie is long gone from that once hidden magic lamp!
-
-
Friday 6th August 2021 11:41 GMT Anonymous Coward
How good is Apple AI at dealing with base64 and IDEA?
This Apple initiative is simple MISDIRECTION! The masses will "think of the children" and the bad guys will think of a way of avoiding the scrutiny (see below).
*
Is it a JPG, a TIFF or a RAW? Maybe it's a recipe for Black Forest gateau? What key was used for the IDEA encryption? Apple's AI might have a few problems!
*
KUTktHLwrCNGmD2/gUDz8dqm0fNyVWbHjLE6oCl7UJEVBEUWFmHAm3qhzEK+B9juexE5aZHBFfh4
7qyZm4ABQ0T+13gzTh8cg4KlAwdDK5VNyDR23XuKsbG27cvVr0wQZR37AaBeRrSeG4Pe5KMY0aI3
D2mEcRXEk0JQ8ImpeEMJ1XtLEz7ey0dnarktOemDWSaaa4iG2mQ0GmltYQ0puneMmaWnfBaCP8m0
RShGRkkW05hCiXHga6qg2k0pF13kHUqApeoUPj55rrJOOWAfcXhlv75bd0KfKhkdc6weCvwKyoyx
JjcPe3EhDy0yZdyufuNakKho8JcBiMrpbFBxmmbl1rHpwhnnNRegf7oOGpVP+3iaN2RzryS9qAD+
iB7kZIUZ6Yn+g8G23xMmHkXLs2Kiseq9/ry5vraz0wITznmlnOLZM2brr/J174i0oLkwje0ppg/w
55HfHRDXtL8bAvR2ecFia9z9wdZW0/RYqHLhOoWMIbzUBBaEl3VMCbsJT2N2xhWgKwi3iBybYRrE
b9vDOSroeN6bbp640FDEoCIPJeIUCTi2O6DjftXImZvQ0MoKxOwlfpc388vb6vumjLoFcbOPpXa4
OABh7Nq2nCX3A24ySiTBjofGwufxaOaorxFHLGFCjFGH0FnQH4KaLkHVTnfwkrcdJHRl5SBWF/W1
/YwV3skJJl9YNEQ503e4awnc3GVwyo+WE0jM/imgslt6W2WvT8MHWElHwcBxw01pqz1OGwWvaBsk
14bwjivum/bS7+8nso+MYKESbPVRz1K+GQP8aeJAww6dpisq6cJSMph2jxAyb6ke1P4gDChkVRTw
VN3Qx/7OkippTDSLtbpYyqpPcRxRowxibfXzGuUqZca25CAplhpKCsCM9DRKzUIvkIEVfYFF0Llu
Rl4JtVU/OUrHIXBtLY8lPW3cjKZ1M2ajVP1YCN80fkwx4PZuKXXYmmfEYi6HapPJ2rE3o5kGaXYY
OrBefEw0529xzJ8R5ddFyYHffBlYDnJr092tzAFIfch//T/s3ljslQ2V+K73EQ8n8LKiUZZpERZz
hgyfCQfT7s7ATkiTfwIIeFi4Elynea5esT9LBlk1lkNjjNXHXZKdxGSGl/uTt9xV/PlWaHOkFhOI
BDMQRKzED0MJmuwVb5bS/vJGu37xaeyYG9PU7rVGiSfGFsWHrklpLkFFWIxYpQtUKom2oTekV2XP
4+dmsieXEjXt3H7jN6PCFG1CFm6IUFS4Ok8zRhxDvXn7c1FR2Nd+v+fwO5oU4MjTZpg/dvpAUzIl
HnJp9dWGotkGqLPL9dg76vm9he+Emc0mybM9JyNO88jfcYXQcg3qM0GFlDEkMe7cDUtczNcFzSDz
YDV8Y0Lj4bJNjpPvhv4KeZ8De6L1eOy5wPjF2rh53F8DBhQ8bdFPm6qNjYaQ4fO/lpK1Rv0iGXWc
XA6KMypW4zYoDlVekt1y7lKIwk6yMJhlTRiYzCW1hn15Wou9BCtX4eYIJwOhSshOQKMbDzKRZSYv
ToGWMolwKvHVOEUJ1QvjoGS6rOQS45c+71wC45luYyj3zqB2zl4fgl9hDgkg5r12E9y63pbfYmeN
4SLTil1Y3PYVm41fbEH7cq9BVSB0hGl5nh+Xg0N7TePCkPF8RZeKU7w0/GZ39Sm63AGIYUlnZCyY
RcLEZYn1MGUB+WQOZnJT0AhdbeXBrglC2Cr9kSBZCCKNrQbxFy8GDeH69oV31x57ayl5mjqEQGuR
SV1DXpaz2CGW32m/mfMDLMSC3PAvOJYj8qZ8dp5ELsUZKJ6o5P2prA0T9ckNI+b7gTaK5K7kyDPd
xlZKD9z5Z/c=
*
Let us know when you know what's in this example!
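For the curious, a blob like that takes only a few lines to produce. IDEA isn't in the common Python crypto libraries, so this sketch substitutes AES (Fernet, from the cryptography package); the point is identical: the scanner sees only noise.

import base64
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # stays with the sender; Apple never sees it
f = Fernet(key)                  # AES-based authenticated encryption

image_bytes = b"\xff\xd8\xff\xe0 ...pretend this is a real JPEG..."   # stand-in
blob = f.encrypt(image_bytes)

armoured = base64.b64encode(blob).decode()   # ASCII armour, as above
print(armoured)

# Without `key`, no perceptual hash or neural classifier can distinguish
# this from random bytes; only the intended recipient can recover the image.
assert f.decrypt(base64.b64decode(armoured)) == image_bytes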
-
Friday 6th August 2021 11:58 GMT bronskimac
Fourth Amendment?
I'm pretty sure the US courts would view this as a breach of the Fourth Amendment of the Constitution of the United States of America. There needs to be "probable cause" to carry out any search, and I don't see "You've got a phone", without any other evidence, as probable cause to search it. "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized."
Of course in the UK our Members of Parliament (MPs) will love it and rush through any changes to legislation needed to make it happen, whilst continuing to exclude themselves from any such searches.
-
Friday 6th August 2021 17:47 GMT Irony Deficient
Re: Fourth Amendment?
There needs to be “probable cause” to carry out any search.
Note that the Fourth Amendment only constrains searches by the government — see the Supreme Court’s majority opinion in United States v. Jacobsen :
The first Clause of the Fourth Amendment provides that the
“right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated. . . .”
This text protects two types of expectations, one involving “searches,” the other “seizures.” A “search” occurs when an expectation of privacy that society is prepared to consider reasonable is infringed. A “seizure” of property occurs when there is some meaningful interference with an individual’s possessory interests in that property. This Court has also consistently construed this protection as proscribing only governmental action; it is wholly inapplicable
“to a search or seizure, even an unreasonable one, effected by a private individual not acting as an agent of the Government or with the participation or knowledge of any governmental official.”
A legal defense against non-governmental searches by individuals or organizations would need to be based on other areas of the law, e.g. on trespass or online privacy legislation.
-
Friday 6th August 2021 20:41 GMT Arkeo
Re: Fourth Amendment?
Shouldn't that "private individual" also abide by the US Constitution? After all, it's *the* fundamental law by definition; every State or Territory wishing to join the nascent US had to sign and accept it, did it not? If only a Federal agency must abide by the Constitution, wouldn't that become an extremely dangerous loophole or precedent? If it's valid for the 4th, why not the 1st, why not the 13th?
-
Saturday 7th August 2021 18:13 GMT Irony Deficient
Re: Fourth Amendment?
No one stated that a private individual should not abide by the US constitution. If you read the US constitution, you will find that much of it does not directly affect private individuals; Article I. primarily deals with the powers of and limitations on Congress, Article II. primarily with the powers of and limitations on the President and Vice-President, Article III. ditto with the Supreme Court and its inferior Federal courts, &c.
New states had to accept the US constitution, but I don’t know if “signing” it was part of that acceptance. New territories were creatures of Congress that were organized as such only once controlled by the US, so any acceptance in their case was performed by Congress.
The Supreme Court did not state that only Federal agencies had to abide by the constitution. As was quoted in the case above, the opinion stated that the first clause of the fourth amendment to the constitution only applied to the government (and you could follow the links to past cases within the case link above to find the opinions that served as precedents); that case made no other determination on any other part of the constitution.
Regarding your second question, unlike the first clause of the fourth amendment, the first amendment explicitly constrains Congress, and the thirteenth amendment still allows slavery and involuntary servitude as punishment for crimes, and explicitly gave Congress the power to enforce the amendment through legislation.
-
-
-
Friday 6th August 2021 12:09 GMT Anonymous Coward
Thought Crimes
I am having to post this with a disposable account as in this day and age I am basically guilty of heresy for speaking out against the insanity of our times.
Nobody should be prosecuted for possessing or viewing any text, image or audio recording unless they created it themselves by abusing someone. That is a fundamental principle of a free society, acknowledged for decades if not centuries. By all means prosecute the distribution of such material, but prosecuting possession returns us to the Middle Ages, hunting witches again.
It is only because of radical feminists and other moralists creating moral panics in the 1970s and 1980s that we are faced with the present situation in the 21st century, which is extremely dangerous in a highly interconnected society such as ours, where it is very difficult if not impossible to stop people from unintentionally coming across such material.
We might as well have a nuclear reactor in our own homes: yes, it's very useful and provides lots of free electricity, but one day it can melt down and destroy the entire family. Do you see the analogy I'm making? Because the Internet is just as dangerous. The penalties for possession of such material and being put on the sex offenders register are so horrific, it's totally unreal. It's like a real-life nightmare. I cannot believe I'm typing this here in 2021. What has happened to our country?
We criticize dangerous products that burn your house down or electrocute you, so why can't we criticize the dangers of the Internet and all the ridiculous draconian laws involving it? It is as if the law itself, and the crazy way it's made, is beyond discussion.
Nobody should have to fear their own computers (unless they are doing major hacking, fraud, sending death threats, etc.). This simply cannot be happening in a free society. It looks like we are no better than China; it's just over different stuff here in the West.
-
Friday 6th August 2021 12:51 GMT Lil Endian
Re: Thought Crimes
AC: Nobody should be prosecuted for possessing or viewing any text, image or audio recording, unless they created it themselves by abusing someone.
I see no reason for any "man (or woman) on the street" to hold child porn legitimately. I genuinely feel for those who must view such material as part of their work (the judiciary springs to mind).
Obviously, delimiting those with the right to data retention from those without creates the opportunity for grey areas, but that's handled already. Where the right/position is abused, justice must be enforced vehemently. The controls must be stringent.
Carte blanche "it's ok, I didn't do it"? What? Mr X gives pics of his own kids to Mr Y, and vice versa. That is fine with you?
Edit: I certainly do not consider Apple as being in the group with a possible mandate to retain these images in any format, or to "investigate" anyone.
-
Sunday 8th August 2021 06:35 GMT Anonymous Coward
Re: Thought Crimes
@"I see no reason for any "man (or woman) on the street" holding child porn legitimately. "
They aren't. You have no evidence otherwise. You falsely claimed they do, in your comment, to justify a speculative search of their private media with your AI, without a single shred of evidence.
I bet power-trip officers and spooks are salivating at this. What they're doing here "FOR THE CHILDREN" is establishing the right to search people's digital media WITHOUT SUSPICION, in bulk, preemptively, against their own search set.
[1] I noticed you did not say "everyone/anyone", you said "man on the street", so I'm curious who you think has a legitimate reason that caused you to prefilter there.
[2] You're not even talking about child porn, are you? You're talking about an AI's APPROXIMATE and SKEWED scoring of images, trained on a set CLAIMED to be child porn!
A speculative AI model based on a training set provided by the searcher! So not even "FOR THE CHILDREN": for an age-estimation / sexual-activity-estimation algorithm, fed a category representing 0.00000001% of all images in real life yet trained on it as if it were 50% of images. A skewed training set designed to give a lot of false positives, where it would return next to zero results if trained on a representative set of all images.
Apple just threw away their customers' privacy rights as a fundamental legal principle here.
-
Sunday 8th August 2021 16:41 GMT Lil Endian
Re: Thought Crimes
I'm not sure if you're the original AC, so to save confusion I'll call you.... Bob.
Hello Bob,
[@] I was responding to AC's statement/scenario, that's obvious. I'm not speculating or justifying anything. It's quite clear what I was saying, I'm not sure how you skewed it unintentionally.
[1a] Answered in my previous comment: ...those that must view such material as a part of their work...
[2a] Bob: You're not even talking about child porn are you... To clarify: I'm talking about legal jurisdiction and unmandated warrantless searches of private property. The justification (illegal content) and the method (ML) are totally irrelevant.
[2b] Bob: "...a set CLAIMED to be child porn!" Apple: ...the system performs on-device matching using a database of known CSAM image[s]...
[3a] AC: "Nobody should be prosecuted for possessing or viewing any text, image or audio recording, unless they created it themselves by abusing someone." In your country maybe, but in mine retention of child porn is criminal (which is totally supported by me).
[4a] @Bob/AC: You didn't respond to the Mr X/Y scenario, so is that fine with you?
-
-
-
This post has been deleted by its author
-
-
Friday 6th August 2021 13:06 GMT Rtbcomp
Big Brother is Wrongly Accusing You.
My biggest concern is reliability. We keep hearing stories about bank accounts of innocent people being summarily suspended or closed because a computer suspected fraud; how long before people are accused of being paedophiles because some software says so?
I've just tried to sell a Monopoly game on eBay, only to have the listing banned because it contains the word "Monopoly", which according to eBay's computer means I'm selling some sort of gambling product. I got round it by spelling "Monopoly" backwards. It seems to have let thousands of similar listings through, though.
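The sort of naive filter that produces that result is a one-liner, and just as easy to evade. A toy version in Python (the ban list is invented):

BANNED_TERMS = {"monopoly", "casino", "lottery"}   # imagined "gambling" keywords

def listing_allowed(title: str) -> bool:
    return not any(term in title.lower().split() for term in BANNED_TERMS)

print(listing_allowed("Vintage Monopoly board game"))   # False: blocked
print(listing_allowed("Vintage Yloponom board game"))   # True: sails through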
-
-
Friday 6th August 2021 19:23 GMT Lil Endian
Re: Monkeys
Nice :)
"See no evil. Hear no evil. Speak no evil." is the 'modern' version, I'm guessing sanitised by the Victorians. AFAIK the original ended with "Do no evil." which had the monkey covering his nuts.
Interesting in that "doing evil" is the basis of law yet it's omitted in the phrase. "Speak no evil" is covered by law, and now the law is moving into the other areas.
[WTB a monkey-covering-nuts icon!]
Edit: I was wrong about the Victorians, seems it's just a variation (Three Wise Monkeys)
-
-
-
Saturday 7th August 2021 00:12 GMT Fruit and Nutcase
Re: Sure it's to go after the worst of the worst now....
Microsoft Office and Analytics just need a few tweaks to do it on the desktop - if the capability is not there already
https://www.microsoft.com/en-gb/microsoft-365/business/myanalytics-personal-analytics
https://docs.microsoft.com/en-us/workplace-analytics/myanalytics/mya-landing-page
-
-
Friday 6th August 2021 15:08 GMT Buttons
Global Vampires
I don't think many people will doubt that technology can be used for nefarious purposes, whatever the original intention. There are no morals when it comes to the deployment of IT and the use of data; it's business. These people are not our friends.
Scanning on device content and matching it against a set of data defined by Individuals/Groups who will undoubtedly have a view, even if they attempt to be neutral in some way, will lead to errors in the results. I expect the technology has improved, but I'm thinking of the Met Police's attempts at rolling out face recognition in London. There were many false positives, AIUI, and it had real problems with people who were not born with pale skin.
While Apple just wants to ensure that we're not erring and therefore not a danger to society, it is a model which can be quickly expanded, to include other services, onto the devices that we so happily buy to track our activities. I think a few people have already mentioned ways in which Apple can 'improve' their service, and I'm sure they're right. After all, a gun is a useless bit of metal until someone adds bullets, points it and pulls the trigger. People cannot be trusted to do the right thing, even if they can agree on what the right thing is...
More than this, I feel that scanning in the way that Apple propose will do two things,
1) Apple will become a police force, an influential arm of law and order. Should they have that power?
2) By scanning devices I think that they have removed the presumption of innocence. We will all be guilty before being proved innocent.
Now apply that device scanning to all your other misdemeanours. You know what they are: Jumped a red light recently? Had deadly thoughts about your neighbour and told a confidante?
It's a proposal that wants us to accept an overt form of surveillance, but of course: 'Nothing to hide, nothing to fear', OK?
I love 'Big Brother'! and I'm up to date with my subscription.
-
Friday 6th August 2021 17:07 GMT cartledger
They already patented disabling the camera at certain times in certain locations. The example in the application was music concerts but we know it will ultimately be protests and discreet filming of establishment wrongdoers. With this technology, they will even be able to find and delete your images and videos after the event or that have been shared with you.
-
Friday 6th August 2021 17:20 GMT Cybersaber
This is an encryption backdoor for anything on the iPhone.
Per https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
in the "On-Device PSI Protocol" section that deals with how the image scanning and matching works, there is a bit of handwaving. They detail how it is _intended_ to work, but it all relies on a secret key held by apple. Any image can be decrypted with this secret key, even though they say it can't if it doesn't match certain image descriptors. All you have to do is change the descriptors.
This is an encryption backdoor prima facie, and just because you design a backdoor for one use, doesn't mean malicious actors won't find a way to confuse/repurpose this anti-CSAM mechanism for their own ends.
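To make the worry concrete, here is a toy model in Python; emphatically NOT Apple's actual PSI construction (which uses blinded hashes so the device never learns the set), just the property at issue: the payload key is derived from a server-chosen descriptor list, so whoever edits that list decides what becomes readable. All names here are invented.

import hashlib

SERVER_SECRET = b"held-by-the-operator"   # invented for this toy

def payload_key(image_hash: bytes) -> bytes:
    # Real protocol: blinding hides the set from the device. Toy version:
    # both sides simply share the derivation, to keep this short.
    return hashlib.sha256(SERVER_SECRET + image_hash).digest()

def toy_xor(data: bytes, key: bytes) -> bytes:
    # Toy XOR "cipher", not real cryptography.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

photo = b"my-private-photo"
h = hashlib.sha256(photo).digest()
voucher = toy_xor(photo, payload_key(h))        # what the device uploads

banned_db = {hashlib.sha256(b"known-bad-image").digest()}
print(h in banned_db)                           # False: server can't read it today

banned_db.add(h)                                # "just change the descriptors"
print(toy_xor(voucher, payload_key(h)) == photo)   # True: now it can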
-
Friday 6th August 2021 17:31 GMT TheProf
Richard Pic
Every so often there are stories in the news outlets regarding women who've been sent 'dick pics'.
This seems to happen because a miscreant on public transport has taken advantage of the simple Apple-provided method of sharing pictures.
As far as I can tell, if the receiving iPhone's Airdrop is set to 'Everyone' then a thumbnail of the obscene image is presented on screen.
Substitute dick pic for child porn on an iPhone with pre-update firmware.
Question: if the receiver rejects the image, is it removed from the iPhone without leaving any trace? Would the iPhone scan the incoming image to determine whether it is 'legal'? How loud is the siren on an iPhone when it identifies an 'illegal' image, and how long would it be before the baying mob set upon the innocent victim of cyberflashing?
-
Friday 6th August 2021 17:43 GMT Anonymous Coward
Whaaaa?
1) the tech is now useless, because anyone with a real reason to fear it has already ditched their iPhone as a result of the coverage;
2) therefore, the only people who are going to be flagged up by this are false positives, each of whom will no doubt go through a nightmare while law enforcement eventually gets round to acknowledging that the tagged pics were of tulips, or something equally innocent.
-
Friday 6th August 2021 18:14 GMT tip pc
Calmed down and checked the detail
Apple's press release
https://www.apple.com/child-safety/
detail of interest
These features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.*
from a 9to5Mac article
https://9to5mac.com/2021/06/07/apple-will-let-users-stay-on-ios-14-and-receive-security-updates-even-after-ios-15-is-released/
Apple will let users stay on iOS 14 and receive security updates, even after iOS 15 is released
For the first time, Apple will allow users to stay on the previous major version when iOS 15 ships in the fall. Users will have the choice to stay on iOS 14 and receive important security updates, or upgrade to iOS 15 to take advantage of all the new features.
Previously, Apple would release older security updates to devices that could not upgrade to the latest version. However, if you owned the latest Apple devices, getting the latest security updates necessitated updating to the latest version for iOS.
Presumably, at some point, Apple will require everyone to migrate to iOS 15. You can expect that to happen when iOS 16 comes out next year.
So it looks like the alternative is not to update to iOS 15, iPadOS 15, watchOS 8, or macOS Monterey.
I wonder whether new phones and systems released after the new OS versions are out can have the previous OSes installed?
-
Friday 6th August 2021 18:43 GMT Anonymous Coward
Re: Calmed down and checked the detail
I'm going to see if that's possible, as I already run iOS15 beta and MacOS Monterey beta.
This is 100% unpalatable, also because I feel that this represents a MAJOR and frankly unacceptable setback in overall platform security.
Apple screwed the pooch with this one, properly. Years of trust - gone in an instant.
-
-
-
Friday 6th August 2021 18:39 GMT Anonymous Coward
Re: Apple will hold the unencrypted database of photos
I think you may have beaten me by seconds, but I think that point deserves some emphasis anyway.
From what I've heard of other moderation efforts, it's also not exactly a fun job, as you're exposed to the depravity of others. I personally would not be able to do that job; I know it would haunt me when I headed home.
-
Friday 6th August 2021 18:43 GMT Lil Endian
Re: Apple will hold the unencrypted database of photos
I've not seen that Ken, but it's a good Q.
I was thinking that the only people able to do the job of viewing these images (without PTSD, as someone mentioned) will either be paedophiles (as you say) or sociopaths who don't connect on the "human" level.
-
-
Friday 6th August 2021 18:33 GMT Anonymous Coward
So, who watches the watchers?
Doing this almost seems motivated by a desire to collect such imagery. It makes no sense for Apple to destroy a reputation for security and privacy built up over years, so it must have some other driving motive. WTF prompted this?
The other fun problem I see is the potential of unauthorised access to ADD things. Say you're an Olympian who just escaped to another country and some dictator wants to destroy your reputation because you made him look even more of an idiot than he was in the eyes of the rest of the world.
Now I have a tool to push some dodgy images into your phone, which I can then "leak" to the press and local law enforcement.
Yeah, well done Apple. Would have been nice if you talked to some sane people first, you know, out in the real world.
-
Tuesday 10th August 2021 11:52 GMT Anonymous Coward
"it must have some other driving motive. WTF prompted this?"
The reason is probably that they're deploying a technology that allows them to control whatever you have on your devices (i.e. music not from iTunes...), and the only way to make it acceptable is to say it's just to hinder one of the most horrible crimes.
Apple knows that sales of iDevices will slowly shrink because it will be harder to add new technologies; it needs other revenue streams in the future, and is preparing.
-
-
Saturday 7th August 2021 11:39 GMT Anonymous Coward
Re: A legal minefield
"I do wonder what do Apple get out of it?"
A smug feeling of self-importance? Wait, no, they already have that.
Gratefulness from their customers? Hmmm, probably not.
Respect from the industry? Oof!
Lead-ins to other government contracts? LOL, "Apple" and "government contract" in the same sentence makes as much sense as "unicorn" and "starship" in the same sentence.
-
-
Friday 6th August 2021 19:41 GMT Phones Sheridan
This online newspaper was BANNED BY APPLE!!! and you won't believe why!
"Apple infamously refuses to talk to The Register"
As the subject says, I am surprised that El Reg have not tried to make this viral. Every other day I read about something else BANNED BY SOMEONE™ and it fills my social media relentlessly. Try harder!
-
Friday 6th August 2021 20:41 GMT Anonymous Coward
And the Apple guy responsible was successfully sued by a former employer.
Seems like a very trustworthy guy...
https://financialpost.com/executive/management-hr/blackberry-ltd-ontario-sebastien-marineau-mes
Then you add in all the other stories over the years about Apple's "ethical standards" that make Microsoft look like Mother Teresa; what could possibly go wrong?
Now this story brings up an interesting legal issue. The software must have been trained; no way was it 100% unsupervised. Unless I am mistaken, even inadvertent, involuntary viewing of child porn images is a criminal offense in California, unless it is part of a criminal investigation or done by LEOs. And possession of the software training images in any form was also a criminal offense.
Sounds like someone did not run the project by the lawyers first.
-
Friday 6th August 2021 21:51 GMT Anonymous Coward
Have They Perfected AI Now?
I hope they have.
Can you imagine an innocent photo being tagged as child pr0n, and what lists you'd end up on as a result? I mean, people never use their phones to take photos of their kids, do they?
And the effect that could have on your life. And how difficult it might be to get off the lists. And how little it would matter if you did if other people had already found out?
I'm thinking of that case a couple of years ago where a black couple took a selfie and Google identified them as 'gorillas'. Among other examples of how good AI has been up to this point.
-
-
Saturday 7th August 2021 11:16 GMT Jonjonz
This does not add up.
How often do multinational corporations suddenly decide they exist to become vigilantes against one specific type of crime, and invest significant resources in the process?
Nada.
How often do multinational corporations get in bed with the state to cooperate in the surveillance and data mining of individuals? Hmm, sounds like more familiar territory.
Don't pay any attention to this massive AI we slip-streamed onto your device as it eats CPU cycles. It's for the children! Trust us to look after you while we sell every bit of data on you to the highest bidder (we don't call them that, we call them business associates, to skirt the law).
-
Saturday 7th August 2021 15:28 GMT JavaJester
Why Stop with iPhones?
Now that Apple has shown the world that using technology to surveil and control users is appropriate, why should governments stop with iPhones? There is a whole world of electronic devices waiting to be put into surveillance service. The company that ran the 1984 Super Bowl advertisement has all but invited 1984-style surveillance on our portable telescreens.
-
Saturday 7th August 2021 15:36 GMT Tron
The return of the Amiga at last, with an OS that is not spying on you.
Apple have just undermined trust in computing generally and in their own products, specifically.
This does beg the question of what we do when we can no longer trust the OS provider not to auto-scan our files.
To OS provider, we can add software provider, cloud provider, Webmail company, VPN and other online software service provider. Maybe even firewalls have ears.
The next popular application may be a sandboxed Works/browser package, but I guess that could be bugged by the OS vendor when it uses the screen or printer.
There is the option of an offline system. Once a system is set up, you should be able to use it offline, encrypting any data that you then feed into an online system to e-mail. W7 works OK offline. Not sure about the latest versions of Apple and MS.
As the three main OS providers are American, governments outside Washington have a problem, as the Americans can simply order Apple to do their dirty work in the name of national security. If you have pre-patent designs for something new on your system, will they be auto-scanned? A non-American next generation anything would be a threat to US national security.
China are going to be knocking on Apple's door real soon with a lengthy wishlist, should they want to continue operating in the Middle Kingdom (whilst mandating Huawei for members of the party).
This is an absolute train wreck that we did not need on top of Covid and climate change. But perhaps it will stimulate a new round of development as companies offer options that offer protection from spyware built into the OS, and alternatives. Raspberry Pi? Distributed systems? Fax?
Of course, if the USG was already doing this, they won't be pleased that Apple has made the whole planet aware of it being an issue.
-
Saturday 7th August 2021 16:07 GMT Charles 9
Re: The return of the Amiga at last, with an OS that is not spying on you.
"China are going to be knocking on Apple's door real soon with a lengthy wishlist, should they want to continue operating in the Middle Kingdom (whilst mandating Huawei for members of the party)."
Maybe that's the reason for all this. China may already be knocking with a list of demands. And unlike last time, Apple's potential counter of packing up and leaving may be accepted, because China now has a strong homegrown phone market and may well be willing to go without iPhones. Who has more to lose: China, giving up access to an American icon it can just pillory anyway, or Apple, giving up 1.5 billion potential customers?
-
-
Saturday 7th August 2021 22:40 GMT gdbc
They're after your memes. The only thing this has to do with pedophilia is the marketing guys getting you and "mum" to accept it. People aren't that stupid, Apple. It's lazy; the first thing anyone wanting more power does is declare "It's for the children!". No doubt Google will follow suit. Soon a "hate" image will be you taking the p*ss out of the "wrong" electoral candidate or suggesting an election was rigged.
Quick question: Was this announced before or after the US senate passed the "Infrastructure" bill?
-
Monday 9th August 2021 08:39 GMT Irony Deficient
Was this announced before or after the US senate passed the “Infrastructure” bill?
Since the US Senate has not yet voted on the infrastructure bill (apart from a vote to invoke cloture to prevent the bill from being filibustered), the possibilities are either that this was announced before the Senate approved the bill, or that this was announced before the Senate rejected the bill.
What do “our memes” have to do with the Senate vote on the infrastructure bill? Is there some sort of anti-meme legislative proposal buried within it?
-
-
Sunday 8th August 2021 07:10 GMT Anonymous Coward
AI is not able to do this reliably any more than it can identify individuals accurately.
The problem is society and industry has this perverse idea that self-adjusting pattern matchers are "intelligent." They aren't. They're just very, very fast at doing the matches and providing CANDIDATES that need to be REVIEWED by HUMANS.
And we all know how big the tech industry is on hiring competent people to curate posts and content elsewhere on the 'net.
I expect there to be PLENTY of false accusations, investigations, and MASSIVE lawsuits against Apple over the resulting defamation claims.
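For contrast, the shape of a defensible pipeline is easy to sketch in Python (the names and cutoff are invented): the matcher only nominates candidates, and nothing consequential happens without a human gate.

from dataclasses import dataclass, field

@dataclass
class ReviewQueue:
    pending: list[tuple[str, float]] = field(default_factory=list)

    def nominate(self, item_id: str, score: float, cutoff: float = 0.98) -> None:
        if score >= cutoff:            # the matcher proposes; it never decides
            self.pending.append((item_id, score))

    def drain(self) -> None:
        for item_id, score in self.pending:
            print(f"needs a trained human reviewer: {item_id} (score {score:.2f})")

q = ReviewQueue()
q.nominate("IMG_0042", score=0.991)    # a candidate only; no report is filed
q.nominate("IMG_0043", score=0.40)     # discarded: not even a candidate
q.drain()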
-
Sunday 8th August 2021 07:31 GMT R.O.
Today and tomorrow and later
When they say it's about the pedophiles, you know right away they're lying and it's really about expanding police-state mass surveillance. I guess we should have known Apple's apparent commitment to privacy was just another PR and marketing scam.
Today it's pedophiles; tomorrow it's parking and traffic law enforcement; then it's how much you love Big Brother.
It's a continuum.
-
Sunday 8th August 2021 14:08 GMT Anonymous Coward
It’s difficult to object
If this achieves its objective, then what's not to like? But I fear the doom-mongers might be on to something, and this could well be the very thin end of the wedge. Other solutions to this growing problem are not forthcoming; further, I suspect the scale of this sickness is beyond our wildest imaginings. What a quandary.
-
-
Monday 9th August 2021 20:45 GMT Nifty
Little changed from the first announcement. The wording of the second paragraph, which refers to all iOS devices, not just family-account-managed ones, says it's an in-phone scan followed by an upload of suspicious images to iCloud. It does not say that opting the device out of iCloud will defeat this feature.
The flow diagram Apple initially published did indicate it applies regardless of whether iCloud is enabled.
-