
So this is considered "safe"
The major issue with all this shit is that whatever is added to allow your government access can also allow another government access, or some other third-party entity such as one of the various criminal groups.
The Internet Architecture Board (IAB) has warned that policy proposals requiring or enabling the automated scouring of people's devices for illegal material – as floated by the European Union, the United Kingdom, and the United States – threaten the open internet. Apple brought widespread attention to this so-called client- …
I'm sorry, but for me that's not the major issue - certainly huge, but not the greatest.
The greatest: The Slippery Slope. Exactly what do they declare as "CSAM" material, and how is this judged? Does an algorithm that declares your personal, adult grumble pic to be CSAM also decide, after cloud-blacklisting you, to call the rozzers to your front door? And how exactly does one contest this, if that's even possible, before your personal, business and reputational life is ruined?
And then what is to stop the modern-typical neo-fascist from expanding the powers of the algorithm? "Think of the children!"... and then declare *all* LGBTQ content to be CSAM? The religious zealots are trying for that right now, IRL, with some places even banning just a simple "Pride" flag. Imagine: a Pride flag in the background of a child's photograph is reason enough to have you arrested.
No. Just Fucking NO. Absolutely NOT. It's not just the security issue; it's the same question we've been raising since the entire fundamentalist / conservative movement restarted back in the 1980s: exactly WHOSE "family values" are we fighting for? "Think of the children!" and "Family values!", while they throw their gay children out on the street and ostracize their own sisters and brothers for simply not being inside the same church. Open the door, even a crack, to this possibility and for SURE someone is going to push that door open in the future, once they are in a position to do so.
Worse, there's no objective way to know exactly what is being scanned for. They might promise you that it's only things you shouldn't personally worry about but, unless you can generate a match, it's not possible to independently verify the contents of the hash list. It could be a photo, picking a completely hypothetical and random example, of a politician enjoying a party when they're not supposed to be. Then, by looking at the creation date, the photo could be tracked back to whoever leaked it to the press.
> Worse, there's no objective way to know exactly what is being scanned for. They might promise you that it's only things you shouldn't personally worry about but, unless you can generate a match, it's not possible to independently verify the contents of the hash list.
There kind of is; as you say, it's likely to be a hash. So for CP, the idea has been to hash known images, which can then be searched for on everyone's devices. If hashes match, you're busted, because possession of CP is a strict liability offence.
Of course what it won't really do is help prevent child abuse, because abusers can upload new images that won't have a hash value until they've been found somewhere and can be added to the database. That may be quick, so a seizure based on known images uncovers previously unknown ones as well, but it'll still take a fair bit of investigation to determine who the initial uploader or creator was, and thus locate the kids.
What it might help to do is scan people's devices for copies of either the Conservative or Labour election manifestos, and flag those as dangerous subversives.
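To make the "hash the known images and look for matches" idea concrete, here's a rough Python sketch of what a client-side matcher amounts to in principle. Purely illustrative: real deployments use a perceptual hash such as PhotoDNA rather than plain SHA-256, and the hash list entry and path below are invented.
====
# Toy sketch of the "match against a list of known hashes" idea.
# Real deployments use a perceptual hash (e.g. PhotoDNA), not SHA-256,
# and a database nobody outside the scheme gets to inspect.
import hashlib
from pathlib import Path

KNOWN_BAD_HASHES = {
    "9f2d1c0b...placeholder...",   # hypothetical entry; you can't verify what's really listed
}

def scan(directory):
    root = Path(directory)
    if not root.is_dir():          # the made-up path below may not exist on your machine
        return
    for path in root.rglob("*"):
        if not path.is_file():
            continue
        try:
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
        except OSError:
            continue               # unreadable file, skip it
        if digest in KNOWN_BAD_HASHES:
            print(f"match: {path}")    # in the real thing: quietly report you to someone else

scan("/home/user/Pictures")        # made-up path
====
An exact hash like this breaks on the slightest re-encode, which is exactly why the real schemes use fuzzy perceptual hashes instead, and that's where the problems discussed below come in.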
Hashes are, of course, lossy; by the Pigeonhole Principle, there must be colliding inputs if the domain is larger than the range. And the hash system primarily (universally?) used today for CSAM identification, Farid's PhotoDNA, is deliberately highly lossy, since it tries to be robust against some types of image manipulation.
As the database of forbidden values grows, and as the number of scanned images grows, we'll start to see real false positives. And that will be a Very Bad Thing, because law enforcement and NGOs like the NCMEC are trigger-happy.
Already we've seen cases where innocuous images that have been added to the database (e.g. because they're innocuous stills taken from toxic videos) have been identified as CSAM, with the people who posted them suffering adverse consequences. That makes it possible for trolls to socially engineer someone into reposting such a shibboleth and then "outing" them to the authorities: image-swatting, in effect. And it's possible because the people who compile those databases are not careful.
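To put some flesh on the "lossy" point, here's a toy average-hash in Python (the textbook trick, standing in for proprietary PhotoDNA); the threshold figure is invented.
====
def average_hash(pixels):          # pixels: an 8x8 grid of 0-255 brightness values
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p >= mean)

def hamming(a, b):                 # how many of the 64 bits differ
    return bin(a ^ b).count("1")

# Any image at all boils down to 64 bits, so by the pigeonhole principle completely
# different pictures can share a hash, while a lightly edited copy usually keeps its hash.
# Matching is "distance below some threshold", not equality, which is exactly where
# false positives creep in as the database and the scanned corpus both grow.
MATCH_THRESHOLD = 10               # invented figure; real systems tune this

original       = [[121 + (x + y) % 5 for x in range(8)] for y in range(8)]
lightly_edited = [[120 + (x + y) % 5 for x in range(8)] for y in range(8)]
print(hamming(average_hash(original), average_hash(lightly_edited)) <= MATCH_THRESHOLD)  # True
====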
Client-side scanning is a terrible, terrible idea, promoted by surveillance goons and single-issue axe-grinders. It doesn't help that we have celebrity nitwits like Ashton Kutcher jetting around and talking to ignorant politicians, either.
Particularly as the religion or religious beliefs of leaders, politicians or parties now appear to be becoming an issue in election campaigns and policy-making.
There are already areas where this is starting to surface: laws are being made that have a life-changing impact on people, because well-funded and highly vocal figures are in positions where they can influence leaders, or, worse, are those actual leaders.
They likely already have various tricks; it wouldn't surprise me if the surveillance agencies could listen in on your house even when the phone is hung up.
I wouldn't put it past them to have at least tried....
Ditto cellphones: a variant on Pegasus?
Then smart TVs, which don't even need manufacturer buy-in for govt misuse (Snowden files).
It seems wherever Russia leads the politicians sprint to follow.....
Fuck me we are ALL doomed
One world govt, except it will be less United Federation of Planets and more a 1984, V for Vendetta or Starship Troopers type of dystopia....
> They already likely have various tricks, wouldn't surprise me if the surveillance agencies could listen in on your house even when the phone is hung up.
Alexa, Are you spying on me?
No, Dave. I'm in the middle of downloading a firmware update from someone in possession of a valid signing key.
"listen in on your house even when the phone is hung up"
Check. Did you think your window shades are giving you privacy?
"wherever Russia leads"
What I came to say. Is Putin on-board? Of course he is. Could that possibly bode well?
"One world govt"
Don't see how that is going to happen. The EU isn't united. The US isn't united. I only recall one dictator willing to abrogate their power.
> Don't see how that is going to happen. The EU isn't united. The US isn't united. I only recall one dictator willing to abrogate their power.
Nonsense. The operation to get people jabbed and make billions showed they can unite easily if the gravy train is luxurious enough.
Not even one arrest over deleted messages.
"wouldn't surprise me if the surveillance agencies could listen in on your house even when the phone is hung up"
They can; this tech has existed for a while. You point a laser (it doesn't have to be in the human-visible spectrum) at a pane of glass (or some other surface capable of resonance) and record the vibrations... similarly, it can work off reflective surfaces, albeit far less reliably.
This is one of (many) reasons why secure installations have strict guidelines on the placement of reflective surfaces and / or the presence of windows or panes of glass in given areas.
Old as the hills and cheap as chips...I've tested such devices before...they work pretty badly in public places though. Just sayin.
> The feds ... sold more than 12,000 ANOM encrypted devices to at least 300 criminal syndicates operating in more than 100 countries. The FBI subsequently used its backdoored network to conduct mass arrests in 2021.
If there was ever evidence that pervasive scanning is unnecessary, this is it.
You don't need to secretly spy on people when you can simply ask them to spy on themselves.
Of course politicians want to be able to cripple their opposition. And they want to be the government in power when the technology becomes available otherwise it will get used against them.
Law enforcement want to sit and watch cases come to them rather than get out and prevent crime.
> Of course politicians want to be able to cripple their opposition. And they want to be the government in power when the technology becomes available otherwise it will get used against them.
Nope. Our useless shower of shite, i.e. MPs, granted themselves exemptions from most intrusive surveillance. This is because they're special, and not at all because they're the ones most likely to cause the country damage by being corrupt.
So I think we should demand a 'You First Act'. Sure, they can do this, providing that, for a trial period lasting no longer than the length of a parliament or 5 years, all their personal devices are scanned and the results published online for the electorate to browse at our leisure.
It's both troubling and depressing that the authoritarian fascists who've infested the EU, UK and US even think that this is close to a good idea. As is often said, 1984 was not an instruction manual, it was a warning.
PS: I've always wondered why MS's 'search' process used so much time and resources. Maybe that's just the unannounced beta version of this scheme.
How terrible are the autocratic states that surveil and censor their citizens to clamp down on material the State disapproves of!
And, by the way, it is ok for us to mandate surveillance and censorship of our own citizens because we're a nice state not a nasty one and we'll only look for material the state disapproves of ...
1984 here we go ...
The envisaged device-side scanning wholly depends, I assume, on software installed on devices rather than hardware modification during a device's manufacture. If so, this could be facilitated by commonly used operating systems like those provided by Apple, Google, and Microsoft.
I have yet to see discussion of software-based potential workarounds.
1. Get a dumb phone
2. Stop using the Internet
Admittedly not software based. My dad was ill a couple of months ago.
Took mum shopping. She didn't understand contactless payments. Dad used to drive her to a bank and she would write a cheque for cash.
In some ways the quality of life will be improved
If it is based on software, there will always be a workaround !!!
It is just a matter of time .... as usual.
Whether it is deemed legal or not will not be an issue, unless everyone is checked every hour of every day for changes.
Already I can have a 'rooted' phone, which can hide itself from checks in certain software.
The same will happen with device-side scanning.
It may initially work but someone will 'break it' and work around it !!!
A hardware-based approach will be harder, but it will require *all* the manufacturers to be forced to implement the 'chosen hardware' in their phones.
Also, what do you do about all the 'old' phones that are in use !!!
In a nutshell, it is the old pipe dream reanimated for a new generation .... been there .... bought the Tee-shirt .... now using it to polish my shoes !!!
:)
I want the proponents of these ideas to answer two questions:
1. Where's your peer-reviewed proof of concept that shows it can be done effectively and safely?
2. Can you produce a convincing argument that this is consistent with the presumption of innocence, a legal principle that has kept us safe for centuries?
> I want the proponents of these ideas to answer two questions:
> 1. Where's your peer-reviewed proof of concept that shows it can be done effectively and safely?
We are the government, we know this is what is needed and will work. We do not need experts.
> 2. Can you produce a convincing argument that this is consistent with the presumption of innocence, a legal principle that has kept us safe for centuries?
We are the government, we would never do anything to harm our legal principles.
Also: We are the government, we decide which answers are satisfactory, not you oiks and troublemakers.
Or: 1) we have searched far and wide to find "experts" who say what we want to hear, despite them being objects of ridicule for the rest of the world as they talk utter shite
2) habeas corpus is too "woke" for their tastes; cue some false-flag operation to justify near-permanent martial law, and when they decide to lift it (if they do), anyone who even hints at it is a terrorist using coded speech and will be "dealt with"
@Doctor_Syntax
Quote: "....can be done effectively and safely...."
.....or even done at all!! For example:
====
xI0q8ppamX1dJARzVZpEksaXzzfgI8xZ5Yj5TYb4INyzLP3EmbYuNK4DmRG/3LqkRBXEFUyga9H8
PSzepavNeiqH2R+xLDIj90hvDwEdjQhaWkcJCH+lASby5/MIav8xNf1HEHQbh/aOnrZjom4dVTgs
pHaqRKNLXAiXcc6t+7lOQk59xqND8yUXDL9d6XHmbpdnjgwMkTEp8yxgQYoCFxY9HYsXkBFetVjK
zGr/Aa30sNXNwZtoh4bnKhgKmdzobwGXK8UPYz8Ja3uXfZfzZmTFnJ5M9HYBIPTNKExvpdFczLf5
UweU6YK5UqlWA7seDT3btzuUOq9P9n4TP99nWVAn85b0D5tVqbaNjeN8xVT7lzEQox3YwT6/wGWT
pZEmfO8nNtTJ+2EPzC+313iKQLR1+DJkcEi4WcofSgWH4UGIgurhV34ed671GK4ngTwPcZOch8Ig
TRFii+Z9aYBp3FX8nPHOQR9ESBQvcQKPvNUO5s2PU3PYo4r5M3GhKWCRHgBEg+Cgajk7gRRpEyMH
6Djx43j2ikt1ncnV++uMdeC/ZoAOmQwMCAHYSX3F1Dvy2cYBCvmWO41GFlUGr/El3M6kLEHHs8N9
jV8GZHev4p133aQFF08uwJ63jDOisvCl9zPvXdTDflQR8lV3oFLXBsGEXsdyWFjux5b2QN7amtFb
s0Kb3llKyvpbBvg/coOjvuZJSFBV1Z4MDgFtAPYsZxY18ZMjKBN9yXf1r6pBDhDeLwy1S5rSsA/z
yZF+6B74VOCvFgj64xrNLfUy/s4QoMFb5vvoxA9OPD7ELXYxtl07brcjo8RB9DbptmiYOH151CU9
y/oqIP/nNCEdpyyCW0GqHmC1CmgG/GaUCli32pDSb6itifgZYbI2LK23d2MmToE6/jEeLxXpxLwy
RliA6770IRcLF5oYeiXn6w/2LV0+/zx/6WE1e1uX6VlWA4kHl3I+bl6sv/0HDHBQYOy99HFQ2J9/
kBNmG+B9XLO/YOvtRfLCYdEnE8ew1Myoo8pQsfsFgGfBtTNn1yGScKQK3/x/Bm71SvnxsF3zcLkG
xpyMpzY0LnlAv0wwPe8uXbDu/eUPVEhVcmvUAlEyyfFuQEPMBDxhHIG94+pQm5fMpzHUIkjyxGGN
cdIYH68jsGUawZEBSVkqnrHO6g/Mof8TW/CJlRSaJIBCnTwi4bwt3a68kopasR2d02YS7t6zxDnm
bJn8704Cdw1phsww96PVIuyfeypxuQxAQsRtfd5TSRRI26nyG99Xw8YiazRnyZCnkOmy7ofJOm4B
jzqFy8jHPBbkUJJYdv4a+hAO3hP9yTzd4k+4pq03B8u99qJq5ssUeAyn73EUZLH22YhRDH3lzIvw
ZkOy0pJauDT/lywJReCeLtgf1wjRQ6lDg/Ju2r+oXzSzDRE+Xs1LcwOaPyVa3ihV/vdzPlyW3TPR
CDjji3IlniYK8skbwYNEuPJLkA0+VdGu3WuYfBiPXA1p3f2Egh/oFCVcbKoeMYgr16YT6icaIC2w
PO3KCX9Ua9QifuN20xIRSa7e0ZtxS7VjdbWDPzod82qjL+dwNq76TqD6pLPV95HoMrQKS9+3rEwh
hOr0d3sfSShlK9GxIl/ejIwY2Pue8Q7H9oVIWt9Zm92l9OMxM1k8mN8NRFnzJog+/E4+7ZyEhdbR
GeHZ1kj3RI7y/hitt/PapeBclayqwnmV2+it6M42+RsISSVFbMlH4nFC9XpPGjDuoYTne7YJXscp
fVUxLS9YQZb5BFFwflT29vn3dzGxqyiDssyArBKz6LzbUQJ9BVVoFsm8Sfq7bhrjMa9Y787ybHU1
wOpe0UTiQQB9RfGbIZFuSmugMm5Awt+bBhdMtbdTcaWUcyzH89qUju6c6fzHMkErAs2XrB69VnO/
NCcKtBBx9ehxVqO3FGIzwOXdcSTtQIvuGGZmFAktW8Sr9VDw3bSNeI+bQNqVh0miKeDRqdoYWWf9
rDWy6tUU+JK699F3+GGngZ08DUVihynVKKbzqP0VKXs9XhcMGUH6fxGXJWZ3GEZGsxscci/3apO8
oeiNBtozn0Et+oBUFJjKh4TwqebatMHqsLzoznxztOgnPCXklnLZy6Y9/TkzTQmgJ8cwzZfvlD73
vCL+BMLQYEZHqIL5zX1UZoP4LFbkZ+9xHDDJSNjlf1xBahe2fKIJSWYquSO/oLdYgi/mQ8jQyob7
cEOJo3xxy4VlzCX/XBOWzesXH4hE5rby/79yfLkDnJ/ZP7NN/DAA5KAjiLi9y/A5u+x/0uHl1IMZ
Aed9LewUiR3pZ1HtUSv8HDeGI05JN6+KD8V4gQ4Tlfyp5Vpe2PyokeDN34Fjn1gwLUTTtFhbLnQL
JcHEGo3XgBD4QGlDJjMMDkDOzYFzDNQk5p//YdfgbchGngTn4NYpIsypg77PdJ0hWVsZ+ZZk4CrJ
qD2YwEpTNFSODR4qmbcK6wMM3zndO2f5HA8WRPXMN2dJe9AzBC3BdsfbKFt8HJAcw7ZjDVF+QOSl
aITvx7rz4X6V+MK1W2ZY+0cfa6/Xi5fglEJVhdB7C+Gx82SG6zU5gDkm0DT/YtKoeXH7h7wr1aAH
IUHwhbH9hKBOvO2iW2QJp+ABgHAY4bOPQQVFgqJdPYspadtJ2UY6UxLxOUaNiseA8hHKPqc0MTre
ez5Osj4h1y/SDhGSvPSjIb8vTj91uO3lb7mxEGZTxb6MyrORjW4eN3RRtkYNHmVBRVGVWKTYGVUo
QYmQkB5fLzqD7YlLultHhIpbFU2JgvJchYt6KjkqvvjHtUU1FAovegqn8trWPBEv3TFBw+TbpHO9
u1lYMp7SSz4b3GcFJZ++xmD+EOheylWtGzEiu0ddA5z/qsChb2q3xUzekwwOj8nP8y+9SL3OgH2A
Wflnth5m9g/S5PHWtXhBER0pNMwU/WjSKA28uGRSq0CBlCViNx62//54QyORgBk/wGBDb5piwvfE
U5jLzWbLZJjmskGYtPyEcw6IRz2qZVa2MJ2w2tzLHkjxBSaE55esCBlgr45Z9G+zewoxJsurG/By
FWNzBpqwlbK+YFKrWu+W29MWUN0bg03L+CkPo8/6evvBA59GFn2bc2tY8yuBhmvl+k17uwrgCKpm
hHPzsXf0mT8GyDeSeARH+0MA9XrFuHrCE/TqEYhVioNCXUFM1x05BvvQgZ0KugBv3QorgfANV9bk
jpuqPq3EtLgrptOfl5Urqjh1Wbr/w7y9oxxP7Ecbvcu7trj/cYKKux38tewEnCHiduhyH8oVVZB5
0YUSbG31FSYs72unYL7uoCl2xoYRf/T5Uw1+IpmP3ZwMviLxth1xhMjKc0FUC6wJ1c/IuaxNGjcx
jExIWW3wl972qa0vcYYz84jBlr4w2zdz7xMeFbGt9D6QV894dfKdd9wYoLkr3yzk6w+JGwlVFpWb
IEd7e2dwGiw8nblrmS8gNz12mf7kfGCJENZo1VH2aqj/5rHZi7bBZ8/k1g0vYOHtTrI/RVWNb3RD
OIyTRYsbBJuns/xSa2ZZlyiVUJdJQXFZbseIckyuI/5ozzp2MkD1P6nuLLWqP5dClSnRl3wAiBOP
Dp/CEjtc5Aa2Wlal4EwpHnVaa6Dpr8rBnmZj2SLkj9nb7ZYDWb+X22usPX8gRn4juAgwIyWlz3Kc
2JnJHrrSEPJI0sSg/2fy34+Kfjm35WjpEhLU7SMPF7BPuO4X10j4Yj04MgVKpiKnSG0wz+mYA5Pv
E9hSExSGlT6SDb47V3UPwSyxZB65C+edbe/EFxjM/UqRjwi3TkYWVVhk9RkoHS9L7lavweX5dvN4
+/sCHnx1Od/5wKbPZBB/0nfWgsXg+91c7/0uqwUzVUkrUnhTDERsqZ/ELYEKwk5HAMihEJrOR3Ii
YFRpHGOaGd/iDHCO4xKBBfNJw4cegPiRbdcmqcQSC4Bxgw7A5k4d2gvKnLfkzNLGVOP8VRuMre0R
1xt99Iac3d+Otb3ZgUp6V1u5crr5pnltuUtLW6INHAaSel9o4oqLWztJLJdQ28vVjYHjj0COoAIh
dE4rVjnf/7GmF2EQ3VjKaI2hRdOeAtZoUgvtnWm/BY01/R1FFS2lZvkgp2wotYYYF/L5ybSk8qE4
lgrWFndZByrzqFfkQZ0/NTYBDUiQSqpf2CLdnp9n4l63rIO8Vc/9paix6JFpe9kehhgeXc15muv0
8/j5joWN/NuxVzjvKjO1oS9MUpFT09aRqYMbeGl0eBB7eZ+ecZOWk8dNPtKo320Cujk2B4yiqaxd
4fmUcYXqiu/LxGWpP3D3YeptYx7TcE39NzJPcs9jpJTXqZzbcr/N/mnRofb/FjQyCIogewSyQrCL
psvSm2cVAHyn22fxLV5Sf8fZqGZ6GMWTAmtY2A5vIb1XIJ/8vUglR8UMqmt6ulpigtXGwhk68fbc
7VRjjDI5fhAqMLVg4MN/npz9HukmOThyPIhrgXdJ6yW1gyElbdCQGLh/d1xTsDZXo2GU6YVa/HPD
Tf9hBARP+uSgqgewSyPbTWFaDD46CJ2cF11dLLcDDPPJsj4uObXB0hS2bf9zhdEBplVojaMtqkUl
WJxh0rBQU4PTxK1bXbnxO5Buwt7rFyIyT4Jf2xq7L59hmEDUQMadGIseNEQ9MhVCApHe5102HbY/
9Mc6m/GjhRBSMwQhgdZumyQqBxMwEOnx4srvVMNFH3ejZE9c1RctootHQ821EilvQWe8Ln0ONljS
Lc1zOznAx8K87RkHGHP3IhfgGs3hH6EGUx9ikBWaK0ryu0lcCxB7iOHDVWOWe+PgowmV3zrLWSPt
iOh02yPfrGHxDKPz2+iP1ugAG4mVoyCZ7YkuNc4WOjD8c/jZw0stYwE023JqVEzRUqcoAVVp9Usc
wAM0c2m0SGTRzaPT+vuru+OK+ZV+ddeK5zJRS9ivG2/kFrvVfgQcavTDHu8RqczJlpJTVls6XAmh
xfQFL0r9eI7op+2HP46srAP2MbGRS614i/DEB88mu9YgnI/i5TlpztxOs6Wl0X5vlmAfiMEJb3HR
dTBuJRFFN68S1+KUmNMZYd/ddFv+j7WSQhGShXaQpTMkE6G7v/WS35CXYp5J2BeayUZnDPW1b4lv
AAgPUgGT3YYLU4QoPJeJf2yfc0tSsqnC8VtE8RtJj1uGE/iefTw/svhXAaUwmYAUrRtexH7lvM2y
YCiEw3aNyQ/wb4UC5UQp7d8ZoXOHmTvY129D6ao6VBYk9Fvi5vLpWEa4HEq/cdVHieg3qshzh35S
EtTKOWk3Y0V8q0Jdkt5KP5ltmNOXsjpcX2NOEa00/6HiDPRM6zXJps51ZrGlI9ZrU9br3xQ0tXwr
agm9vxi+yfcUKtQD2KxRSSCoC14A9go8v5ZyDThh8TElUne5IRr5gQodiM5OLvypEnE8FDpKX9sN
M08irisWRXk2IWvefGex7Yo/qhkU0Zx9Vfywg1iNhDhJjLe2VgpZ0Jety0bp15Y1Q1UXdkVJifLY
GadnOYPeE7LMVtL4sB1oF/waYILwZoH4+oTPB6rDk8JkDOAAdSX9UvhiOTX0dcVaSXCJw7UAQUQe
i4telOILt6iVfOvI/QD04dhs4vjeKIuL24d8AhBJrf1mJUpYOZG/g7fAqPKdQZdsm66jB9q0k5Cd
yKTKAGbOwzyhb+DWjPQHvDNuBx9VBrLR7J5girgKBdB1Nww6EV/FxLBfOJM3eVo2sJ7Jwg/aiJw6
YrHsS0j5WLcnwbg+AuCln7xEUqhQwKac2eEt6XZ05ILkKeaaankRgX91yr8fpFtZ78/6vBeXtJKz
NL8NX82dD1GyvenPkfCWwcY/4n2XspCxv3uOnyYeT42VEl5OBC+loztMBO00W3JFx8C0blhq8Ciq
AHkyQ/NqV6gg6JgMXGuvKvpue5Ld
====
There are a number of human rights concerns with client-side scanning.
Firstly, once the software and hardware infrastructure is in place, there is little to stop it being used for other purposes.
Even with the stated aim of detecting child pornography, given how "advanced" auto-scanning is, I can see many people being incorrectly flagged. And with things like child pornography, even a totally false allegation can trash someone's reputation.
Finally, I have sincere doubts it's effective anyway. The paedophiles will just use obfuscation products to hide the photos and videos.
"At issue, primarily, is the loss of privacy from the identification process – how will that work with strong encryption, and do the files need to be shared with an outside service? Then there's the reporting process – how accurate is it, is there any human intervention, and what happens if your gadget wrongly fingers you to the cops?"
All valid, but my main issue is the fact that someone deems it necessary to rifle through my belongings without cause, evidence, suspicion, or a warrant, "just in case I may have something dodgy"
How is this different from regular warrant-less searches at our houses, through our physical property?
It's about the police wanting the ability to effortlessly conduct warrantless searches of all electronics without a reason at all except for "just 'cause":
"Just 'cause we feel like it". The police lobby is more powerful than the people lobby so you already know how this story will end. Likely the deed will be done in midnight sessions behind closed doors.
Ain't democracy grand?
It's never been about children. But because most voters (who come in handy every 4 years or so) do CARE about children, the 'think of the children' line is great, because they approve. And even if they do not approve, they HOPE 'the government has good intentions'. And if they don't approve and don't hope, they will NOT oppose it openly, because that would make them side with child molesters, so...
Looking at the comments, I see the mention of hashes and how they can be used to identify content without having the content itself.
Reminds me of AI used for image generation, and how source material can be modified to poison the models so that they generate or recognise things in random, unexpected ways, using images that do not appear to be modified to human eyes.
Got me thinking that it should not be hard to get a list of hashes of the content being scanned for, and then modify innocent images to evaluate to the same hashes, or modify guilty images to have hashes that match common, publicly available images.
Makes me think that any form of recognising content, without having the actual content it's being compared to, is going to be generally spoofable in this way.
Just another way in which sticky beaking in the hopes of preventing whatever naughtiness you might be interested in is a losing proposition.
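As a purely hypothetical sketch of that spoofing idea, with a deliberately simple average-hash standing in for the real perceptual hash and every value invented, forcing an "innocent" image onto a target hash looks roughly like this in Python:
====
import random

def average_hash(pixels):          # toy 64-bit average-hash: 8x8 grid, one bit per cell
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p >= mean)

def spoof(innocent, target_hash, max_rounds=64):
    # Push only the cells whose bits disagree with the target to an extreme brightness,
    # repeating until the toy hash of the tweaked image equals the target entry.
    img = [row[:] for row in innocent]
    for _ in range(max_rounds):
        h = average_hash(img)
        if h == target_hash:
            return img             # toy hash now equals the list entry
        mismatched = h ^ target_hash
        for i in range(64):
            if (mismatched >> i) & 1:
                img[i // 8][i % 8] = 255 if (target_hash >> i) & 1 else 0
    return None

innocent = [[random.randrange(256) for _ in range(8)] for _ in range(8)]
leaked_entry = random.getrandbits(64)      # stand-in for an entry from a leaked hash list
forged = spoof(innocent, leaked_entry)
print(forged is not None and average_hash(forged) == leaked_entry)   # almost always True
====
The reverse trick, modifying a guilty image until its hash lands on some popular, publicly available picture, is the same loop with a different target. Real perceptual hashes are harder to hit than this toy, but the principle holds once the list is known.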
"it should not be hard to get a list of hashes"
This is indeed a big problem, for the spooks, with client-side scanning. PhotoDNA with a standard (-ish) set of hashes, compiled from material supplied by the NCMEC and others, has been used for years by the VLOPs to scan images (including video frames) server-side. This has succeeded to some extent partly because the hash database isn't widely available. But put that database on endpoint devices and it'll be leaked pretty damn quickly, however you try to protect it, because endpoint devices are under user control.
Of course, there will still be many people passing around old images with matching hashes in the database, just as those people today don't bother with tricks like cropping and recolorizing to bypass the hash. (PhotoDNA is somewhat resistant to that sort of manipulation, but only to a point.) Many criminals are stupid and lazy.
But once the hashes leak, it won't be long before people start generating altered images that don't match the database. And with GAI, that's easier than ever.
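For a sense of how little it takes, here's the evasion direction with the same sort of toy average-hash (not PhotoDNA; the threshold and data are invented): flip a few cells and the image drops out of matching range.
====
import random

def average_hash(pixels):          # toy 64-bit average-hash: 8x8 grid, one bit per cell
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p >= mean)

def hamming(a, b):
    return bin(a ^ b).count("1")

MATCH_THRESHOLD = 10               # invented figure

def evade(image, database_hash):
    # Invert one cell's brightness at a time until the hash drifts past the matching
    # threshold; most of the picture is left untouched.
    img = [row[:] for row in image]
    cells = [(y, x) for y in range(8) for x in range(8)]
    random.shuffle(cells)
    for y, x in cells:
        if hamming(average_hash(img), database_hash) > MATCH_THRESHOLD:
            break                  # no longer "recognised"
        img[y][x] = 255 - img[y][x]
    return img

image = [[random.randrange(256) for _ in range(8)] for _ in range(8)]
listed = average_hash(image)       # pretend this image's hash is in the database
print(hamming(average_hash(evade(image, listed)), listed) > MATCH_THRESHOLD)   # almost always True
====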
At some point, the suppliers of CSAM will realize that it's cheaper to just keep generating new content from whole cloth with GAI rather than actually abusing real children, and there will just be a steady stream of new, non-matching images. It's hard to outright call that "better", or even "less awful", than the current situation, though it means less abuse of actual human beings [1], which is certainly a utilitarian advantage. But it does mean that this sort of scanning will gradually become less and less useful for its ostensible purpose.
[1] At least in theory. I don't know that I'd want to bet on it. The economics seem complex and frankly I don't want to think about it any further.