Tech world may face huge fines if it doesn't scrub CSAM from encrypted chats

heyrick Silver badge

Re: If they can do why do they not tell us how?

"available which includes hashes of child porn images"

The problem with a hash is that it is a test of exact mathematical equivalence. Is this picture, byte for byte, the same as that picture?

Well, couldn't that essentially be broken by scaling the image, say, 5% either way? Or compressing it a little more? Or gently messing with the colours? It wouldn't take much ingenuity at all to batch convert a bunch of images from known matches to unknowns.
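To illustrate the point: if the list is built from ordinary cryptographic hashes of the file bytes (a SHA-256 list is an assumption here, just for the sketch), then changing even a single byte, as any re-compression or re-scaling would, produces a completely different hash:

```python
import hashlib

# Stand-in for an image file's raw bytes; any real file works the same way.
original = b"pretend these are the bytes of a JPEG"

# Re-compressing, re-scaling, or tweaking colours all change the bytes.
# A single appended byte is enough to demonstrate the effect.
tweaked = original + b"\x00"

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(tweaked).hexdigest()

print(h1 == h2)  # False: the hashes no longer match at all
```

Perceptual hashes (PhotoDNA and the like) are designed to survive such tweaks, but they trade that robustness for a non-zero false-match rate, which is exactly the problem the rest of this comment is about.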

Plus, with only a hash and no actual image to work with, how does one train a machine to recognise such a thing? It'll be like that judge who said he couldn't define pornography, but he'd know it when he saw it. Well, we would have to teach a machine to know, and given the hysterical responses a lot of people have (not to mention the malignant behaviour of the police these days), we would have to teach it to be accurate with a low rate of false positives, yet protect children by catching everything that is bad. In other words: waffle-waffle-magic-waffle-done. There, that was easy, in government land.
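The false-positive problem is worse than it sounds because of base rates. With entirely hypothetical numbers (the prevalence, sensitivity, and false-positive rate below are illustrative assumptions, not figures from the article), Bayes' rule shows that even a very accurate classifier scanning billions of innocent messages would flag mostly innocent images:

```python
# All three numbers are assumed for illustration only.
prevalence = 1e-6            # fraction of scanned images that are actually illegal
sensitivity = 0.999          # true-positive rate ("catch everything that is bad")
false_positive_rate = 1e-4   # already optimistic at the scale of billions of images

# Bayes' rule: probability a flagged image really is illegal.
p_flag_and_bad = sensitivity * prevalence
p_flag_and_ok = false_positive_rate * (1 - prevalence)
precision = p_flag_and_bad / (p_flag_and_bad + p_flag_and_ok)

print(f"Share of flags that are genuine: {precision:.1%}")
# With these assumed numbers, roughly 99 out of 100 flags are innocent images.
```

Meanwhile, in reality...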

Meanwhile, in reality...
