Re: It's simpler than that
“it would be a stonkingly sophisticated AI that could make such distinctions”
Luckily it doesn’t have to be. The AI in this case simply compares the image against a database of known abuse images compiled by professionals, who vote to confirm illegality before an image is added. It is not a model trained to identify CSAM in general; it is simply asking ‘is this picture one already known to me?’ (where a match need not be bit-for-bit identical, since compression or cropping changes the file).
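The ‘known to me, but not binary identical’ matching is usually done with a perceptual hash. As a rough sketch (the real systems, e.g. PhotoDNA, use proprietary algorithms; this is an assumed aHash-style toy with images as 8×8 grayscale grids, not anyone’s actual implementation):

```python
def average_hash(pixels):
    """64-bit hash: each bit records whether a pixel is above the mean.

    Minor edits (compression noise, slight brightness shifts) flip few
    or no bits, so near-duplicates get near-identical hashes.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known(h, known_hashes, threshold=10):
    """'Is this picture one known to me?' -- near-match, not identity."""
    return any(hamming(h, k) <= threshold for k in known_hashes)

# A 'recompressed' copy with one damaged pixel still matches the
# original, while an unrelated image does not.
original = [[0] * 4 + [255] * 4 for _ in range(8)]
variant = [row[:] for row in original]
variant[0][7] = 0  # simulate compression damage
unrelated = [[255] * 4 + [0] * 4 for _ in range(8)]

print(matches_known(average_hash(variant), [average_hash(original)]))    # True
print(matches_known(average_hash(unrelated), [average_hash(original)]))  # False
```

The threshold is the tunable part: too tight and a cropped copy slips through, too loose and unrelated pictures start colliding, which is why the database side is curated by humans rather than left to the model.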