
About Apple's proposal
Apple's scheme [PDF] took care to address many of the issues raised by other posters. In particular, it was designed to have no impact on any aspect of end-to-end encryption. Its Achilles heel/Trojan horse, however, was that it flagged only images matching a pre-cooked list of hashes of known CSAM images supplied by some external agency. The independence of any such agency would be open to question: hashes for anything considered unacceptable for any reason could potentially be crowbarred in by the authorities. As a backstop, a review by (unfortunate) Apple personnel of the images that caused the threshold to be exceeded on a particular device was proposed before any report to the authorities. But nevertheless … Apple has in the past deferred to authorities in a variety of markets in a variety of situations.
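For illustration only, here is a minimal, hypothetical sketch of the threshold-gated blocklist idea. It is not Apple's actual design: that used a perceptual hash (NeuralHash) plus a private-set-intersection protocol, so neither the device nor the server learns about individual matches until the threshold is crossed. The hash function, threshold value, and names below are stand-ins.

    import hashlib

    # Blocklist of hashes supplied by an external agency (placeholder values).
    KNOWN_HASHES = {
        hashlib.sha256(b"example-known-image").hexdigest(),
    }
    REVIEW_THRESHOLD = 30  # hypothetical number of matches before human review

    def image_hash(image_bytes: bytes) -> str:
        # Stand-in: real systems use a perceptual hash that survives
        # resizing/recompression, not a cryptographic hash like SHA-256.
        return hashlib.sha256(image_bytes).hexdigest()

    def needs_human_review(images: list[bytes]) -> bool:
        # Review is triggered only when the count of blocklist matches
        # crosses the threshold; individual matches alone do nothing.
        matches = sum(1 for img in images if image_hash(img) in KNOWN_HASHES)
        return matches >= REVIEW_THRESHOLD

    # A library with a single matching image stays below the threshold.
    print(needs_human_review([b"example-known-image", b"holiday-photo"]))  # False

The point of the threshold is that no single match (whether a genuine hit, a false positive, or a planted hash) can by itself put a device's photos in front of a reviewer.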
As for the on-device detection of nudity in photos a child is about to share or receive, that is already present, but only in Anglo-Saxon-type markets, and only if a parent has got around to setting up a family group and enabling it.