Might be worth a shot ...
The word "Happy" in the headline threw me a bit, as there's no guarantee that 'normal' childhood pics are full of happy faces. But actually, it seems to be more about the overall situation the child is in, and presumably the subtle cues an AI could detect in the body language of the children and adults present: "To develop AI that can identify exploitative images, we need a very large number of children's photographs in everyday 'safe' contexts"
Worth a shot, I guess, even if it doesn't ultimately pan out.
However, I'm not sure whether the outcome for the reviewers will be any better - "Reviewing this horrific material can be a slow process and the constant exposure can cause significant psychological distress to investigators" - if the AI is filtering out the 'normal' pics, then the reviewers will be left seeing only the horrific ones.
Unless, of course, the idea is to let the AI make the decision on all images.