Re: 0th
If an AI decided that humanity needed to be saved from itself, might it decide that murdering a hundred million climate change deniers to halt their drag on progress was an acceptable price to pay for ensuring that seven billion people had a better opportunity for life, health and happiness?
(I expect I'm going to get downvoted by some more for my choice of subject matter than for anything else.)