The two previous respondents are missing the point.
Let me 'splain.
In sane places, you download the fixes for the flaws. You store them in a central place so as not to over-stress your Internet pipe. You then give your TEST servers a little poke, and they partake of the allegedly fixed software. You very carefully test whether this makes them fall over in interesting ways. If they don't, you give your PRODUCTION servers the same little poke and hope that you didn't miss anything.
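The sane workflow above can be sketched in a few lines. This is a toy model, not a real patch-management tool: the names (`Server`, `staged_rollout`, the KB numbers) are all hypothetical, and a real shop would use WSUS or similar rather than a script.

```python
# Toy model of staged patch deployment: patch the TEST ring first,
# and touch PRODUCTION only if the test servers survive.
PATCH_CACHE = ["KB123456", "KB234567"]  # fetched once into a central store

class Server:
    def __init__(self, name, role):
        self.name = name
        self.role = role          # "test" or "production"
        self.patches = []
        self.healthy = True       # pretend health probe result

    def apply_patches(self, patches):
        self.patches.extend(patches)

def staged_rollout(servers, patches, health_check):
    """Apply patches to the test ring; proceed to production only if all pass."""
    test_ring = [s for s in servers if s.role == "test"]
    prod_ring = [s for s in servers if s.role == "production"]

    for s in test_ring:
        s.apply_patches(patches)

    if not all(health_check(s) for s in test_ring):
        return False              # fell over in interesting ways; stop here

    for s in prod_ring:
        s.apply_patches(patches)
    return True

servers = [Server("t1", "test"), Server("p1", "production")]
ok = staged_rollout(servers, PATCH_CACHE, lambda s: s.healthy)
```

The point of the structure is the gate between the two loops: production is never touched unless every test server passed its health check.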
In insane places, every server wants to be connected to the Internet to function at all, and to ascertain that you're not a filthy pirate. As an aside, these boxes will then automatically download the MS fixes as soon as they become available, and politely ask you whether you want to reboot or yes. Only NOW, for people who have a screw loose and actually allow this to happen to a server they care about, is there an ADDITIONAL tool that PREVENTS this thing from happening, should you lack the wit to keep it from happening in the first place. Which will, of course, be automatically installed at the next automatic update. Are you with me so far?

Well, then. When six months have passed, the fixpack blocker will cease to block the fixes that you shouldn't have let the machines install in the first place, presumably causing them to be installed automatically after all, unless you have in the meantime had the notion to disable that behaviour in a more permanent way.
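To make the madness concrete, here is a toy model of the blocker behaviour just described: the blocker suppresses automatic installation only until it expires, after which the update goes in anyway; the only lasting defence is disabling auto-update itself. The dates, lifetime, and function are illustrative assumptions, not Microsoft's actual mechanism.

```python
# Toy model of a time-limited update blocker. The installation date and
# the six-month lifetime are made-up values for illustration.
from datetime import date, timedelta

BLOCKER_INSTALLED = date(2007, 1, 1)    # hypothetical install date
BLOCKER_LIFETIME = timedelta(days=180)  # "six months"

def update_installs(today, auto_update_enabled):
    """Does the unwanted update end up installed on `today`?"""
    if not auto_update_enabled:
        return False  # the permanent fix: auto-update is off entirely
    blocker_active = today < BLOCKER_INSTALLED + BLOCKER_LIFETIME
    return not blocker_active  # once the blocker expires, in it goes
```

Note the asymmetry: while the blocker is active the answer is "not yet", not "no"; only turning auto-update off changes the eventual outcome.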
And yes, I agree. That is madness.
But then again, I also think that putting a firewall on your machine to block people from haxoring into your system's less secure facilities should be a practice abandoned in favour of fixing the sodding leaks in the faulty facility, or of not running that facility in the first place.
So yes, while people are crowing that MS has here given us yet another feature to control the deployment of their shit onto our servers, I would argue that there should be only one way. One that works properly.