Different measures for different things
Kerckhoffs' principle applies to ciphers, and makes perfect sense: assuming the adversary knows everything but the key lets us build stronger ciphers than we otherwise would. Case in point: DVD's CSS. Of course, the NSA can, through its sheer size, provide enough knowledgeable eyeballs to debug and deflaw its ciphers, so it doesn't necessarily gain from wider dissemination of its algorithms; never mind that the rest of the machinery using those ciphers does the keeping-of-secrets thing well enough that even a weak cipher isn't that much of a problem. That wouldn't work out so well for most other crypto users.
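To make the principle concrete: with a modern published cipher, everything about the mechanism can be public and only the key needs guarding. A minimal sketch in Python, assuming the third-party "cryptography" package is available; none of this comes from the discussion above, it's purely an illustration.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # Kerckhoffs in practice: the algorithm (AES-GCM), the nonce and the
    # ciphertext may all be public; the 256-bit key is the only secret.
    key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, b"attack at dawn", None)
    assert AESGCM(key).decrypt(nonce, ciphertext, None) == b"attack at dawn"

Contrast that with CSS, where the secrecy of the algorithm was doing a large part of the work, and the whole scheme fell over once the algorithm leaked.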
That doesn't mean the principle applies everywhere, universally. Keeping the internal phone list internal can make it a little harder for an adversary to /social engineer/ an organisation that's aware of such threats. Most organisations, though, are oblivious, and then obtaining copies of whatever you want isn't that difficult anyway.
Or, you know, not talking AXFR with just anybody is one of those basic prudent precautions plenty of zone admins take, as it does basically no harm and might possibly do some good. On the other hand, just blocking ALL EBIL ICMP breaks a few important things and paints the security admin as indiscriminate and not very knowledgeable. There are a couple of ICMP types you shouldn't allow, and that's been known since September 1993 or so, but there are a couple that turn out to be important as well as plain useful. I'm not afraid to have public systems reply to echo requests (within limits, of course). For the few times it enables useful traceroutes (do not shorten the name, you silly person you) it is worth the negligible risk of someone learning something useful from the reply.
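For what it's worth, checking whether a zone hands out transfers to just anybody takes about a minute. A rough sketch in Python using the dnspython package; the server address and zone name are placeholders, not anything taken from the text above.

    import dns.query
    import dns.zone

    # Attempt a zone transfer as an arbitrary outsider; a sensibly configured
    # server refuses unless our address is on its allow-transfer list.
    try:
        zone = dns.zone.from_xfr(dns.query.xfr("192.0.2.53", "example.com"))
        print("AXFR allowed, %d names handed out" % len(zone.nodes))
    except Exception as exc:
        print("AXFR refused or failed:", exc)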
As for things like vulnerabilities, well, there's the simple "due diligence team vs. lone exploit searcher" calculation. It turns out that with both fishing the same large enough pool of as-yet-undiscovered exploits, the due diligence team is fighting a losing battle, simply because they have to find every hole the lone searcher might find, while the lone searcher only needs one they missed.
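A back-of-the-envelope version of that calculation, assuming both sides draw uniformly at random from the same pool of unknown bugs; the numbers are made up purely for illustration.

    from math import comb

    def attacker_wins(pool, team_finds, attacker_finds):
        # Probability the lone searcher ends up holding at least one hole
        # the due-diligence team missed, both drawing from the same pool.
        if attacker_finds > team_finds:
            return 1.0
        return 1 - comb(team_finds, attacker_finds) / comb(pool, attacker_finds)

    # Even a team that audits away 90 of 100 latent bugs still loses one
    # time in ten to a searcher who only needs a single hole.
    for found_by_team in (50, 80, 90, 99):
        print(found_by_team, round(attacker_wins(100, found_by_team, 1), 2))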
Disclosing may or may not help there, depending on what your goals are. What's most important is that plenty of companies have mishandled their reactions to reports, both publicly and privately, so badly that even responsible parties are suffering: some "researchers" no longer bother with prior notice at all, regardless of the vendor's or software project's track record.
I haven't bothered to read more than the first few lines of the abstract, as I didn't like the tone. But the most important thing in the whole debate is that you have to understand where each principle comes from and when to apply which. Otherwise the discussion sinks into yet another holy-war flame fest. I don't really need to read any reports to know it probably will anyway, which is a sad but not unexpected state of affairs in the IT security industry.