Phew! We're safe then!
I recently had to package up the latest CLI for 'Box' cloud storage so it could be deployed into a secure environment with restricted internet access.
It's perhaps an impressive feature of npm that it warns you about security bugs, but after it had pulled in well over 200 MB of JavaScript files in the form of well over a thousand dependencies, it warned me that the current package was vulnerable to something like 800 known CVEs! (I forget the exact number; maybe it was 300 and I'm exaggerating, but 300 or 800, the point still stands: this is utterly insane.)
There was an auto-fix option, which got part way through, borked the package, bombed out with a stack trace (if I recall correctly), and ultimately concluded that the vulnerabilities could be automatically reduced to under 100, but that going further would require upgrading various dependencies through potentially breaking changes. Sure, some responsibility has to fall on the vendor of the CLI code in question, but this problem is endemic to the general approach of software built with such tools.
I had to stick with it in its native form, warts and all. I was not at all happy about letting that near a secure environment, but with some damage-limitation measures in place it seems to be working, at least.
It's not just Node.js though. Certainly it gets its fair share, but so many other languages encourage this super-convenient (when it actually works) but highly risky approach of automatically downloading vast numbers of interdependent libraries from a similarly vast number of sources.
For me the very worst part of all of them is that they assume you have an internet connection, and make it anywhere from difficult to impossible to build against a simple local mirror, or to package up a complete set of everything needed to run the application reliably and fully offline.
People look at me like I'm mad when I suggest using a minimal set of carefully vetted external libraries, maximising the use of the chosen language's built-in features, and avoiding automated dependency resolvers like the plague... It certainly takes a bit more effort, but you end up with, say, a command-line interface to a REST service that might be just a few megabytes in size instead of nearly 300 MB, with only one or two external libraries, if any at all. Something like that is simple enough to audit, limited in function to pretty much only what it was designed for (how many times do devs pull in massive library X to use one single function out of hundreds!?), will run anywhere it's installed, and is generally easy to debug.
I'm not saying we should start reinventing the wheel every time, and it's clearly safer to use tried and tested code for things like encryption. But I am saying there's a balance between writing some bespoke code which happens to duplicate a sliver of a massive external library, and downloading half the ecosystem of your chosen language just to parse a config file or open a damn socket. In my opinion npm and its like go waaaay too far off the scale.