50 downloads each?
I expect that's the 80 AV companies downloading new npm packages looking for malware.
A group of more than 200 malicious npm packages targeting developers who use Microsoft Azure has been removed two days after they were made available to the public. Security firm JFrog on Wednesday said that earlier this week its automated analysis system began raising the alarm about dubious uploads to the npm Registry, the …
I take exception to the following and think it's lazy and bad journalism on El-Reg's part.
This involves using high version numbers (e.g. 99.10.9) in the hope that internal npm private proxies – set up for fetching code from an internal registry – are configured to look for new versions of existing packages first from the public npm Registry before falling back to the local registry.
That statement implies the internal repositories and private proxies are set up that way, when it's the project where versions are defined. All private proxies I've seen or managed just do what the software projects ask them to do. Typically, if the requested version is in the cache it will be served; if it's not, the proxy will attempt to fetch it.
The developers, NOT the devops teams are responsible for that one. In all projects I've been on the developers have maintained package.json
The repos aren't WORKING that way, the repos ARE BEING USED that way.
There's a difference between "whatever you have above version 2.0" and "I want the latest". The former will always be served from cache (so long as the cache has version 2.0 or later); the latter will go upstream and only serve from cache if there isn't a more recent version.
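To illustrate the difference, here is a hypothetical package.json fragment (the package names are made up for the example):

```json
{
  "dependencies": {
    "internal-widgets": ">=2.0.0",
    "some-helper": "latest"
  }
}
```

Per the behaviour described above, ">=2.0.0" can be satisfied from the proxy's cache, while "latest" forces a trip upstream – which is exactly where a planted 99.10.9 wins. Pinning exact versions (e.g. "2.3.1") sidesteps both.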
@Tessier-Ashpool Or do you want to get your project done on time?
Downloading and including malicious code will not get your project done on time.
Seriously, you would not release your own code without it being tested and checked, so why would you not do the same with unknown code from an external source?
Why? Because there is implied trust. You can’t even start a Visual Studio web project these days without it importing a plethora of third party nuget packages first, like certain JavaScript helpers.
From a security perspective, this is far from ideal. But it’s what happens routinely. Huge numbers of packages get downloaded millions of times by developers.
Let’s say you want, by this time tomorrow, to have an app that will compute distances between postcodes in a spreadsheet. Do you write the bulk of that code yourself taking weeks or months, or do you use one or more of the numerous helper packages to do the job? The vast majority of developers do the latter, and do trust by numbers. Package A has been downloaded a million times; must be safe, right? Of course not, but it’s what is practical. You have no easy way of knowing for sure that a package is benign.
"You can’t even start a Visual Studio web project these days without it importing a plethora of third party nuget packages first, like certain JavaScript helpers."
Yes you can, there are no nuget packages in the VS projects I work on. In fact we only use 2 pieces of open source code, one of which we compile ourselves.
Even with that small amount we still use Black Duck validation*. We only move to a later version if there is a need (say Black Duck has reported a security hole), and then only to a version that Black Duck has reported as safe.
But we still fully test before we release, as we do whenever we introduce a later version of anything, for example .NET.
* It was introduced for the web dev projects but applied to all projects.
Interesting re. Black Duck. Not heard of them before.
But IMO, there is no excuse for package repositories like npm or pypi to be hosting binary-only packages in the first place.
If they must host binary packages, then it should be forbidden for the maintainer to upload them in that form - they should be compiled by the repository (after checking for Hex BLOBs in the source) and for any package, it should be possible for an end-user to download-and-compile sources instead of download-and-install binary packages. A bit like `apt-get --build source [package]...`
That of course means that packages must list their build dependencies as well as their runtime dependencies, and those build dependencies should themselves be open-source. That might be a bit much for Microsoft (who now owns npm along with GitHub) to swallow, but they can get bent, as far as I'm concerned.
BTW, just had a look at your postcode problem; it seems straightforward to me. Convert the postcode into location data – here is one source for that: https://www.ordnancesurvey.co.uk/business-government/products/code-point-open.
If it's just the straight-line distance you want, then it's just some maths: Pythagoras, plus the curvature of the Earth. You can probably find the maths online.
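For the curvature part, the usual tool is the haversine formula. A minimal Python sketch, assuming you've already turned the postcodes into latitude/longitude (Code-Point Open actually gives OS grid coordinates, so a grid-to-lat/lon conversion step is assumed here):

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat = lat2 - lat1
    dlon = lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

# One degree of longitude along the equator is roughly 111 km
print(round(haversine_km(0, 0, 0, 1), 1))  # prints 111.2
```

Over postcode-sized distances, plain Pythagoras on the OS grid eastings/northings is simpler still and accurate enough.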
I was interested as my company has a GIS arm, my degree is in Computer and Mapping Science, and my first job after uni was in GIS software. ;)
Nope, just stop using Node! Same reason I don't really like Python that much either: 67 bolt-on packages downloaded from some numpty's repo in the arse-end of nowhere just to prop up an SFTP app! Some scumbag then hacks just one base core package and the house of cards comes a-tumblin' down.