
I have never been infected by malware!
-Sent from my iPhone
Get free viagra on theregister.co.uk
Software developers have been lionized in recent years for their influence over the information economy. At the Node Summit in San Francisco, California, on Wednesday, Guy Podjarny, CEO and cofounder of security biz Snyk, reminded an audience full of devs that they've become a popular vector for malware distribution. …
I probably have a very simplistic view of things, but at the same time several of my methods have often given me good results. And my idea is very simple: sysadmins and devs should be more trusting of each other and stop bickering over alleged pissing contests.
Now, sure: one size doesn't fit all, and I realize that, but sometimes when sysadmins suggest that devs use a more private network and don't get to browse the Net while they're working, it's not always because we want to throw our weight around: the idea is to help keep things safe and secure.
Of course the downside to all that is that some developers need public access. Sometimes problems can be solved by looking at examples of other ideas or similar situations, and those concerns should be addressed as well.
Yet unfortunately I've experienced too many situations where both parties weren't willing to meet somewhere in the middle (or maybe they were incapable? Sometimes it feels like you're dealing with children). And sometimes... people tend to forget that in the end we're all working for the same team.
That Internet access? Could also be passed through a secured proxy filtering server which checks all data and locks down at the slightest sign of trouble. No, you're not being monitored and filtered "because".
But as long as you can't break through that culture which apparently dictates that both devs and admins "know best" then I don't think you're going to change much anytime soon.
My view is simple: you are at work, so it is normal that you are monitored and filtered. Even if you are God's gift to programming.
Full disclosure: I may be a developer, but I am also a consultant. That means I develop on client sites (which include banks), and that means I have to be extra careful, when I have Internet access, not to click on a link that is possibly not work-related, because customer.
What developers can live without net access in this day and age?
Documentation, code samples and libraries -- it's all on the net. Heck, some of us are using (or are going to use) off-prem version control systems and build servers because internal IT are too lazy to administer our servers.
Reminds me of a friend who, as he turned up for his new job, was told about some ridiculous net restriction, upon which he simply turned around and left while the idiots tried negotiating an increase in his internet allowance. He realized he could probably get the ban lifted for himself, but he was convinced that they clearly did not know what they were doing, so he simply did not want to work there.
I would be very surprised if any higher-calibre developers would accept such working conditions. They must be paid an insane amount of money in that case.
Quote - "because internal IT are too lazy to administer our servers."
And there Ladies and Gentlebeans lies the problem!
1. 'Internal IT' are probably too busy to stand up your servers, based on your half-arsed requirements document, to your stupidly impossible timescale.
2. They are not *your* servers, they belong to the company and you don't get to do anything that you want on them.
I fully expect downvotes from the Developers on here, but I hope the SysAdmins recognise the 'Entitled Developer Syndrome' and can be bothered to upvote as well :-)
"That sentiment poses a particular problem for the Node.js community, where developers often rely on dozens or hundreds of code libraries (each of which may incorporate other libraries) written by someone else."
You mean ... developers often rely on links to dozens or hundreds of code libraries that can be modified after the fact by someone else, so even if they weren't a problem when the software was written they could become one if an attacker so chooses.
But this isn't actually a problem, because from the point of view of the end-user who runs the code, all JavaScript is untrusted code and therefore runs in a sandbox as a matter of course. (Well, OK, not quite all if you are the kind of person who has locally maintained apps written in JS. But I think that makes you rather unusual.)
"But this isn't actually a problem, because from the point of view of the end-user who runs the code, all JavaScript is untrusted code and therefore runs in a sandbox as a matter of course."
Not sure if you're being sarcastic or not...
Believing in sandboxes to hold all the shit at bay is laughable.
You do know Spectre and such were/are triggerable in JavaScript!!!
And I'm not sure a coinminer worries about being in a sandbox!!!...
"Not sure if your being sarcastic or not..."
I'm not. I take your point about sandboxes being permeable, but my point is that if your sandbox is permeable then it was game over for you as soon as you started surfing the web. Hardly anyone runs trusted code in their browser. It's all "whatever the web-site feeds me". There is nothing in today's story that makes this any more scary than it was yesterday. I think it is unfair to pick on the Node.js crowd.
Starting around 1996 and for another ten years or so, I had JS disabled in my browser (except for sites I trusted / had to trust).
I have since given up on that strategy, but not because I disagree with you. At some point, the benefits outweighed the risks.
The browser makers have taken steps to mitigate some of the Spectre concerns. My guess/hope is that this time the mitigations arrived before any exploits did.
But yeah, patch early and often.
But it's totally fair to pick on websites that run client-side code. I don't have a beef with node.js. I have a beef with Javascript, period. However, my solution is even better than sandboxing -- I allow nearly no Javascript to run at all (I have narrow exceptions for sites that I find particularly important). If a site is not critical to me and won't run properly without allowing Javascript, I simply don't use that site.
Who would guess importing shit directly from places not under your control would be a bad idea?
That would be me and any sane developer. I fucking loathe web sites that my blockers (note: multiple!!) show loading from hundreds of external sites, 99% of which are fucking tracking!!
Which fucking idiot thought NPM would be a good idea? Ah yes, that would be the lazy fuckers who can't really do the job properly but argue and fawn over the latest hyped circle-jerk idea and how you should use such and such pattern!! For fuck's sake, if you know how to code you don't need a fucking pattern, you need to know which algorithm to use! Then you actually know how to code problems and don't panic when there is no pattern for it.
A lot of it is about avoiding the hazards at the other end of the spectrum.
Imagine, if you will, you go to a garage to get your car fixed. The car mechanic is surrounded by piles and piles of wrenches, drills, pry-bars and all the other tools of the mechanic's trade. He takes one look at your car and says "Ah, a Ford. I know how these work", grabs a length of mild steel and fires up the milling machine.
That's expensive, time consuming, and when it goes wrong it's on *you*, Genius Coder. Running other people's code makes the errors their fault. ;-)
"I figure you get the same sort of "industry standard, Gov" protection that comes from, say, buying IBM.
If it's stupid but everyone does it, it's a different class of screw up."
I'd class that as following stupid sheep!! Bit like Brexit!!
Same class as your mum asking: if everyone was jumping off a cliff, would you follow?
Regarding the cliff there is a simple answer: it depends on why everybody is jumping off the cliff. If, for example, a pyroclastic cloud from a nearby volcano is approaching from the land side, then I would definitely bet my life on the jump if there appears to be no other way out.
Always look at the whole picture. Sometimes it can be wise not to run with the sheep. Sometimes it can be stupid. Try to avoid absolute words like "is", "every", "none", "always" or "never" in the guidelines for your actions, because those words tend to lead to stupid behaviour.
Right at this moment, there are dozens of suspicious looking files in my home directory on my laptop. A bunch of cryptically named .xls and .mdf files, even though I don't even have Excel or Access installed.
Who put them there? Malware you think?
That was my first hunch and I spent about an hour troubleshooting this. Turns out my brilliant benevolent sysadmin installed a honeypot system that relies on these files to detect malware...
All the anti-malware they have forced me to run for the past nine years has caught nothing. Nada. Zilch. Zip. It has, however, caused me quite a lot of needless headache.
Sysadmins need to STFU and let me get on with my work.
I'm very, very sympathetic to your argument here.
On the other hand, years ago (way back in the stone ages when you sold software by manufacturing CDs and putting them on retail store shelves) I worked for a company that managed to include an active virus in the software it sold. They had no idea until an infected customer scanned the CD he'd bought and complained.
A vigilant and highly annoying sysadmin would have saved that company millions of dollars and quite a lot of bad press, not to mention saving the company's customers from infection.