RE: Stefan Frei, research analyst director.
Can you confirm the author has actually met said analyst, and that it isn't just a lazy attempt at a pseudonym?
The gap between software patched by IT departments and the applications cyber-criminals actually target is leaving organisations at a greater risk of attack. And despite system administrators' efforts to keep Microsoft-supplied packages up to date, non-Redmond software is almost exclusively responsible for the growth in …
"Stefan Frei (born 20 April 1986 in Altstätten) is a Swiss footballer who currently plays for Toronto FC in Major League Soccer."
Probably not the same one, and it's barely possible that Stefan's parents are fans of the 1981 Cambridge Footlights Revue, or later work such as "Happy Families" written by Ben Elton, in which Stephen Fry played the extraordinary family doctor.
The course of action should be
A) fully and quickly patching all installed applications.
B) openly punishing vendors of insecure stuff (such as Adobe) by uninstalling their wares and telling them so.
Currently, Adobe is doing extremely well despite the fact that their software is a major spearphishing vector. Apparently nobody cares to act on their negligence.
One thing you don't want to do as a corporation is roll out untested patches to applications. You could, potentially, end up causing a problem that requires hundreds (if not thousands) of PCs to be fixed.
If (as we do where I work) you support a lot of applications on a lot of PCs, you could also break one or more of the other apps, if they happen to rely on a system setting or file that the patch changes.
So, you have to balance the risk (and potential cost) of one or more security breaches against the risk of patching all your machines and cost of testing and installing that patch.
Of course, any halfway decent corporate networking/IT dept will have all their mission-critical systems hidden behind a decent firewall, which mitigates the threat of not installing patches somewhat (but not totally).
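The balancing act described above can be put into toy numbers. A minimal sketch (all figures invented for illustration, not real breach data): patch when the expected cost of a breach exceeds the cost of testing and rolling out the patch.

```python
# Toy expected-cost comparison for the patch-or-not decision.
# All numbers below are invented for illustration.

def should_patch(breach_probability: float, breach_cost: float,
                 patch_cost: float) -> bool:
    """Patch when the expected breach cost exceeds the patch cost."""
    return breach_probability * breach_cost > patch_cost

# A 5% chance of a 200k breach (expected cost 10k) vs 4k to test and deploy:
print(should_patch(0.05, 200_000, 4_000))   # -> True
# A 0.1% chance of the same breach (expected cost 200) doesn't justify 4k:
print(should_patch(0.001, 200_000, 4_000))  # -> False
```

Real risk assessment is messier than a single multiplication, of course, but the trade-off the commenter describes has exactly this shape.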
If you think your firewall is protecting you against unpatched Acrobat or other endpoint software, you are misunderstanding the threat.
If you answer yes to both of the following, you are at risk:
1) Users have email access that allows them to receive email from outside of your organisation
2) Same users have internet access.
Spearphisher sends a well-crafted email with a payload in an Adobe file. End user opens it and Robert's your father's brother: the host is compromised, from the inside. Now you've got a whole world of pain in your trusted, unpatched inner sanctum.
The hackers don't give a rat's arse about hacking the mission-critical stuff directly when they can compromise a badly patched endpoint that has authorised access to the mission-critical system. Job done.
Patching is expensive, of that there is no doubt. Misunderstanding what to patch and the relevance of firewalls in certain situations is priceless though.
Paragraph 1: A+
Paragraph 2: A+
Paragraph 3: A
Paragraph 4: D-
Firewalls offer some protection, but despite all the good points in paragraphs 1 to 3, the only way to protect the network is to prioritize the testing and deployment of patches while keeping the firewall, AV, and anti-malware stuff up to date. Having a good and properly patched proxy server/cluster in the mix would probably help too, but being as I've never worked the help desk anywhere that has one, I can't speak to its efficacy from experience.
This is actually what we do, and, TBH, I would always recommend up to date security systems, be they AV, Anti Malware or Firewalls.
I also realise that a firewall isn't a total solution, and I have not pretended it is (hence I used the word "mitigate" rather than "remove"), but most companies would want to weigh the cost of testing and patching against the cost of the possible consequences of not patching.
But I was trying to make the point that as a corporate IT person, you don't want to roll out untested patches.
Their course of action should be:
A) Assess impact of bug.
B) Book test platform.
C) Install patch on test platform.
D) Test patch.
E) Devise rollout plan for patch which should include:
1) Success criteria
2) Fail criteria
3) Rollback plan
4) Latest time at which rollback can be initiated.
F) Raise Change Request for affected platform.
G) Implement Change Request.
H) Assess success/failure of the CR.
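Step E of the checklist above can be sketched as a small data structure. A minimal sketch, with hypothetical names invented for illustration (real change-management tooling varies widely): record the criteria and refuse a rollback once the latest-rollback time has passed.

```python
# Minimal sketch of the rollout plan in step E. Class and field names
# are hypothetical; real change-management systems differ.
from datetime import datetime

class RolloutPlan:
    def __init__(self, success_criteria, fail_criteria, rollback_deadline):
        self.success_criteria = success_criteria    # E1
        self.fail_criteria = fail_criteria          # E2
        self.rollback_deadline = rollback_deadline  # E4: latest rollback time

    def can_roll_back(self, now: datetime) -> bool:
        """E3/E4: a rollback may only be initiated before the deadline."""
        return now <= self.rollback_deadline

plan = RolloutPlan(
    success_criteria=["application launches", "documents print correctly"],
    fail_criteria=["application crashes on start"],
    rollback_deadline=datetime(2012, 3, 2, 17, 0),
)
print(plan.can_roll_back(datetime(2012, 3, 2, 9, 0)))  # -> True
print(plan.can_roll_back(datetime(2012, 3, 3, 9, 0)))  # -> False
```

The point of writing E4 down up front is exactly what the deadline check enforces: past a certain hour, rolling back is more disruptive than pressing on.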
From my own experience, the main things that hold back upgrading software in an enterprise environment are testing and management. The testing takes too long, and often not enough is known about what software is out there or how it reacts. For example, my old company, the largest bank in the country, was still running Java 1.4 on several machines while 1.5 was the norm. Upgrading all the desktops to 1.6 at the latest revision takes so much regression testing, and the test windows are so long, that by the time you have everything ready for the upgrade you're three revisions behind.
Then there's the inadequate testing from vendors themselves. Adobe Reader X, for example, an application that had previously been a simple package-and-drop app needing little or no testing, suddenly broke over 1,500 desktops when we deployed it, and we had to roll back to 9.4 (or whatever). And those are just two examples. Vendors need to test better and release code that is in a fit state and doesn't need patching every two days, and they could help companies by providing more detailed change logs and guidance on what testing will be required.
...in which nobody, including vendors, *has* to do anything. Neither does a sysadmin.
If you don't like a particular vendor's approach to testing, you're free to choose another.
If an organisation doesn't like updating software as patches become available, it *has* to deal with broken desktops.
It's really that simple.
If an organisation wants to do the right thing, it will. If a sysadmin is frustrated because his hands are tied by an organisation not interested in doing the right thing, then he has the opportunity to:
a). Shine by devising a patch strategy, taking that as far up the chain as he can to get a result
b). Go and work somewhere else
c). Change career path to something less frustrating
d). Accept things as they are.
Artwork is often done in Adobe software. When it's emailed to my users, they need to be 100% positive that it's perfect: not a shade different, not a jagged edge, not a millimetre different from the original.
As good as other 3rd party software is, and as much as I truly hate Adobe and their flakey, swiss cheese impersonating software - we are a service provider to the business, not the people that can outlaw software just because we think it's a bit shit.
So why whine about these problems? Choosing MS Windows entails so many problems and inconveniences. On my GNU/Linux systems I have no problem with "the accepted means of interchange." My PDF readers (evince, xpdf, kpdf/okular and Emacs doc-view) perform much better than Adobe's own reader. Evince, okular and doc-view can handle more formats than the latter, like DVI, DjVu and PS. And, more importantly, no problem of untracked patching exists. Repositories are maintained by professionals, and notifications and upgrades become available almost right away.
Spoken like someone who hasn't ever actually done anything other than print from a locally attached printer on his desk.
Printing is really, really complex. I ran a project to replace all the printers at a FTSE 100 UK financial company with centralised MFDs. It was a complete nightmare, and that was with officially supported software and hardware. CUPS just didn't cut it; it was way too flaky, probably a driver issue, but nonetheless the cost of the machines we were using made the cost of serving from Windows pale into insignificance, so that wasn't an issue.
You're speaking like someone "having no clue" about the subject.
Firstly, the question was not about CUPS as a server, but as a client-side app. Moreover, it was whether one should blame a PDF reader for printing. Any *nix PDF reader uses CUPS for pdf-to-ps conversion; the PostScript is then either understood directly by a PostScript printer or translated to the language the printer understands through Ghostscript and cups-raster. Thanks to the good part of Adobe, both PostScript and PDF are open and well documented. Good drivers and foomatic are available, and most printers are very good nowadays.
Secondly, as a Windie admin you're making an elephant out of a mosquito. I had to manage some printing administration (along with other duties) at our university, and CUPS behaves very well as both the client and the server. Problems and issues may arise, but CUPS is a much better yet simpler system to work with.
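The pdf-to-ps step in that pipeline can be illustrated. A minimal sketch, assuming Ghostscript's `ps2write` output device; the function only builds the command line rather than executing it, since the exact filter chain (pdftops vs. Ghostscript vs. cups-raster) varies by distro and CUPS version.

```python
# Sketch of the PDF -> PostScript step in the *nix print path described
# above. Builds a Ghostscript command line without executing it; the
# actual CUPS filter chain varies by distro and printer driver.

def pdf_to_ps_command(pdf_path: str, ps_path: str) -> list:
    return [
        "gs",
        "-dBATCH", "-dNOPAUSE", "-dSAFER",  # non-interactive, sandboxed
        "-sDEVICE=ps2write",                # PostScript output device
        f"-sOutputFile={ps_path}",
        pdf_path,
    ]

print(pdf_to_ps_command("report.pdf", "report.ps"))
```

From there a PostScript printer consumes the output directly, while non-PostScript printers get a further translation via Ghostscript drivers or cups-raster, as the comment says.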
there was an alternative system, like an OS that had all the software packages within some form of... oh, let's call it "a repository", which would allow you to update your desktops and servers in a planned manner, having first gone through some sort of change-control process. Maybe one where you could find alternatives like xPDF/Evince or OpenJDK, where the underlying OS would be supported for 10 years with security backports, didn't demand hardware refreshes every 3 years, didn't seem to have problems with cruft requiring reinstallation, and had proper privilege separation.
If only...
The point is, any application is a threat vector. ANY application, regardless of OS.
So I shall continue to laugh at people who think "oh I'm on linux/OSX/whatever and therefore I am a) safe and b) MORALLY SUPERIOR to anyone on 'doze".
Because the real threat vector is those idiots. Stupidity is the largest possible threat to security.
While I wouldn't laugh at Linux, you are actually right about those who are arrogant enough to believe their systems are invulnerable purely because they are based on a particular OS.
Reminds me of a discussion that pops up from time to time on a forum I manage. We sometimes get users saying that because they have "properly" secured their systems and are careful about sites they visit and the emails they open, they don't need an AV installed.
While I agree that an AV slows down a system, my usual answer to them is to ask how they know their machines aren't infected.
Personally, I believe the only OS or software that is invulnerable is that which is installed on a machine with good physical security around it, and which is not connected in any way to the internet. Of course, such a beast is a rare thing nowadays.
So, you extrapolate from the MS Windows experience. This extrapolation might not hold; don't you see that? Once again, the security issues we are commenting on right now for Windows and for Linux/BSD belong to very different worlds. One has central repos/ports; the other lacks them. Period. Now continue with your extrapolations from there.
It's nice that you keep supporting linux, but yet again you are wrong about Linux and Windows in the same comment. You then go on to suggest that you actually might see these issues in Linux, but it would be the fault of the user. Kind of like it is in Windows.
You also betray the mindset which will directly lead to the systems you use being hacked. "It's not Windows, which is shit; it's Linux, which is invulnerable!" will lead you to a level of complacency which may well result in your systems' downfall.
It's nice that you keep telling me that I am wrong. My cowardly friend, can you please tell the article's author how wrong he is, since my conclusion was (partly) based on the mess Windows admins have to cope with.
I'd like to say it for the 199th time: the levels of possible vulnerability for MS Windows and an average Linux distro are very different. Say, my intelligent mouse-clicking expert, I've never heard the Red Hat, FreeBSD Foundation, Debian or Canonical security people suggest not clicking on "bad" links, not visiting "dangerous" sites and so on. Linux and BSD admins do not complain about a lack of security-fix notifications. On the contrary, there's competition between the distros over who sends them first, for every package they maintain (roughly 30,000 in Debian).
"I'd like to tell it in the 199th time that the level of possible vulnerabilities for MS WIndows and an average Linux distro are very different. Say, my intelligent mouse-clicking expert, I've never heard that the RedHat, FreeBSD foundation, Debian or Canonical security people would suggest to not click on the "bad" links, to not visit "dangerous" sites and so on. "
No, but you have no doubt heard recently about internet-connected systems being hacked: cert authorities, Cryptome, Linux repositories, etc. I guess they were all running Windows, were they?
It's the meatsack that's the security issue in all cases. You know, a linux system with its security levels, modularity etc is only as safe as the admin made it. Bad admin, bad security. Windows (as in just the OS) security is pretty damn good these days.
In any case, "just switch everything to linux" is a retarded idea.
Suppose you, with 10 years of experience in various linux flavours, are suddenly told everything is going Windows/Active Directory. Will you secure it well?
No. Because you don't know what you're doing.
The same is true for somebody who can design and implement a secure Active Directory forest - at least as secure as anything you can create. He can't just switch over to linux and magically make everything "better". And linux-based networks he puts together will be crap, just as any windows-based networks you put together will be crap.
Is it possible to create a secure Windows network and desktop? Absolutely. Try working some of the places I've worked, the military establishments, the defence ministries, the high finance houses. Windows has been evolved to sell into those houses.
All security is a trade-off against convenience. Those people don't care about convenience, and their networks are solid. Could that be done with Linux? Quite probably, yes. Should it be? Only if you want to spend far more than the cost price of a Windows environment on retraining.
Could it be done _better_? Almost certainly not.
>>cert authorities
I did hear about the infamous Windows-running DigiNotar; not sure what is used by the rest.
I also heard about the multitudinous victims of stuxnet, conficker etc.
When you install a GNU/Linux box, like Ubuntu, with the vanilla desktop setting, the only security measures you have to follow are: choose a strong password, keep it in a secure location, preferably in your head (I hear a lot of Windie admins and users are enraged by this very "infeasible" task), and regularly follow the security update notifications. It looks MUCH easier to me than "install a good AV, do not click on unknown web links or visit infected websites, do not open unidentified strange emails or insert 'dirty' media, do not install from unknown sites, etc."
if only you could rely on the developers not to break working applications from one release to the next. If only the devs would keep config files in the same locations, and if only they would have consistent dependencies from one release to the next.
Desktop management hell
Linux servers on the other hand are brilliant things to work with.
I agree if you wanted to use Fedora for your desktop, but you just wouldn't in an enterprise environment. I have been happily using CentOS 5 (5.0 to 5.7; gotta update to stay safe) for *years* on the desktop (the conf files haven't changed location in all that time and it has all "just worked"), and I'm now migrating to CentOS 6 (6.2). Based on what is happening in Fedora, I am really looking forward to RHEL/CentOS 7 when it comes out the door.
We use CentOS because we can support it ourselves; for those without internal support, Red Hat will provide that.
Compared to being stuck on XP with no hope of upgrading, either because of a lack of Win7 drivers for old hardware or the boxen being "too slow" (a forklift upgrade... in this economic climate?), I'll take Linux on the desktop today.
Other than games, I don't see why anyone would want to stay with Windows.
If you have business apps that need XP, then use your current licences to run an XP virtual machine; at least your host OS will then be updateable after XP goes EOL. After all, VT-x has been around since 2005.
Slightly wrong. Some of the files have changed, but some of the changes have been made by making the original file (probably in /etc) a link to the new configuration file. The other bastardisation is to add a new record type, which is a pointer to the new config file(s), and move all of the real configuration data into the new files.
"if only there was an alternative system..."
One that is fast, stable, secure, has centralised development, is copyfree, comes with good documentation, has mature minded community support, and has variants which run well on both servers, desktops and laptops for the novice and pro alike?.. That would /have/ to be BSD then, eh? ;P
I really don't understand all the fuss and hype about Linux. (I am, of course, assuming that you were actually referring to Linux).
(Where's the Beastie icon?)
"non-Redmond software is almost exclusively responsible for the growth in vulnerabilities .. if PCs are running older installations of Adobe Acrobat then systems can easily become compromised by targeted attack"
When an application makes a call to the API that results in a security breach, is this a vulnerability in the app or the OS? For instance, opening a PDF that makes a call to the API.
If the API call results in an elevation of privileges, it is the OS's fault, fair and square, but the implications of the article are that 90% of attacks are not of this nature.
If it merely uses the current user's privileges to perform a malicious action, it is the App's fault. You can do quite a lot of damage without elevated privileges. All phishing attacks, for example, fall into this category and "emptying my bank account" is quite a lot of damage. Depending on network security, you might also be able to transmit company secrets to an IP address in Shanghai. (See the Nortel story.)
@Eulampois - You can deliver packaged updates to Windows machines, you do know that don't you?
Also - the repo is not a panacea, it's not uncommon for a repo to break software from other repos, or software which has been installed manually - such as commercial software.
As for most apps being on the most popular OS, well, err, Duh!
You can, you can, but who actually does? Guess what: every Linux distro does it.
As for "the repo is not a panacea and there are issues" (I can't remember having any myself), it sounds like this. I drive my manual 4.0 Cherokee and see you, anonymously and cowardly, mending a flat tyre on your old rusty scooter. When I say "hello", you grumble back at me: "eulampios, your Jeep is not a Merc or a BMW." Sure, but I can drive farther and faster than you.
Errr, you're not really answering his post, which was that updates from a repo can break manually installed software. You seem not to care. Take that attitude into a corporate environment (which you clearly don't work in) and watch your sorry arse get bounced out of the front door post-haste.
>>Errr, you're not really answering his post which was that updates from a repo can break manually installed software.
Try not to install proprietary crap manually in a corporate environment, period. If it's absolutely necessary, make packages yourself and maintain an extra repository of your own to get updates from.
>>Every home user that I know (who has bothered to setup MS updates) has packaged updates to their OS.
Name that great packaging manager to me! Well, I admit you're much luckier than I am. 1) They install packages directly from who-knows-which websites; 2) signatures and integrity checks may or may not be present; 3) update notifications are most probably not available.
Again: "My Jeep is more reliable than your ever-broken scooter."
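Point 2 in the list above is mechanical to verify. A minimal sketch of what a repo's package manager does for you automatically: compare a downloaded payload against a published SHA-256 checksum before installing.

```python
# Sketch of the integrity check in point (2): compare a download's
# SHA-256 digest against the checksum published alongside the package.
import hashlib

def checksum_ok(payload: bytes, expected_sha256: str) -> bool:
    return hashlib.sha256(payload).hexdigest() == expected_sha256

good = hashlib.sha256(b"package contents").hexdigest()
print(checksum_ok(b"package contents", good))   # -> True
print(checksum_ok(b"tampered contents", good))  # -> False
```

A checksum only proves integrity, not authorship; that's what the signature in point (2) adds on top, via the repo's signing key.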
as long as it cannot trash the base operating system on my machine. Just run the brilliant portable versions of those apps you mention, lock down the rest of the machine and let's move on with our lives. This was Microsoft's single biggest mistake of all time, one that came back to haunt them year after year: to specifically require, or at least allow, a user-space application full access to privileged areas of the operating system. Even the *nix world has been bitten by this kind of vulnerability, but they managed to clean up their act a long time ago. Microsoft, on the other hand, did not. Why do I bloody need to be an administrator to install any application? Why the hell does a flaw in Adobe Reader or Flash Player allow privilege escalation? Why do simple acts like displaying a document or rendering a font have to endanger the whole OS?
"And despite system administrators' efforts to keep Microsoft-supplied packages up to date, non-Redmond software is almost exclusively responsible for the growth in vulnerabilities."
Do you mean, "Because of system administrators' efforts to keep Microsoft-supplied packages up to date, non-Redmond software is almost exclusively responsible for the growth in vulnerabilities" ?
Now we just need the same level of diligence for Flash and JRE updates and Bob's your Granny.
Since we moved away from Sage Line 500, most of our users don't need Java installed anymore, so I've just stopped installing it by default.*
That's the best way to solve the problem. Now if only I could do that with Flash, without all our users just installing it themselves anyway...
*(especially as of course, Sage was only supported if our user was running 6u7, newer versions would work fine, but if I needed support then I'd be forced to downgrade)