Sorry, off topic but I can't resist
Yes, I know it is supposed to represent the Golden Gate Bridge but I wonder if the B-Sides San Francisco security conference is going to get sued by Atari for using Atari's logo not once, but twice?
Comfortable illusions about how security is working are crippling the ability of government and industry to fight the threat, a former member of the FBI’s netsec team has told the BSides San Francisco 2017 security conference. Society is operating under the illusion that governments and corporations are taking rational choices …
And Cisco lost out.
A few "housekeeping" points (sorry not wishing to be persnickety but...):
It is CSIRO (Commonwealth Scientific and Industrial Research Organisation); the CSIRO logo is actually a series of vertical bars which echo the shape of the Australian continent, not the *Sydney* Harbour Bridge; and Cisco was trying to block CSIRO from trademarking its logo because of the alleged similarity to Cisco's own.
Sorry, it's one of those acronyms that is impossible to type right, even when you know what it stands for and say it to yourself.
Still better than when they renamed SERC to PPARC and put the name on all the glass doors at the new office - without realising what the letters spelled backwards.
The veneer of knowledge and capability is very thin.
Ooh, we can identify Facebook games even through HTTPS! Yeah, big deal. Now, about the real threats...
We need policy-based endpoint protection based on application profiles, not user rights. That's inbound/outbound network access, peripheral and storage access and we need better alerting.
We need to stop playing the fool's catch-up game of scanning for malware and make sure that a compromised application isn't allowed to mess with anything else on the system. I know we've got EMET et al, but it needs to be baked into the OS and packaged with the application, not bolted on as an afterthought.
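The per-application profile idea above can be sketched in a few lines. This is a minimal illustration only - all names (`AppProfile`, `check_access`, the hosts and paths) are hypothetical, and a real enforcer would hook the OS rather than live in userland Python:

```python
# Minimal sketch (all names hypothetical) of deny-by-default,
# per-application policy: each application ships a profile declaring
# what it may touch; anything outside the profile is denied and alerted on.

from dataclasses import dataclass, field

@dataclass
class AppProfile:
    name: str
    outbound_hosts: set = field(default_factory=set)   # allowed network destinations
    writable_paths: set = field(default_factory=set)   # allowed storage locations
    peripherals: set = field(default_factory=set)      # e.g. {"camera", "usb"}

def check_access(profile: AppProfile, kind: str, target: str) -> bool:
    """Deny-by-default check; a real implementation hooks the OS, not Python."""
    allowed = {
        "net": profile.outbound_hosts,
        "file": profile.writable_paths,
        "peripheral": profile.peripherals,
    }.get(kind, set())
    ok = target in allowed
    if not ok:
        # This is where the "better alerting" the comment asks for plugs in.
        print(f"ALERT: {profile.name} denied {kind} access to {target}")
    return ok

# A mail client gets exactly its mail servers and its mail directory.
mail = AppProfile("mailclient",
                  outbound_hosts={"imap.example.com", "smtp.example.com"},
                  writable_paths={"/home/user/Mail"})
```

The point of the sketch is the shape: rights attach to the application's declared profile, not to the user running it, so a compromised mail client still can't reach an arbitrary host or write outside its mailbox.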
An informed, realistic piece from someone who was, sort of, in the trenches. This will disturb the PHB class. The market droids will get spooked, and techies will be lectured about some irrelevant pet peeve of the CIO du jour or, worse, the HR process droids. No marketing spiel either. What is the world coming to?
The problem is IT has terrible engineering practices, which is caused not by the IT workers but by collusion between government and business to keep it that way. Look at buildings: blueprints available for public inspection. Third-party inspectors. Regulations and guidelines. A culture of safety. Buildings don't fall over anymore, and even the smallest defect is thoroughly investigated and understood, so lessons from each failure can be passed on.
Then there's us: Everything is black boxed. Source code not available for inspection by anyone, even the government (before you say they can, *which* government?). Poorly documented APIs. There is no code reuse. We reinvent the wheel constantly because of patent and copyright law. And the end result is self-immolating phones, operating systems that routinely crash, a losing fight with hackers because we keep making the same mistakes -- and why is that? Because everyone works in isolation. We don't share knowledge, we don't have a single playbook to work off of. Everyone can only rise to their own ability, without the benefit of peer review or the chance to stand on the shoulders of those who came before.
And people wonder why everything is on fire. Guys, security is hopeless as long as we don't practice proper engineering. Proper engineering will lead to all the changes that we need and more. But you're content with a total lack of auditing, even in critical infrastructure systems, and a legal system actively hostile towards every best practice there is in engineering.
Quote: "Guys, security is hopeless as long as we don't practice proper engineering."
While this quote is true, what is not mentioned is that the NSA, GCHQ and other government spooks (in Russia, China and elsewhere) quite like it that way.
Ask yourself if the NSA or GCHQ could do all that illegal hoovering if security was based on "proper engineering".
Ask yourself if GCHQ would get an extra 1.9 billion pounds sterling for their budget if the world was full of "proper engineering". The spooks at GCHQ are just like everyone else who would love a bigger budget.
Given all of the above, is anyone surprised that governments show absolutely no interest in "proper engineering" in the IT security arena?
Believe it or not, the NSA is not staffed entirely by idiots who haven't noticed that buggy, insecure software is a double-edged sword. Implementation details vary from country to country, but I believe most countries nowadays have some sort of national CERT or awareness-raising organisation, at a greater or lesser degree of arm's length from the CNE people in the traditional intel-gathering orgs, charged with improving national security levels. Undoubtedly the CNE people hoard 0day, but they're under considerable pressure to follow responsible disclosure to vendors these days.
I absolutely agree that the problem is a lack of proper engineering. But I feel you have missed an important - perhaps the most important - factor. The quality control in IT is cost driven by management who are both highly focussed on short-term gains and, very often, have quite limited technical knowledge. They hoover up bonuses for shaving a few per cent off operating costs, and push off to the next company absolutely free of any conception of, and more importantly any liability for, the reduction in quality that produced the "efficiency savings".
In proper engineering, management would not be able to get away with switching to cheaper materials, replacing all expert staff with apprentices or abandoning whole rafts of testing processes.
But as you surely know, even if every software vendor or project spent 10x, or 100x, more effort on catching security bugs, they would catch a lot of bugs that way, but it certainly wouldn't mean vulnerability-free software. Microsoft have made enormous strides since the famous BillG memo in 2002, but you'll notice there are still 8-12 security updates every month. (I will, perhaps, grudgingly admit that perhaps fewer are RCE and more are local DoS...)
Anyway, the point is that doing more of what we're doing already is not the answer. It would help, and it would improve things, but we would still have orgs getting compromised left right and centre; don't kid yourself.
As noted generally, there is no holistic, regulatory approach to security - and with the emergence of IoT and the unplanned evolution of the World-Wide Robot, this can only mean bad things ahead.
As Bruce Schneier has articulated, until we get a lead government agency to regulate cyber efforts - and this will ultimately have to be an international effort - things are looking grim. The current government agencies committed to cyber-defense have no interest in advertising the leaks they can exploit to unlock the keys to the corporate castles. Meanwhile, companies making cyber products treat security as an externality: they don't pay for problems, their customers do. So if we want to move forward there needs to be a consensus on the problem scope and a will to address this proactively instead of reactively - where we inevitably get very bad knee-jerk policies. Obviously our rational and thoughtful response to climate change is a harbinger of bad times ahead for cyber defense.
This post has been deleted by its author
This is what makes the term "Software Engineer" such a joke. Does anyone using that title, actually have a PE? Does IEEE have standards for "Software Engineer" like they do for other engineers?
You can get specs on a building, can you get them for the new car? or new smartphone?
When you build a bridge, it is used that way for decades. What is the last piece of software you used that is over a decade old without a change? How about 50 years old? We have bridges that are CENTURIES old without a change. It is the changeability that makes software more difficult.
This is what makes the term "Software Engineer" such a joke. Does anyone using that title, actually have a PE?
I think I met one, once, in 50 years of DP/MIS/IT (same s**t, different labels). OTOH, his PE was in Mechanical engineering. He just "followed the money" into SW.
You can get specs on a building.
Tell that to the folks in the Millennium Tower in SF. Many balls were dropped in the making of that fiasco.
"When you build a bridge, it is used that way for decades. What is the last piece of software you have used that is over a decade old without a change? How about 50 years old, we have bridges that are CENTURIES old without a change. It is the changeability that makes software more difficult."
But they are not maintenance free.
Metal-based bridges get a coat of paint to stop the rust (Paul Hogan's (Crocodile Dundee) day job before he made it in showbiz was, literally, painting the Sydney Harbour Bridge).
All bridges, metal, stone or timber get some level of maintenance, paint, replacing rotten timbers, replacing crumbling masonry, replacing rusting beams, replacing footings, pouring new cement, replacing cable-stays, road surfaces, etc etc.
If you exclude feature releases (which engineered structures also get: spires or antennas added, new lanes, adding or removing a railroad across a bridge, extensions, changing the layout, e.g. knocking out a couple of floors to make a two- or three-floor-high atrium in an existing building), security patches and bug-fixes are the equivalent of maintenance. What's the difference between replacing a crumbling stone in a bridge (that stone was too soft, for example) and issuing a security patch that plugs a hole?
While I agree that "software engineering" is stretching the definition of engineering, so is saying that engineers build something and it just stays there magically by itself with no maintenance in perfect condition for years, let alone centuries.
Take, for example, the classic, literally textbook, Tacoma Narrows Bridge. That was engineered so well that it would twist in the wind and be too dangerous to drive across, and eventually, only four months after construction, it twisted itself into complete collapse.
Civil engineering has had millennia to get it right, and still fucks up big-time. Software engineering, as a discipline, has only been around for 50 or so years.
But you're content with a total lack of auditing, even in critical infrastructure systems, and a legal system actively hostile towards every best practice there is in engineering.
I don't think it's the IT people at the coal face. It's corporate culture and profit motive running things. We can't share where there's lawyers in the water waiting to file lawsuits.
IT started out with "sharing" and discussing but the lawyers and greed have killed that. Pretty sad situation overall that you can't call someone and say "hey, I see a problem in your code"... and they will spread the word. Nope, liability and IP must be protected and minimized. The only reason there's bug bounties is for being able to say "we're doing something".
> I don't think it's the IT people at the coal face.
Where do you see me laying the blame at the feet of the poor bastards left to work from an untenable position towards impossible goals? We inherited a problem and have been denied the tools to fix it. We are, bluntly, the fall guys when this whole shaking edifice goes over the lip of the volcano.
> Pretty sad situation overall that you can't call someone and say "hey, I see a problem in your code"... and they will spread the word.
You can. But as you said, they're just as likely to eat a sueball as a thank you note. Out of enlightened self-interest, there are few volunteers. Only the very brave or the very stupid come forward.
> I don't think it's the IT people at the coal face.
Where do you see me laying the blame at the feet of the poor bastards left to work from an untenable position towards impossible goals?
You didn't. I was adding to the point that, wherever the blame gets laid (i.e. on those in the trenches), management is the root of all insanity.
If it is possible to ignore the atrocious grammatical errors in this piece (El Reg, if you allow your journalists to post articles without being proof-read, then shame on you), then note that the underlying premise of this article misses the point...
There are several reasons why institutions continue to be vulnerable to hackers, but these are typically:
1. The institution does not understand the actual threats it faces. Try explaining to them that the Severity of a Threat = Probability of Occurrence x Impact of an Event and you will get a puzzled expression in response...
2. Business Executives undermine security best practices. Have a developer ask a business sponsor, "Would you like me to implement this new feature you have asked for, or would you like me to fix these 5 vulnerabilities?" and too many times the answer will be to request the new feature...
3. Institutions pander to the egos of "key developers" and/or have a cape-and-boots mentality when it comes to fixing issues. Go round any large IT shop and look at the diversity of technologies and architectures in use and you will quickly realise that this stems not from mergers but from arrogant developers or architects insisting on using the latest whizz-bang technology on their project. This is "resume architecture": if it will look good on your resume, put it in the architecture requirements of your next project. Allowing this to happen generates a heterogeneous technology infrastructure that allows bugs to lurk unseen for years. Then, when things go wrong, the same organisation rewards the person who dons the cape and boots and swoops in to the rescue (when others were floundering), when they should be punishing that same person for failing to document the project properly in the first place...
You get the point...
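The rule of thumb in point 1 is just expected-loss arithmetic, and a toy calculation shows why it matters: ranked by expected loss, a mundane, likely threat can outweigh a dramatic, rare one. The figures below are purely illustrative, not from the comment:

```python
# Severity of a threat = probability of occurrence x impact of an event.
# All numbers here are made up for illustration.

def severity(probability_per_year: float, impact_cost: float) -> float:
    """Expected annual loss from one threat."""
    return probability_per_year * impact_cost

threats = {
    "phishing-led breach": severity(0.30, 500_000),    # likely, moderate impact
    "datacentre fire":     severity(0.01, 5_000_000),  # rare, severe impact
}

# Ranking by expected loss: phishing (150,000) dominates the fire (50,000),
# even though the fire's single-event impact is ten times larger.
for name, loss in sorted(threats.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {loss:,.0f}")
```

Which is exactly the puzzled-expression conversation: executives anchor on the scary headline impact, not on probability times impact.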
Over the last 20 years I have spent much of my career being parachuted into organisations immediately after a major incident - and nine times out of ten the root cause is idiocy...
1. Ignorance of Threats
2. Executives putting personal agendas ahead of common sense
3. Failing to patch known issues in a timely manner
4. Allowing the unqualified to purchase security toys, I mean tools
5. Needlessly complex infrastructure
6. Lack of Testing
7. You get the picture...
None of these issues are new. We've all seen them. Yet they keep biting organisations where it hurts - and what's worse, unscrupulous snake-oil merchants... sorry, IT security product vendors, will continue to peddle their snake oil, sorry, security tools, to ignorant middle management.
They are working hand in hand to solve the problems of cyber-security / IoT etc. at www.ntia.doc.gov... So it's all OK... Not!!! Translation: they're going to solve it by letting corporations slurp whatever they want without penalties. Example: Vizio's fine for slurping Smart TV users' data appears large at $2 million, but the company sold itself for $2 billion. So it's only good for business etc...
Reading through some of these proposed solutions...yeah great, however...I've said it many times.
The solution is not procedural; it's not even technical. The problem is cultural.
In the same way it is expected by society that a person be able to read, write, perform basic maths and conduct themselves in a civilised manner it should be expected by society that a basic level of technical security awareness and proficiency is achieved.
At least how to access the management page on a home router.
If we continue to accept that it's OK to be technically incompetent, the problem will never be solved.
There are billions of endpoints out there that probably aren't as secure as they could be; there's no way on earth the IT industry and governments can secure everything. It's impossible. We need a cultural shift.
Exactly. People feel empowered to file everything to do with networking and Internet connectivity under "far too complicated for me to understand, leave it to the experts".
However, I don't think it's unreasonable to call that an irresponsible attitude. I don't touch mains electricity because I know just enough to know how dangerous it can be if you don't know what you're doing, which I don't.
If you're hoping for a future nirvana where every citizen can tell you what an IPv4 subnet mask is or what DHCP is for, I can safely predict you're going to be disappointed. Nevvvvvvver going to happen.
I was never under the illusion that 'government and industry' knew anything about security.
"Truppi said we need to disabuse ourselves of the notion - puncture the comfortable thought bubble - that government and industry are working together to solve online threats."
What we need is to design 'computers' that can't be hacked by opening an email attachment or clicking on a URL.
'Sadly, that's impossible.'
"each tab launched in either Chrome or Internet Explorer will launch as its own, fully contained micro-VM. If a malicious site is visited, all users have to do is close the tab, destroying the virtual machine and the malware along with it." (ref)
Then someone develops a hypervisor attack and breaks out of the VM, the same way Java malware learned to break out of its sandbox. Anything man can make, man can BREAK. Even the humble paperclip can be pretty much broken by folding it in half.
What we need is to design 'computers' that can't be hacked by opening an email attachment or clicking on a URL
I taught/mentored a few kids to develop a dedicated Raspberry Pi "email-only PC", which was locked down so as not to give internet access to potential malware, separating HTTP/HTTPS browsing from mail - at a cost of around £25. The idea was that the family/work more expensive, more capable desktop PC then does everything but mail (with badvert blocking too). They won a science prize at a school science symposium in Munich with their prototypes. The prototypes worked great!
You can, of course, achieve similar results by upcycling an older-generation mobile phone/tablet configured just for "generic" (your public) mail access: download your bulk mail in batches like UUCP, go offline, when you have time/energy try deleting most of the crud, queue the important validated mail for delivery to your work/home PCs' (not very public) mail address, and use DMARC-validated services and certificate verification everywhere. It's all possible, but yes - there are many grey/black-hat opponents amongst the squirrels/seagulls trawling the data/sardines, to paraphrase a Cantona.
Certificates aren't the be-all and end-all of security, or of repudiation, and I wouldn't advocate any one technology as the only solution to anything: that introduces a single point of technological failure. It's akin to Darwinian evolution - diversity is actually good despite its inefficiencies. We want an anti-fragile approach to our vast interconnected data networks, not a fragile one where the risk crystallises and the harm grows exponentially as a single point of failure cascades through the system.
"But people want things as simple as possible. KISS Principle, turnkey simplicity and all that. And they outnumber you."
Most of my designs are based around KISS, but that doesn't mean they aren't complex (even for fellow consultants).
As an example, a current design is based around a very modular system (which was necessary due to the ever-changing requirements of the customer, natch) and when you look at each component it is very simple indeed.
However, when you bolt them all together and try and encompass the whole thing at once (which you need to be able to do in order to reliably predict the impact of a new requirement/change) then all of a sudden it looks *very* complicated.
For example, Lego bricks are simple - but I've seen amazing things made from them that I wouldn't even know how to begin building!
This ex-FBI dude is right on the money... Sharing isn't caring, partnership doesn't work beyond a certain point, and government/politics will shaft commerce for its own ends. Never share more with law enforcement than what they ask for, never give your enemies ammunition to use against you and never share anything you may end up liable for. Don't trust government agencies not to share the data with other agencies. Liability is everything.
Of course, it's probably worse than even he was saying.
Yes, EMET is baked into Windows 10, which is why Microsoft is planning to discontinue it for the older operating systems - many people are uncomfortable with having no control over Windows 10 updates.
But even something like EMET is still only one step past trying to keep up with every vulnerability as it is found.
Not having vulnerabilities in the first place would be better. And, while perfect bug-free software is not possible, progress in that direction is possible. Thus, IBM mainframe operating systems, because they handle I/O on the basis of records that include their length at the beginning, instead of character streams with record delimiters (CR or LF), have fewer buffer overflow problems. And then there's virtualization, i.e. the Qubes OS. (In my opinion, though, we really have to go beyond what the Qubes OS does, through changes in the hardware of the PC.)
Since no solution is perfect, one really has to do all three: reduce the number of vulnerabilities through a more secure design, mitigate the impact of vulnerabilities through additional techniques, and keep current with the advantage hackers are taking of what's left.
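The mainframe record-format point above can be made concrete with a small sketch. This is an illustration of the principle only (length-prefixed records versus delimiter-scanned streams), not IBM's actual format; `MAX_RECORD` and the 2-byte prefix are assumptions for the example:

```python
# When each record carries its length up front, the reader can size or
# reject the buffer BEFORE copying any payload, instead of scanning a
# character stream for a delimiter and hoping it arrives in time -- the
# classic setup for a buffer overflow in languages without bounds checks.

import io
import struct

MAX_RECORD = 1024  # reader-enforced bound, chosen for the example

def write_record(buf: io.BytesIO, payload: bytes) -> None:
    buf.write(struct.pack(">H", len(payload)))  # 2-byte big-endian length prefix
    buf.write(payload)

def read_record(buf: io.BytesIO) -> bytes:
    (length,) = struct.unpack(">H", buf.read(2))
    if length > MAX_RECORD:
        # The declared length lets us refuse the record outright,
        # rather than discovering mid-copy that it doesn't fit.
        raise ValueError("record exceeds declared maximum")
    return buf.read(length)
```

Python itself is memory-safe either way, of course; the sketch is only about where the size check can happen in the protocol.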
"Since no solution is perfect, one really has to do all three: reduce the number of vulnerabilities through a more secure design, mitigate the impact of vulnerabilities through additional techniques, and keep current with the advantage hackers are taking of what's left."
One problem: end users who don't want to learn, meaning you have to make the whole mess as simple and turnkey as possible.
Here is a very realistic opinion that explains many issues. Being an exec responsible for the security engineering area, I keep seeing, day after day, sales people both from my company and our vendors talking about next-generation firewall stuff, SIEM and god knows what else. In reality it's all garbage: none of these things ever stopped a proper attack. The reason is quite simple: hackers are more focused on denial of service these days, which is easy to achieve and close to impossible to prevent, and all those expensive pieces of gear are useless unless you have something CDN-like, where you fight with the same weapons, distributed against distributed. Even smart kit like Arbor is useless if the attack is flooding upstream and your backbone isn't big enough. The solution is to go back to the drawing board and fix all the basic protocols - email, DNS and others - to have security in mind. And here is where government and companies should cooperate: to fund R&D and force implementation. If this is done we have a better chance of getting somewhere better.
DDoS isn't about security it's about availability.
You can't nick someone's data or wreck someone's centrifuge with a DDoS attack alone. Sure, it can disrupt business and cost money, but it's a fundamentally different issue requiring a fundamentally different approach. Of course its causes are fuelled by ignorance, in terms of the recruitment of systems into botnets, but as a security engineer you should probably be far more concerned about protecting your data assets (assuming they're valuable) than fighting DDoS - that'll be one for the business continuity types...
This post has been deleted by its author
The CEOs will counter by looking for ways to convince legislatures not to enact such a law, to the point of moving if they have to. Remember, it's HARD to convince a business to do anything it doesn't want to. They can play sovereignty against you. Then there's the matter of investors: limitation of liability is one reason corporations exist in the first place, and that was done to encourage investment.
For any company that suffers a data breach, the CEO must serve a ten year jail sentence.
And where the breach comes about because of the myriad vulnerabilities even in the best software? Where the breach comes because MS changes their policies on EMET (Extremely Massively Excessive "Telemetry") and starts slurping data from all those enterprises who foolishly trusted them? Where an employee who passed all the best testing available turns rogue and steals data?
Even the best protected, with the best practices in place, can be vulnerable to the unexpected. You can do your best to prevent, but even the best driver in the world would fall victim to a sinkhole opening up in the road that a heavier truck had just driven across.
I'm all for people further up the food chain facing serious penalties for anything that could be their responsibility, but that is overridden by a very severe hatred of innocent people doing time, even a little bit. Perhaps those who advocate such things should give it a few months' trial first?
Biting the hand that feeds IT © 1998–2020