In many companies...
Step 1) Allocate a budget
Step 2) Assign a charge number
Otherwise nothing will happen.
Information security (infosec) is no longer a nice-to-have. It is a matter of corporate survival. Even the smallest company can be weakened by the simple loss of a customer list, ruined by the fallout from the loss of protected customer information. There's a lot more to infosec than merely hunkering down behind a firewall. As …
Information Security. Despite being an Information Security Officer, I'm not really allowed to talk about Information Security. I have to call it Cyber Security, or possibly Digital Security, which ignores the fact that I have to deal with the paper as well.
The board demands reports on 'cyber security', then complains they can't understand them, then refuses all attempts to provide them with awareness training. They read the reports, then complain that they mention 'risks' and 'vulnerabilities'. I had to change them all to 'opportunities for improvement'.
They have no interest in being secure if it affects their pet projects. They refuse to implement basic security measures for 'reasons', then deny ever having done so even though the minutes prove it.
I'm sure that advert with the board made up of toddlers is factually correct.
I've had some success pointing out (tactfully) that it's the loss of business value that gets people the sack.
Losing a list of clients means nothing. Losing clients because you lost their data means lost money, lost value and potentially a lost job.
People ignore risk, especially risks that are hard to decipher. Then they look surprised when the business takes a hit. Lose data, and you're looking at roughly a 10% client churn (so a 10% drop in revenue). There are tools out there that claim to let you estimate the loss of business value with some precision.
At the end of the day, cyber security is there to maintain business value.
"The board demands reports on 'cyber security', then complains they can't understand them, then refuses all attempts to provide them with awareness training. They read the reports, then complain that they mention 'risks' and 'vulnerabilities'. I had to change them all to 'opportunities for improvement'."
Maybe the problem is in the presentation. Use language they can understand, so the report becomes its own awareness training. Introduce information security, risks and vulnerabilities by stating that they have to be accepted as such: euphemisms won't make them go away, and they have to be dealt with, which you're sure your board is capable of doing although lesser directors might shy away. And if that doesn't work, get your CV out there.
There are too many I's and They's in your post. I would suggest that might be the root of some of the failure. Why not allocate your time to their pet projects and bake infosec in, so that it's not a problem in the first place? Do you carry a large three-ring binder around with you by chance? Is your name John Pesche :)
That would require departments to follow procurement policy and involve IS and IT at the start of any project, rather than trying to sneak things in under the radar until the delivery guy turns up at IT with a server on a truck and no notice. Or, as is more common these days, a helpdesk call demanding access to a free Dropbox account to transfer kids'/vulnerable adults' personal data to a non-EEA supplier, which then puts a spotlight on projects that are often very risky if not illegal.
And this is government so no one really gives a shit about profit or clients or enforcing policy.
They refuse to implement basic security measures for 'reasons', then deny ever having done so even though the minutes prove it.
And ... why should they? As the "Information Security Officer" you are there to be the bullet-sponge, the sacrificial lamb, the one to fall gracefully on one's sword should circumstances require it.
Accept the Dark Side and Life Will be Better!
.......
.......
.......
Having accepted The Dark Side, then, part of your job is to collect and carefully document the dirt so that you will have some solid leverage when it comes to negotiating the severance payment over the "taking full responsibility" performance; they want this to go smoothly. They will see the value in "investing" a tiny sliver of shareholders' money in an "amicable agreement" - money which keeps the inquest away and the press mum is money well spent. You should get at least 2 years' salary with taxes paid up front, maybe a pension on top too. A friend of mine got 4 years' wages invested into a new venture company owned by him instead of the taxes-paid bit, to avoid top-rate tax.
Next time you could consider "Risk Management Officer" at an investment bank; the role is exactly the same, the pay is much better - and you don't have to go to jail to collect like "The Rogue Trader" did.
"Whatever is backing up your network has to have access to your network to do so, but you cannot have write privileges from the network side."
This is still a sticking-plaster remedy although it's the only one available to the user community.
We really need systems designed from the ground up with an assumption of distrust built in. Our existing OSs originated when devices were much less threatened: users were trusted to a reasonable degree, a device might have a single user and might not even be networked, and certainly nothing was as open as the internet has made it. It's not the 1970s any more, but responses to a deteriorating situation have been bolted onto less-than-secure systems.
Extend this idea of restricting write privileges beyond just the backup. Arrange things so that specific categories of information can only be written by a specific process. Write requests are granted on the basis of not only the user but also the application that requests the write. Cryptolocker and the like wouldn't be on the approved list. Read requests would be similarly restricted, although a wider list of applications might be approved: your print application needs to read the file it's printing, and your email application needs to read the file it's sending. There should be no over-ruling this, so a super-user in the Unix mould is out of the question.
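A minimal sketch of that model, assuming invented names throughout (`WRITE_ACL`, `request_write` and so on are illustrative, not any real OS API): each category of data lists the only applications approved to write it, with a separate, wider list for reads.

```python
# Per-category access lists. The write list is deliberately narrower
# than the read list, as the comment above describes.
WRITE_ACL = {
    "office_documents": {"word_processor"},   # only the editor may write
    "backups": {"backup_agent"},              # only the backup agent
}
READ_ACL = {
    "office_documents": {"word_processor", "print_service", "mail_client"},
    "backups": {"backup_agent", "restore_tool"},
}

# Accounts permitted to use each application at all; requests are judged
# on the combination of user and application, not the user alone.
APP_USERS = {
    "word_processor": {"alice", "bob"},
    "backup_agent": {"backup_svc"},
}

def request_write(category, user, application):
    """Grant a write only if this user, via this application, is approved."""
    return (user in APP_USERS.get(application, set())
            and application in WRITE_ACL.get(category, set()))

def request_read(category, user, application):
    return (user in APP_USERS.get(application, set())
            and application in READ_ACL.get(category, set()))
```

Cryptolocker simply isn't in any list, so even running as an approved user its write requests are refused; there is no super-user entry that overrides the tables.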
"Windows has much of this built in and has done for a long time."
So if, for instance, I installed MS Office on a Windows PC I could configure it so that only Word can write to Word documents and only Excel could write to spreadsheets and that either format could be read to email them but neither could be read to copy to a USB drive?
So, we need someone to develop a "write-once" filesystem? Although I suppose even that fs could be subverted, given suitable system privileges, by installing a hacked read-write version of the same and using it to mount (and then alter) the backups filesystem.
OK, so all (suitably paranoid) backups should only be to write-once media, then?
If you use snapshots in ZFS/GPFS or on NetApp boxes, etc, then you have the "copy on write" model so attempts to re-write files puts the data elsewhere on disk, and the original is still accessible via the previous (snapshot) file tables.
Of course you need to be doing snapshots, to protect the machine doing them from being done over, and also to have long-term backups elsewhere so you can go back as far as you need when you discover it was infested. Some DB ransomware waited 6 months or so before revealing itself, so there would be no viable backup to recover from.
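The copy-on-write behaviour described above can be illustrated with a toy model (purely a sketch - this is not how ZFS/NetApp are actually driven): a snapshot freezes the current file table, later writes only replace the live copy, and the frozen version stays readable afterwards.

```python
class SnapshotStore:
    """Toy copy-on-write store: snapshots freeze the file table."""

    def __init__(self):
        self.live = {}        # filename -> current data
        self.snapshots = {}   # snapshot label -> frozen file table

    def write(self, name, data):
        self.live[name] = data                 # touches only the live copy

    def snapshot(self, label):
        self.snapshots[label] = dict(self.live)  # freeze the current table

    def read(self, name, snapshot=None):
        table = self.snapshots[snapshot] if snapshot else self.live
        return table[name]

store = SnapshotStore()
store.write("ledger.xls", "good data")
store.snapshot("daily")
store.write("ledger.xls", "ENCRYPTED GARBAGE")  # the ransomware "re-write"
```

After the malicious overwrite, the live copy is garbage but `store.read("ledger.xls", snapshot="daily")` still returns the original - which is exactly why the snapshot machinery itself must sit out of reach of the infected box.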
So, we need someone to develop a "write-once" filesystem?
I'm thinking of shipping appliances that expose a filesystem over Samba/NFS/whatever, but are in fact git repositories underneath. Thus an over-write - even by Cryptolocker or its ilk - is just another commit, and can be trivially rolled back.
I suspect it will be dog slow...
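The appliance idea can be sketched like so (a toy stand-in, with a plain list playing the part of the underlying git repository): every write to the exposed share becomes one more commit, so an overwrite by ransomware is trivially reversible.

```python
class VersionedFile:
    """Append-only history for one file; an overwrite is just a commit."""

    def __init__(self):
        self.commits = []            # full history, oldest first

    def write(self, data):
        self.commits.append(data)    # even a malicious overwrite just appends

    def read(self):
        return self.commits[-1]      # the share exposes only the latest copy

    def rollback(self, steps=1):
        # Discard the last `steps` commits. Real git would keep them
        # reachable via the reflog; this toy simply truncates.
        del self.commits[-steps:]

doc = VersionedFile()
doc.write("quarterly report v1")
doc.write("ENCRYPTED GARBAGE")     # Cryptolocker "overwrites" the file
doc.rollback()                     # one command undoes it
```

Slow, certainly - every write pays for a commit - but the recovery story is hard to beat.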
Vic.
The problem is it can't be controlled by "write privileges" on any box that is potentially compromised.
If your cryptolocker runs as an ordinary user then any normal backup is fine, because it is done by a privileged account. But if your malware is anything smarter than a small user-mode script, it will either exploit the meatware for a suitable password or use any one of the numerous flaws in *ANY* OS to gain what it needs to attack everything. There is always some sort of admin account, and pointing at the all-powerful Unix root is a distraction: even with a more compartmentalised model (as Windows should be, but usually is not) you only need a few more steps to get the account you need.
Really, the only viable option is to reverse the process, so the backup machine comes in and reads what it needs from servers and desktops; where it writes the data, and how versioning/snapshots/etc. are controlled, is then kept well separate from the at-risk boxes.
Of course this also assumes you can't simply log in to the backup machine using an account from the others...
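A minimal sketch of the pull model, with invented names (`pull_backup` is not any real tool): the backup host reaches out, needs only read access to the at-risk machine, and writes into a dated directory that it alone controls. Nothing on the source holds credentials for the backup store.

```python
import shutil
from datetime import datetime
from pathlib import Path

def pull_backup(source_dir, backup_root):
    """Copy source_dir into a fresh timestamped directory under backup_root.

    Run ON the backup machine. Read-only access to the source suffices,
    so a compromised source cannot reach back and trash earlier copies.
    """
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = Path(backup_root) / stamp
    shutil.copytree(source_dir, dest)   # each pull lands in its own directory
    return dest
```

Each run produces a new point-in-time directory, which also gives you the version history needed to reach back past a ransomware infection that lay dormant for months.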
@Paul
You need to think outside current models. Here's one.
One admin user has the power to allocate blocks of storage for a specific application. It can neither read nor write those blocks, just allocate them. The user has to log on specifically as that user to do so - no privilege escalation is allowed.
The specific application does nothing but provide access to specific clients. It has complete and exclusive control of the blocks allocated to it. Once a block is allocated, no other application can read or write that block; there is no super-user which can do so, nor a file system handled by kernel routines. The application enforces access rights based on a combination of both client application and user. The server application starts on boot-up or has to be restarted by a specific log-on - no escalation of privilege is allowed.
Write access can be tied down completely - the server can be configured at source to accept requests only from specific applications. If the server isn't so configured, control is devolved to a specific admin user who can grant write access to specific clients. This admin can also specify the applications from which read requests are handled, and can optionally grant this right to specific users. The admin user has to log in specifically; no escalation of privilege is allowed.
Software installs and updates are handled by a specific user ID which checks signatures of install/update files. The user has to log in specifically to do this, no privilege escalation is allowed.
Granting user credentials? You guessed it. A specific admin ID has to be logged in; no privilege escalation allowed.
So Cryptolocker can neither read nor write your office files directly. It probably can't have read requests accepted and it certainly can't have write requests accepted. It can't escalate its privilege to reallocate the office storage space to itself, nor to install itself as the server for that space, nor even to grant itself access, even if the server accepted such grants of write access; all these actions require a specific login, each with its own credentials. On a privately owned machine the user may hold the credentials for all these admin IDs, but in a business environment this is unlikely. That would make it significantly more difficult to persuade an owner/user to compromise their own machine, and in the case of properly administered business networks it would require the collusion of one of the admin team.
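The compartmentalised-admin idea above can be sketched as follows (role and action names are invented for illustration): each administrative identity can perform exactly one kind of action, there is no identity that unions them all, and there is no code path that lends one role another's rights.

```python
# One identity per administrative duty, as the model above prescribes.
ROLE_ACTIONS = {
    "storage_admin": {"allocate_blocks"},
    "access_admin":  {"grant_write", "grant_read"},
    "install_admin": {"install_software"},
    "account_admin": {"create_user"},
}

class EscalationRefused(Exception):
    """Raised instead of ever borrowing another role's rights."""

def perform(role, action):
    # A role may only perform its own actions; there is no super-user
    # role and no escalation path - a wrong pairing is simply refused.
    if action not in ROLE_ACTIONS.get(role, set()):
        raise EscalationRefused(f"{role} may not {action}")
    return f"{role}: {action} done"
```

Malware that captures the storage admin's session still cannot grant itself write access or install itself as a server; each of those needs a separate, specific login.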
You say Windows can have compartmentalisation of admin rights. But can it have compartmentalisation of access to hardware resources?
It makes admin less convenient but in part we are currently victims of a trend to make admin more convenient at the cost of reducing security. That isn't a good trend.
"The user has to log on specifically as that user to do that - no privilege escalation is allowed."
And in that one sentence you have nailed the problem. Privilege escalation is not supposed to happen, short of giving out the admin password (a whole set of FAILs for another day's rant), but it does. And because all software, be it application, OS, or low-level hardware driver, has bugs of one form or another, it is inevitable that someday someone will find one.
That is why dreaming up ever more complex OS models to try and stop this is never going to be that successful. Sure, we can segregate user accounts from admin tasks, and we can use things like SELinux/AppArmor to enforce the expected behaviour of processes that have high privileges, to reduce what p0wning them can do, but we can never be sure.
And that is why a backup machine has to be physically and administratively separate from any machine that can be taken over from the Internet access or portable media, etc. And it has to assume that files might be trashed, so some point-in-time model for data recovery needs to be implemented.
"You've just described SELinux."
Such a shame that every Linux HowTo article starts with "turn off SELinux". As a result, nobody knows how to use it - kinda like the Windows one. Setting up MediaWiki recently (and bear in mind I don't know SELinux well either), it took me ages to work out the access problem. In the end I had to learn enough SELinux to sort the contexts on the right files so that the actual file permissions would work as expected. I've not felt that stupid in a while, but at least I didn't cave and turn it off; didn't even try chmod 777 like half the "help" comments in forums suggested...
A company I worked for had its network infected by Conficker. Microsoft sent me some instructions and told me to follow the instructions "to the letter".
The Change Board would not approve my change to enforce password complexity because "store managers are incapable of using complex passwords". Store managers being people who presumably had a driving licence and were eligible to vote in general elections.
When the company went bust in 2012, after I had left, I presume Conficker was still merrily ficking that network until the power was turned off.
The NSA used to recommend writing passwords down on paper and keeping them in your wallet for this very reason.
Password complexity didn't help with Conficker; turning on your firewall did. Most of the impact of Conficker was actually where it locked out accounts while trying to guess passwords - which implied a lack of success in cracking those simple passwords, since it wouldn't have locked them had it guessed right.
Patching was also useful, as was an up-to-date AV that wasn't McAfee-based. Sadly, most IT people have an unexplained fear of configuring the Windows firewall, and treat patches with suspicion. Some of them were also duped into installing McAfee virus software believing it to be antivirus - these were the heaviest hit.
The only way to clear Conficker successfully was turning off the switches and offline-patching and scanning the endpoints. Most companies I saw ignored this advice and spent a week chasing their tails before accepting that the hard way would be quicker.
1) Encrypt everything.
2) Do not use American products.
3) Develop as much tooling as you can in house.
4) Do not have direct connectivity to the outside world.
If your data stays out of the hands of GCHQ it won't be given to the NSA, who will then sell it off to the American enterprises the UK is competing directly against.
In case you think this is melodramatic, see the ongoing lawsuit over trade secrets from Airbus, which mysteriously showed up in America.
Or Siemens' nuclear centrifuge operating data, which mysteriously showed up in Stuxnet.
and remember that most organisations of any size have at least three IT operations: production, test/development, business administration - and that these should never be allowed to meet.
You really don't want people who work on one of these to act as a bridge to any other. If that means having two PCs (neither with any USB ports) on a desk, then make it so. But if you want to stop contamination spreading and to protect, or at least slow down attacks, your production - revenue earning - systems, then you need barriers between them.
Stuxnet targeted Siemens PLCs that get used in all sorts of applications. It just so happened that some particular PLCs were being used in Iran's nuclear, ahem, power program. The stupid move was to have the centrifuges on an accessible network and not just on an air-gapped local lab network.
2.5) Don't use British products (are there any left or is it all imported from China?)
Here's the perfect example of someone so focused on the tiny details within the leaves that they can't see the broken branch about to fall on their head.
Why are you worried about compromises that turn phones or smartwatches into bugs? Sure, those things may be POSSIBLE, but compared to the low hanging fruit 99.99% of organizations have not dealt with yet worrying about this stuff is stupid.
"Information security (infosec) is no longer a nice-to-have. It is a matter of corporate survival."
Information security has always been a matter of corporate survival.
Even in the days when information security meant locking up the ledgers in the safe before calling it a day. Or having someone guard the clay tablets after business hours.
"Information security has always been a matter of corporate survival."
It needs to be a requirement written into company law as part of the director's responsibilities so that our A/C Information Security Officer could remind his board about the possibility of their becoming HM's guests, and not at a garden party.
> You aren't one with the machines in the way that today's kids are and you never will be
I should bloody well hope not!
We read stories about people who are prepared to give away their passwords for a bar of chocolate. Just do a search for "millennial", "password" and "security" and you will be confronted with the opinion that today's under-30s neither care about, are aware of, nor practise any form of computer/information security.
Whether or not the slackness is limited to individuals of this age group (I doubt it), there is a clear warning that security is only ever an afterthought - usually after the attack: yeah, we really should start to think about doing something. But I've got a ton of work to do; maybe next week.
Give up the practice of BYO tech. Each employee that needs a smartphone for work, gets one. Each employee that needs a laptop, gets one. Each employee that needs a desktop, gets one. Only company issued computing devices can connect to the company network. Full stop. Even company presidents can find themselves removed and those are the sorts that would have access to the most sensitive information and may be pretty PO'd at being eased out if they were also the founder of the firm.
Any employee taking data off of company premises without written permission gets the sack.
If I bring my own computers to a job where I am an employee, what's to stop me from taking company data home with me? Laws will differ around the world, but it could be a messy legal battle to demand that somebody leaves their personal computer and phone for screening after they have been fired or laid off.
BYO is a problem, but the point is much broader: does the organization have good control of its network, and good access control? If not, the data will be gobbled up by someone. Also, do not assume size matters. The Target breach of a couple of years ago was blamed on a vendor whose network credentials got hacked. It was really mostly Target's incompetence.