Everyone who shouldn't be talks about cybersecurity these days, and thinks they (a) know what it is and (b) know how to "make things secure". This is not because they know much, but because of the inherent human need for order and an over-estimation of one's own influence. It's why Trump wants to build a wall along the Mexican-American border: Not because it will do any good, but because it looks like a decisive, take-charge, can-do action. It makes people feel good. Cybersecurity is like that: Most of what people think will make things secure is based on incomplete knowledge and an over-estimation of their own abilities -- they want to quantify the risk, somehow manage it. People despise unknowns.
Here's the truth: The most basic working definition of "cyber" (I will never like that word) security is that "the device does what I expect it to do, and does not do what I do not expect it to do." Whether it is malware that destroys your server, a software bug, or human error, the result is the same. The cost is the same. And any infrastructure meant to protect against that happening must be all-encompassing. Unfortunately, security will never be perfect. It won't even be good. This isn't because we can't design secure systems, but because the designs are not open. There are no laws or regulations governing proper design -- ours is an unregulated industry. Oh yes, bad hacker -- shame on you for stealing all our personal information. We're going to give you 500 consecutive life sentences in the electric chair! But the law is silent on what we should be doing to prevent such problems in the first place.
We've grown accustomed to corporations selling our personal data, our phones tracking our every move, our computers having 'telemetry', and companies big and small rolling out new products without any certification or validation. But worse than that, because of copyright and patent law, it is *illegal* to follow good engineering practice. When we construct buildings, we do so with the collective knowledge of every engineer who built one that fell down. We share that knowledge freely; blueprints and designs are available to anyone who knows where to look. And as a result, very few buildings fall down.
But in this industry, everything is a black box. We don't share blueprints -- we sue people who try to copy ideas, and make sure those ideas can't be copied without inviting lawsuits for hundreds of years. Everyone has to reinvent the wheel, possibly many times over, to steer clear of patent and copyright litigation. We've had lawsuits with hundreds of millions of dollars hanging in the balance over *beveled* corners. And so the knowledge of how to build something properly just isn't there. Every engineer in this field can only advance to his or her own level of understanding -- they cannot build on, or even see, the work of others. And as a result, our information infrastructure regularly suffers catastrophic failures... and these, too, are not shared. It's illegal to know or use a well-understood, proven method of doing pretty much anything... and if and when something fails, the default is to cover it up and whistle loudly.
Our problems with "cyber" security have nothing to do with the technology and everything to do with a broken system that values profit above solid engineering practice. It's why we have a "cyber" security problem in the first place -- old vulnerabilities never truly die, because everyone is forced to build everything from scratch. No matter how good you are, you're going to make at least one mistake in your design. Meanwhile, the criminals, the ne'er-do-wells, are not hindered by laws, ethics, or any of that -- they share their knowledge with each other freely. The end result is that they know all the millions of places you can make a mistake... and their computers and software, freed of legal obstructions and built with solid design, peer review, and decades of historical data at their fingertips, can go through all those millions of places and find every crack in your armor.
If you want "cyber" security, stop screwing around and demand that we treat our information systems like any other engineered system: Delete the broken patent and copyright systems, and allow that knowledge to be shared. Force every corporation that releases a product to the public to disclose its source code. And don't even think of giving me any crap about how this isn't secure because of "terrorism" or "hackers", or whatever bogeyman you're worried will look at it and think bad thoughts. Boeing tells everyone exactly how their planes are built -- and air travel is the safest way to move people precisely because many, many eyes examine each and every design, inside and outside the company. Everyone in the industry can study other designs and learn which ones work and why. And when failure happens, an open and transparent process finds out why and disseminates the findings to keep it from happening again.
Everything else is just buzzword bingo... so-called "experts" rearranging the deck chairs on the Titanic.