Catalyst for Industry Rethink
This is big enough to trigger an industry rethink. For too long we have given way to the performance needs of scientific processing while ignoring the issues faced by the rest of computing – security, software correctness, robustness, and reliability. Perhaps computer science courses are responsible for this, since performance can be measured objectively while many of the other factors cannot. So we have a generation of programmers and hardware engineers thinking about the wrong issues.
Scientific programming is actually quite simple in structure, though compute intensive and deep in ideas – ideas that are succinctly expressed in a few equations. You have a simple program that runs for ages to get a result (admittedly a simplification). But generally scientific programs are well specified to satisfy a particular goal.
Other computing is, by comparison, conceptually simple, yet development results in large, complex programs because the goals are much more difficult to define.
I think they got it right in the 1950s by separating COBOL and FORTRAN. I don't like this separation, but it seems to have become a fundamental fact because the goals are so different. In processor design, though, we have tried to merge the two. Security and correctness checks slow a processor down, so they have been ignored. The RISC vs CISC debate also reflects this division, although either RISC or CISC can serve both scientific computing and the rest (I hesitate to call it business computing these days).
But security must be baked in at as low a level as possible. You really cannot get around that, and should not try. Yet that is exactly what has been done – security and correctness have been sacrificed for performance. Performance isn't bad – yes, give me more of it – but don't sacrifice other crucial concerns for it.
If you don't put security at the lower levels, it must be done at the higher levels, which costs far more in processor cycles for something less accurate: it gets implemented as weak heuristics (guesses) rather than strong rules. When these guesses miss, we either get false positives that waste human attention, or we miss a real attack, which can cost a great deal of money and time. Virus-checking software looks for code that might perform a buffer overflow, rather than directly blocking buffer overflows and out-of-bounds accesses.
We urgently need processors that do bounds checking and other security checks. Security should not be left to MMUs, which themselves seem like an afterthought, added to provide virtual memory. Yes, doing this might require a decade-long effort.
In operating systems, we need to get back to microkernels such as Andrew Tanenbaum's MINIX, which runs just a few thousand lines of code in kernel mode, rather than large monolithic kernels that run everything in kernel mode. Maybe revisit Multics and the Burroughs B5000, both in architecture and operating system. Systems should be designed as a whole, rather than as just a CPU – and this should be done by software and security experts, not electronic engineers. We now know that concepts such as virtual memory and security are essential to computing, not to be treated as afterthoughts.
Of course, changes to architectures will break a lot of the C programs out there – but for the better. We also need to address systems programming languages. C (and C++) have been too weak on security and correctness for far too long, and the industry has got away with this dire situation – up until now, that is. Systems programming languages should be used only for operating systems, not extended to applications programming.