Computer science is not the same as computing or IT. Expecting a computer science graduate to have taken cybersecurity courses is like expecting a maths graduate to have taken book-keeping courses.
Wait, security courses aren't a requirement to graduate with a computer science degree?
There's a line in the latest plea from CISA – the US government's cybersecurity agency – to software developers to do a better job of writing secure code that may make you spit out your coffee. Jack Cable, a CISA senior technical advisor, writes that in 2019 when he was a computer science student at Stanford University in …
COMMENTS
-
-
Friday 26th January 2024 23:06 GMT Throatwarbler Mangrove
Perhaps I'm dense, but it seems to me that if a person is taking courses about the science of computing, there must, at some point, be a mention of the real world operations of computing devices, which must necessarily include security in today's world. Or is there some bifurcation of education, where applied computer science is relegated to the dull profession of IT and its attendant concerns with minor problems like "security," "stability," and "usability," and theoretical, or "real," computer science, where the true geniuses dwell, is unsullied by the humdrum considerations of the everyday world?
-
Friday 26th January 2024 23:59 GMT cyberdemon
Security in a computer science context should be things like memory safety, race conditions, process isolation. That stuff IS covered by CompSci degrees (certainly the one I took anyway) despite the alarmist headline.
What should NOT be part of a CS degree are things relating to specific software technologies, managing user behaviours, and the like. My CS course had a module on operating systems which included how to set up a generic UNIX-like system, but it did NOT go into the detail required to actually set up a robust UNIX system; that's for the IT course.
Similarly, firewalls SHOULD be, and are, covered, but how to properly configure a Cisco one should NOT be. That's for the Cisco-sponsored IT course.
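As a concrete instance of the race-condition material cyberdemon says CompSci degrees do cover, here is a minimal Python sketch (the language and example are illustrative, mine rather than the commenter's): two threads doing a read-modify-write on shared state, serialised with a lock. Without the lock, the interleaved updates can be lost.

```python
# Minimal race-condition exercise: four threads increment a shared counter.
# The lock makes each read-modify-write atomic, so no updates are lost.
import threading

counter = 0
lock = threading.Lock()

def increment(n: int) -> None:
    global counter
    for _ in range(n):
        with lock:          # critical section: serialise the read-modify-write
            counter += 1

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 40000, deterministic because the lock serialises updates
```

Delete the `with lock:` line (and dedent the increment) and the final count becomes nondeterministic, which is exactly the lesson the coursework is after.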
-
Saturday 27th January 2024 00:09 GMT doublelayer
I agree that specific product configurations shouldn't be in there, but should be in the IT security course, but there's a lot more to secure coding than the ways the language itself can have vulnerabilities. There is also secure design, which needs to be taught somewhere. Not just memory safety, but that you need to encrypt some data, hash other data, and be certain about the security of wherever you've stored it. They need to know how an attacker works so they can add the basic security precautions to their code. They need to know about information leakage so they can prevent having that vulnerability. Many of these things aren't about the tools they're using, but how they write their code.
I was taught all this stuff. Some of it was in security courses which were not mandatory, but at least some of it was included by professors who wanted to do this well. Still, those designing courses would do well to ensure it is there.
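To make the "hash other data" part of that point concrete, here is a minimal sketch of salted password storage using only the Python standard library (the PBKDF2 scheme and iteration count are illustrative choices of mine, not anything prescribed in the comment):

```python
# Passwords are stored as salted, slow hashes, never encrypted, because you
# only ever need to *verify* them, not read them back.
import hashlib
import secrets

ITERATIONS = 100_000  # illustrative; real deployments tune this upward

def hash_password(password: str) -> tuple[bytes, bytes]:
    salt = secrets.token_bytes(16)  # unique per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return secrets.compare_digest(candidate, digest)  # constant-time compare

salt, digest = hash_password("hunter2")
print(verify_password("hunter2", salt, digest))      # True
print(verify_password("wrong guess", salt, digest))  # False
```

The design decisions here (per-password salt, a deliberately slow hash, constant-time comparison) are precisely the sort of secure-design knowledge that is not tied to any one tool.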
-
Saturday 27th January 2024 08:08 GMT Falmari
@doublelayer "There is also secure design, which needs to be taught somewhere."
It seems to me that secure design is a design methodology* and the decision to adopt the methodology within a company would be management's (dev manager or higher). As such the employers should supply their employees with the training required. Secure design does not need to be taught on a computer science degree, but I could see it being taught on computer science degree that focused on programming and software design.
Computer science is not just programming software, and while Jack Cable may like to think it, it's not the job of CS degrees to produce fully skilled programmers for the software industry. There is also the assumption that, because security courses aren't a requirement to graduate, security is not being taught. Security specific to the subject of a unit could be covered as part of that unit; network security was part of my network unit.
*Please let me know if I have that wrong and it is something else entirely.
-
Saturday 27th January 2024 21:57 GMT doublelayer
I don't agree, possibly because of my experience of what computer science degrees (at least first degrees, before postgraduate study) intend to teach. In my experience, they do teach a lot of practical programming skills. It's not a full set of skills needed to be a good programmer in industry, but it's enough skills to be good at learning what you need when you are in that position. That's not the only thing they teach; there's plenty of theory involved as well, but they do teach practical skills and people do take them to learn those. If they are going to do it that way, I think secure design is important enough that it should be part of the requirements, not a separate choice to be trained later. This might be related to what I describe in a comment below: I've rarely seen computer science split into programming and a more theoretical version, so I assume that most other departments work the same.
The reason that I think secure design is required is that it's not really a design methodology. It's not an option to be picked from a set of choices, but a mindset you gain from knowing what can happen and what you should do to minimize the risks. You can design securely along with any other structure you plan to use, and the concepts involved apply equally well whether you're writing a videogame or low-level industrial control software. In either case, it consists of basic lessons like knowing how to look for vulnerabilities, knowing what likely ones are, and learning how to either prevent them from existing or block someone from trying to use them. It's true that it's very easy for people to design without paying attention to these, and there are cases where this can be somewhat excused if I grit my teeth and admit it, but those exceptions are a very small section of software produced and the consequences of ignoring it in all the other areas can be extremely bad. We could fix this with more security courses, or with more attention to it in normal programming courses, and the latter might actually be more efficient, but I do think we should fix it in the education step as well as the corporate one.
-
-
Sunday 28th January 2024 12:26 GMT Necrohamster
"Security in a computer science context should be things like memory safety, race conditions, process isolation. That stuff IS covered by CompSci degrees (certainly the one I took anyway) despite the alarmist headline."
Interesting that you mention context, and then give an out-of-context interpretation of the article's subject matter. Why would CISA – the US government's cybersecurity agency – tell developers to concern themselves with "things like memory safety, race conditions, process isolation"?
-
Monday 29th January 2024 08:52 GMT Anonymous Coward
What should NOT be part of a CS degree are ... Managing user behaviours etc.
I don't know about that. Computers are no use unless they help people. I'm just stunned that a human opening mail is still the most successful way to break into a system. Throwing patches at it doesn't work. Perhaps a "scientific" approach to solving the problem is the only way to make headway.
You might also consider "Civil Engineering" vs. "Architecture". The latter is definitely more related to human ergonomics, but it's not necessarily less scientific.
-
-
Saturday 27th January 2024 17:55 GMT claimed
Tell you what, I did physics at uni, do you think there were classes on launching rockets, or building a microwave oven, or keeping a building standing? Nope. There was fluid dynamics, electromagnetism and classical mechanics. My friends doing a course called civil engineering did get classes on where a beam should be to stop things falling down though. I’m with the top poster on this
-
-
Monday 29th January 2024 15:37 GMT claimed
Yes I know, I got a 1st in General Relativity, that’s entirely my point. Physics is studied at the fundamental level; other specialisms apply it to solve specific problems. This thread is an argument about whether the discipline of “security” is a fundamental or a specialism. While security is a fundamental concern in any IT system, that doesn’t make it a fundamental tenet of computer science. Safety is a fundamental concern when building a building or a satellite, but it’s not a fundamental aspect of physics. Come on; I studied physics: of course I think it’s fundamental to everything. You’ll only win an argument if you’re a mathematician, and even then I’ll claim it’s just because you’ve taken the abstraction too far ;)
Edit: I’ll only concede an argument, I might very well have lost even if I refuse to admit it :)
-
-
-
-
Friday 26th January 2024 23:26 GMT doublelayer
That's not the structure used by most universities. Usually, there is computer science, which teaches both theoretical and practical programming, and there is IT, which is separate. You don't often get to skip the practical parts when taking computer science, even if you do decide to become a theorist, and if you are pursuing that approach, you'll still find that most of your classmates intend to become programmers. Those are generally the two options. Sometimes, there is a split in computer science where some students take a mostly software approach and some others do a combination of electrical engineering and low-level software, but not always.
I don't know of many universities where you can take a purely theoretical curriculum, nor am I convinced that any of them should run such a program.
-
-
Saturday 27th January 2024 06:16 GMT Neil Barnes
A dozen years ago I was made redundant and spent a year writing a Master's thesis for the Open University: the degree I received was titled "Computing for Commerce and Industry".
Although I wrote code - lots of it - to demonstrate and prove my thesis, not one line of that code appeared in the final document, and nor was any of it assessed.
-
-
Tuesday 30th January 2024 13:05 GMT JerseyDaveC
Re: A purely theoretical curriculum
My Comp.Sci. degree (graduated 1991) was significantly theoretical. There was loads of maths, formal language theory, architecture (including CPU architectures such as VAX and RISC), algorithm complexity, data structures, operations research, you name it. There was some practical stuff too, of course - program design/development, databases, compiler writing - but the theoretical side was big.
I can imagine, therefore, that it's perfectly feasible to do a Comp Sci without touching security. Would I do so if I were starting university right now? Nah, it's so blatantly useful when trying to get a graduate job that you'd have to have a really good reason not to touch security at all. But in principle I can see why it's possible, even if not particularly advisable.
-
Saturday 27th January 2024 22:10 GMT doublelayer
Me: "I don't know of many universities where you can take a purely theoretical curriculum"
Reply: "Oxbridge"
Well, I didn't go there, so let's see what they have. Taking into account the criteria written in my comment, let's take a look at Oxford's computer science course description:
This course in Computer Science aims to produce graduates thoroughly conversant with the principles of modern computing science, who are able to apply those principles in the design and construction of reliable systems. The course at Oxford concentrates on bridging theory and practice, including a wide variety of hardware and software technologies and their applications.
[...]
Practical skills must also be developed, and the majority of subjects within the course are linked with practical work which contributes marks towards the final examination.
[...]
In the second year, Computer Science students are required to take:
- the core courses in Algorithms & Data Structures, Compilers, Concurrent Programming and Models of Computation;
They have many optional courses that offer practical skills. You are not getting through this degree without learning some practical skills in programming.
Let's check Cambridge, just to be complete:
Practical work is undertaken and assessed in all years of the degree programme.
[...]
You take four papers, including three compulsory Computer Science papers - covering topics such as foundations of computer science (taught in OCaml), Java and object-oriented programming, operating systems, digital electronics, graphics, and interaction design - and the Mathematics paper from Part IA of Natural Sciences.
[...]
You take four papers, spanning core topics:
▪ theory – including logic and proof, computation theory
▪ systems – including computer architecture, computer networking
▪ programming – including compiler construction, programming in C/C++
▪ human aspects – including Human Interaction design, Artificial Intelligence
You also undertake a group project which reflects current industrial practice.
[...]
-
Sunday 28th January 2024 13:44 GMT Necrohamster
"You also undertake a group project which reflects current industrial practice."
Where a group project is concerned, any fule kno that between 25% and 75% of the group may be involved in a non-participating or non-practical way :D Theoretically they could pass without doing any practical work
-
-
-
-
Saturday 27th January 2024 00:59 GMT Roland6
Also, a Computer Science degree is only the first step towards becoming a fully fledged professional. I suggest the problem isn’t what is or isn’t in a “Computing degree” but what professional societies demand to confer chartered status.
Okay, a big problem is that industry likes cheap and cheerful…
-
Saturday 27th January 2024 15:11 GMT Dan 55
I would say any course, whether CS, IT, or Software Engineering, which does not cover the security of data in transit and data at rest is lacking a key part of what should be taught in the curriculum. We can't have graduates being told to put things on the internet without any clue how to do it, or at least any clue about which questions they should be asking.
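As a sketch of the "data in transit" half of that point: Python's standard-library ssl module illustrates the defaults a graduate should at least know to ask about, namely whether certificate validation and hostname checking are actually on (the example is mine, not Dan 55's):

```python
# A default TLS client context in Python's stdlib: the two settings below are
# exactly the "questions to ask" about data in transit.
import ssl

ctx = ssl.create_default_context()

print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True: the peer's certificate
                                             # must validate against a CA
print(ctx.check_hostname)                    # True: the certificate must
                                             # match the host being contacted
```

Code that disables either setting (a depressingly common "fix" for certificate errors) silently gives up protection against man-in-the-middle attacks, which is the sort of thing a curriculum should make graduates recognise.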
-
Sunday 28th January 2024 05:39 GMT Brian Miller
Security is common sense, but both are lacking
Computer science is not the same as computing or IT. Expecting a computer science graduate to have taken cybersecurity courses is like expecting a maths graduate to have taken book-keeping courses.
Um, actually, it's more like architecture, and skipping over putting locks on the doors and windows. Really.
There are a lot of ways to ensure that a system has some reasonable bit of security. The problem comes from people in power who don't want to put any security on anything, and then have a funny look on their face when somebody pwns their network. And of course it's just not their fault for nixing all of the basics.
"I went to Blackhat and all I got was this stupid t-shirt, and your firewall, routers, switches, servers, PCs, HSMs, phones, light switches, toothbrush, and butt plug."
There's a phone API that puts plaintext passwords into the REST call parameters. There's a "secure" CPU firmware that has a copy-paste vulnerability when encryption is turned on. And on and on. "I know you worked hard on that, but security is complex, and it's behind a firewall, so it's OK."
The only thing that forces security is what happens after an "incident." Companies have to wind up paying hefty fines and face embarrassment and ridicule before they do anything sensible.
-
Sunday 28th January 2024 06:20 GMT Kevin McMurtrie
Computer science is the study of computers. Most security vulnerabilities should be covered by understanding how computers and their algorithms work. Command injection, XSS, exploitable race conditions, MITM, sabotaging handshakes, replay attacks, extra/inconsistent states, ... That's all computer science.
There are exceptions, of course. There's a category of security measures about having a second layer of protection for people that have been tricked.
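Command injection, the first item on that list, comes down to understanding how the data reaches the machine, which makes it a good example of the "that's all computer science" claim. A minimal Python sketch (the example string and approach are illustrative, mine rather than the commenter's):

```python
# Passing untrusted input as an argument vector means no shell is involved,
# so shell metacharacters like ';' stay inert data instead of becoming code.
# (sys.executable is used here purely to keep the sketch self-contained.)
import subprocess
import sys

user_input = "harmless; echo INJECTED"

result = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.argv[1])", user_input],
    capture_output=True, text=True,
)
print(result.stdout.strip())  # harmless; echo INJECTED  (printed, not executed)
```

Building the same command as a single string and running it with `shell=True` would hand the `;` to the shell, and `echo INJECTED` would execute, which is the vulnerability in miniature.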
-
Sunday 28th January 2024 10:26 GMT Mike 137
Not quite
"Sure, computer science is not engineering, and you may argue that engineering is a more natural home for practical secure coding"
Practical, yes, but conceptual (grasp of the principles), no.
Scientific training is not just about absorbing a body of facts, it's supposed to inculcate a way of thinking that includes objectivity, perception, attention to detail, the abilities to analyse and synthesise and (most importantly) commitment to high standards. It's highly possible that current science education doesn't accomplish all of this, but they're actually the same set of capacities that should also be inculcated in would-be engineers.

Consequently, both computer science and software engineering courses should include such training, couched of course in terms appropriate to each discipline. Furthermore, any software engineering course that doesn't include at least some relevant basic computer science is not worth its salt, as it will fail to expose students to some fundamentally necessary first principles they will need to understand to be good software engineers. A result of such omissions to date is that software development still, after all these years, doesn't objectively qualify yet as an engineering discipline.
-
Thursday 1st February 2024 23:07 GMT JavaJester
Security IS a valid computer science topic
"Robustness", the property that a program behaves as intended, has long been considered a legitimate avenue of inquiry for computer science degrees. I think everyone can agree that a system that allows little Bobby Tables to wreck the database is anything but robust. There are plenty of security concepts to be taught that are not technology specific. Examples are strict input validation, in band vs out of band parameterization and why out of band is more secure, the principle of least privilege, separation of roles/duties, zero trust network architecture, and the list goes on.
For a baseline requirements perspective, requiring knowledge of input validation, in band vs out of band parameterization, and principle of least privilege is a good starting point. These are low hanging fruit that a CS grad would likely implement or influence regardless of their role. The countermeasure of escaping should be taught for situations when out of band parameterization and strict validation are not feasible.
Note: parameterized SQL prepared statements vs string concatenated statements is an example of out of band vs in band processing. It would be obvious to a CS grad who took and understood the baseline security I am proposing that the out of band parameterized prepared statement is the secure choice.
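The in band vs out of band distinction described above can be sketched with the standard-library sqlite3 module (little Bobby Tables' name is the traditional test string; the schema is mine and purely illustrative):

```python
# Out-of-band parameterization: the '?' placeholder sends the value separately
# from the statement text, so it can only ever be data, never executable SQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE students (name TEXT)")

name = "Robert'); DROP TABLE students;--"

# In-band would be string concatenation:  "... VALUES ('" + name + "')"
# which hands the attacker's quote characters to the SQL parser.
conn.execute("INSERT INTO students (name) VALUES (?)", (name,))

rows = conn.execute("SELECT name FROM students").fetchall()
print(rows)  # the hostile string is stored as plain data; the table survives
```

Nothing in that example depends on sqlite specifically; every mainstream database driver offers the same placeholder mechanism, which is what makes it teachable as a concept rather than as a product feature.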
-
-
-
Friday 26th January 2024 22:35 GMT elDog
Totally concur that security should be part of any computer oriented curriculum
The ivory-tower assumption that "Computer Science" is not really part of IT is the basis for the whole problem.
I know Knuth's brilliant "Art of Computer Programming" series didn't worry about buffer overflows or re-using unreferenced memory or being able to read data outside of your assigned space. But because our current implementations use languages and models that don't worry about these things, we are spending a huge amount of time and effort to clean up sloppy-but-effective code.
The same arguments will be made against automated testing environments, rigorous documentation, pen testing, etc. It's sort of like asking an economics major to take charge of a real banking system. Real life is hard.
-
Saturday 27th January 2024 02:55 GMT aerogems
Re: Totally concur that security should be part of any computer oriented curriculum
Ada95. Instead of teaching people C++ or Python or anything else in CS programs, the 100-level classes should all force people to use Ada95. For those unaware, Ada95 has a ridiculous number of compile time checks. There's a decent chance you'll have to take two or three stabs at a hello world app before it compiles cleanly. As the language designers put it, if the code compiles, you can be pretty well sure that the only thing left are logic errors. If memory serves, the language was originally created for use by the US DoD, before they started subcontracting everything.
Any decent programmer should be able to go on to learn any other language pretty quickly, so if you force some of the habits from Ada95 into people's heads while they're still young and impressionable, hopefully they will stick and become part of the person's habits no matter what language they're using. Still doesn't necessarily teach security, but it would still help eliminate some of the code errors that result in security issues.
-
Saturday 27th January 2024 20:46 GMT J.G.Harston
Re: Totally concur that security should be part of any computer oriented curriculum
That's because the real world reality is that "Computer Science" is not really part of IT. Computer Science != IT. Computer Science != Software Development. Software Development != IT. Yet, all too many people assume and insist that all are the same thing.
"You said you wanted a school job, I've got you a job cleaning toilets, hey why are you complaining, it's cleaning toilets IN! A! SCHOOL!"
-
-
Friday 26th January 2024 22:40 GMT shazapont
Drive awareness
I support this sentiment, and suggest that driving awareness of issues should begin with an understanding of the consequences and risks associated with actions. CompSci doesn’t sit alone, disconnected in a murky pond. Decisions have consequences, and those consequences can lead to frustration, error or worse.
Isn’t that worth teaching?
-
Friday 26th January 2024 23:21 GMT nautica
From David L Parnas--
"When someone builds a bridge, he uses engineers who have been certified as knowing what they are doing. Yet when someone builds you a software program, he has no similar certification, even though your safety may be just as dependent upon that software working as it is upon the bridge supporting your weight."
"There are no standards for computer programmers and no group to certify them."
“My judgment is that research in 'Star Wars' is going to fail, and I believe this so strongly that I'm willing to stake my professional reputation on this. I don't believe anybody is going to build this thing.”
-------------------------------------------------------------------------
“...Parnas earned his PhD at Carnegie Mellon University in electrical engineering. Parnas also earned a professional engineering license in Canada and was one of the first to apply traditional engineering principles to software design...
“...Dr Parnas took a public stand against the US Strategic Defense Initiative (also known as "Star Wars") in the mid 1980s, arguing that it would be impossible to write an application of sufficient quality that it could be trusted to prevent a nuclear attack. {He felt that the requirement for immense software that had to be absolutely faultless was--and still is--simply not realizable.} He has also been in the forefront of those urging the professionalization of "software engineering" (a term that he characterizes as "an unconsummated marriage"). Dr. Parnas is also a heavy promoter of ethics in the field of software engineering...”
https://en.wikipedia.org/wiki/David_Parnas
--------------------------------------------------------------------------
-
Friday 26th January 2024 23:33 GMT doublelayer
I can't really agree there. There are lots of certifications for safety-critical software, as there should be, but there's a difference between licensing people to build something, licensing their product, and licensing everyone who does anything at all similar to it. Licensing the product is what they should be aiming for, and a certification of the people building that product can be a way of making it more likely that the product itself will be acceptable during that process. Trying to require licensing standards for all of programming is likely to end with a bad certification test which is difficult to pass and nonetheless doesn't tell you much about the capabilities of the student.
This sounds like the "programmers aren't engineers" cliche all over again, a cliche I believe to be flawed. It assumes that all engineers are like civil engineers when this is obviously not the case. The engineers building your consumer electronics don't need to have any more certifications than the programmers writing the code running on those devices or software with a similar level of criticality. They may be held to standards about security for the programmers or not spewing too much radio interference for engineers, but that's checked on the product and, unless it's quite extreme, only when it causes a problem.
I don't mean by this that security shouldn't get more attention. I think having security courses is a good idea, if only because I work in security, and if I can stop dealing with the basic things people get wrong, that would let me work on more complicated and useful things. We also need to somehow stop people from ignoring all the lessons that get taught in good courses, whether those are about security, maintainability, or efficiency. I just think that adding a certification to this process won't make that happen.
-
Saturday 27th January 2024 15:45 GMT OhForF'
There is a lot of stuff taken into consideration when building bridges, but as far as I am aware, making bridges safe to use even after active attacks is not part of the normal specification. Most bridges could be made unsafe to use, or even brought down pretty fast, if an attacker uses readily available tools (e.g. angle grinders applied to the rivets). Fortunately this doesn't have to be in the normal threat model when building a bridge.
Any program (at least if exposed on the internet) nowadays unfortunately has to take active attacks into consideration, making that comparison a bit unfair.
If a certified civil engineer had to build a bridge secure against attackers bringing power tools or explosives, he'd probably have to fall back on measures like access control, as it would be pretty hard to build a bridge that is tamper-proof.
-
Monday 29th January 2024 10:02 GMT Necrohamster
"He has also been in the forefront of those urging the professionalization of "software engineering" (a term that he characterizes as "an unconsummated marriage"). Dr. Parnas is also a heavy promoter of ethics in the field of software engineering...”
Professionalization...hmmmm.
The problem is that, in many parts of the world including the United States, the title "Engineer" isn't protected. Any charlatan can call themselves an engineer without education or training.
-
-
Saturday 27th January 2024 00:37 GMT Bebu
Slightly different slant
One obvious difference between civil (&mechanical) engineers and software developers is that the engineers start with a fairly detailed specification with measurable compliance. My observation of developers at work is that they rarely have that luxury. Formal testable specifications are as rare as...*
If you think of something as simple as a buffer overflow: if your specification included a bounds check, stated what happened when the check failed, and could be statically or dynamically verified, you could largely eliminate that source of failure.
*hen's teeth, rockinghorse shit etc
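Bebu's bounds-check point can be turned into a testable specification in a few lines. A minimal Python sketch (the class and its spec wording are mine, purely illustrative):

```python
# A fixed-size buffer whose write operation *specifies* its bounds check and
# what happens on failure, so the behaviour can be verified mechanically.
class BoundedBuffer:
    """Spec: write(i, v) stores v at index i iff 0 <= i < size;
    otherwise it raises IndexError and leaves the buffer unchanged."""

    def __init__(self, size: int) -> None:
        self._data = bytearray(size)

    def write(self, index: int, value: int) -> None:
        if not 0 <= index < len(self._data):   # the specified bounds check
            raise IndexError(f"write out of bounds: {index}")
        self._data[index] = value

buf = BoundedBuffer(4)
buf.write(3, 0xFF)              # in bounds: succeeds
try:
    buf.write(4, 0xFF)          # out of bounds: the specified failure mode
except IndexError as exc:
    print("rejected:", exc)
```

Because the failure behaviour is part of the spec rather than left undefined, a test suite (or a static checker, in languages that support it) can verify it, which is exactly the luxury the comment says developers rarely get.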
-
Saturday 27th January 2024 02:41 GMT aerogems
Sounds Good
I don't see any real harm in creating a course for writing secure code. You could have an ethical hacking class designed for both IT and CS majors. Maybe split the class up into groups of a couple of CS and IT people, and each group has to write and then set up some kind of server software. Then all the other groups try to compromise it, and you're graded on how well you do at fending off the ne'er-do-wells. Manage to thwart every group, automatic A. Set up a system that's so full of holes you could sail a navy battle group through them, you fail. Everything else depends on how much damage the other groups were able to do to your setup. Like if they were able to gain access, but not much else, that's probably B-level. If they managed to slave the system as part of a bot-net, C-level. If they were able to exfiltrate data, D-level. Just as a 10,000ft proposal I came up with off the top of my head. If someone were to actually sit down and really think it through they might make some changes.
Worst case, it's one extra class people have to take to get their degree. However, if it gets even a couple people, especially people who go to work for IoT device makers, to think a little about security when writing code... that's a win-win for everyone. Well, maybe not criminal types, but for everyone else.
-
Saturday 27th January 2024 06:39 GMT Filippo
>"To date, companies have not expressed that security is one of the key factors they evaluate when hiring software developers,"
This is the alpha and omega of the problem.
We don't teach everyone computer security because we, as a society, are still not taking computer security seriously.
I mean, to date, the vast majority of my clients explicitly require my product not to ask for a password. This is industrial automation. The software can direct heavy machinery in physical reality. Let me reiterate: they aren't simply not asking for a password protection feature; they are explicitly asking for such a feature to be disabled, so that anyone in front of the computer - indeed, anyone in front of any of a number of clients around the site, or anyone with access to the local network and some intent - can give commands to the system. Because entering a password is too much of a hassle.
Teaching me how to properly encrypt the passwords that nobody uses is not going to fix the problem.
You know what we really need? Make bedtime stories about cybersecurity. It worked to teach people not to go alone into the woods for millennia, it can work for this.
-
Saturday 27th January 2024 07:47 GMT Bilby
"I mean, to date, the vast majority of my clients explicitly require my product not to ask for a password. This is industrial automation. The software can direct heavy machinery in physical reality. Let me reiterate: they aren't simply not asking for a password protection feature; they are explicitly asking for such a feature to be disabled, so that anyone in front of the computer - indeed, anyone in front of any of a number of clients around the site, or anyone with access to the local network and some intent - can give commands to the system. Because entering a password is too much of a hassle."
No, because in industrial settings, anyone with physical access to the controls of heavy machinery is assumed to be authorised. And has been, for a couple of centuries.
They may be living in the past, but until you understand their mindset, you won't be able to drag them kicking and screaming into the modern world.
Here's a pen-test challenge for you - go onto a construction site, and start up a JCB, and dig a short ditch.
You likely assume that the biggest obstacle to your doing this is that you would need an ignition key. But most heavy plant keys are left in the ignition. It's the norm to do this; Nobody wants to have to stop work for half a day because the JCB driver called in sick, but has the keys in his pocket.
The main obstacles to unauthorised use of heavy machinery are both "security by obscurity":
1) People don't know that the keys are typically left in the machines; and (more importantly)
2) If they did, they wouldn't know how to operate the equipment anyway. Most people wouldn't even be able to work out how to start the damn thing - even though the start-up sequence is likely displayed on a sticker above the windscreen.
That holds true for most heavy plant, and has since the industrial revolution - the only people who know how to operate it are constrained from illegal or damaging actions by reputational concerns, as part of a small clique of local professionals.
The problem isn't in giving people an understanding of the risks; rather, it is in giving them an understanding of the scale of those risks.
It's one thing to worry about the tiny possibility that a local skilled operator with malign intent might do something evil; it's a whole other thing to worry that any skilled operator anywhere in the world might be able to take over your gear.
Remote control is an abstraction to heavy plant operators. It's a possible distant future. Not a current reality.
Whether they, or we, are right depends on the specific implementation.
-
-
Saturday 27th January 2024 07:45 GMT claimed
Everybody on this thread is missing the key point: industry doesn’t give a shit.
Sure, teach security, now just go sit by your PHB while they tell you to stop fucking about and ship it.
Same as this bullshit skills shortage: I don’t believe there are any skills shortages, just companies that don’t pay enough to be competitive. If you’ve got half a brain you take the money where it’s offered, which means joining the merry-go-round. Wage suppression and a widening gap between rich and poor; nothing new here.
It’s like none of you have heard of a pharmaceutical company: YOU ARE NOT PAID TO CURE THE DISEASE. You’re paid to keep the companies and products that have the disease alive, and paying.
Grow up, you old gits, and take those rose-tinted spectacles off.
-
Saturday 27th January 2024 10:56 GMT Anonymous Cowpilot
I studied for a computer science degree 25 years ago, and while it had no specific security module, security was prevalent throughout the course. We learned about separating data and opcode memory, and the perils of speculative execution in CPUs, in hardware modules. We learned about buffer overflow and underflow attacks in programming courses, and about SQL injection in database modules.
And I think this is how it should be. Security should not be a separate module that you can take or avoid; it's not a separate discipline that you apply after the fact. Security pervades computing and should be incorporated into all aspects of its teaching, not separated out.
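The SQL injection risk mentioned above is easy to demonstrate in a few lines. Here's a minimal sketch using Python's built-in sqlite3 module; the table, column names, and attacker payload are made up for illustration:

```python
# Demonstrates why string-built SQL is injectable while a
# parameterized query is not. Schema and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0)")

attacker_input = "nobody' OR '1'='1"

# Vulnerable: interpolating the input lets it rewrite the query logic.
unsafe = f"SELECT * FROM users WHERE name = '{attacker_input}'"
print(len(conn.execute(unsafe).fetchall()))   # 1 -- matches the whole table

# Safe: a parameterized query treats the input strictly as data.
safe = "SELECT * FROM users WHERE name = ?"
print(len(conn.execute(safe, (attacker_input,)).fetchall()))  # 0 -- no such user
```

The fix costs nothing at the call site, which is exactly why it belongs in an introductory database module rather than a bolt-on security course.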
-
Sunday 28th January 2024 12:16 GMT Necrohamster
Not surprising
In my CS degree, the closest I got to cybersecurity was a "cyberethics" module which was little more than filler. The lecturer himself had little enthusiasm for the subject and avoided any tangential discussion about security.
Secure coding is something you need to learn on your own, it seems.
-
Monday 29th January 2024 08:37 GMT Michael H.F. Wilkinson
The fact that there is no specific cybersecurity course (there are currently several in our curriculum) does not mean cybersecurity isn't taught as an integral part of programming and software engineering courses. Cybersecurity is not something you should tack on to an existing piece of code or system, it should really be integral to the design. Even in introductory courses on computer science, cybersecurity is discussed as an important topic. Over the decades, cybersecurity has become more and more important in our curriculum, as systems have become more complex and interconnected, which is why additional courses on the topic have been developed.
-
Monday 29th January 2024 15:52 GMT BPontius
Companies don't prioritize security! Computer and network security is only a catchphrase, a jingle used to shine consumers on with the illusion of care on the part of the company. When it comes right down to it, companies have proven over and over that they'd rather pay ransoms and lawsuit settlements than show any real concern for security. It's much cheaper, and has less impact on profits, to settle, allowing them to avoid responsibility and any admission of blame. The only true focus these companies have is on their stock prices and shareholders. Consumers are no more than dollar signs, cash cows to be milked for every penny then discarded like a used Kleenex.
-
Tuesday 30th January 2024 21:50 GMT Michael Wojcik
It's the wrong request
In the US, where CISA is located, CS departments (or their equivalents) at accredited universities have rather limited control over their curricula. Accreditation requirements determine much of it; college and university requirements take up most of the remainder. If CISA want to see this change, they need to lean on the accreditation boards and the organizations which influence them, such as the ACM.
It's not like this is a new call, either. You can find plenty of papers and articles from the last three decades urging more teaching of security concepts in CS and IT undergraduate programs. As recently as the current issue of CACM, we see this piece by Simha et al. reporting on a survey of concerns for curricula, which notes the importance assigned to security (more by "experienced professionals" than by other respondents, not surprisingly).