Abstract, theoretical computing qualifications are turning teens off

The UK's professional body for the IT workforce has called for a more "inclusive" digital literacy qualification at age 16 alongside Computer Science, which it says is "theoretical and demanding." In its submission to the government's review of the school curriculum, the British Computer Society said schools needed a new digital …

  1. KittenHuffer Silver badge
    Headmaster

    WYF!

    When I took my 'computing' exams, about 4 decades ago, the bulk of the mark (85%+) was actually for computing history, and very little (<15%) was for any sort of programming skills. I thought they had 'fixed' that stupidity long ago, but it seems that it's back again.

    My thought always was that they did it that way because they didn't actually know how to teach someone to program. If your brain worked that way then it all seemed sooooo simple, and if your brain didn't work that way then it was never going to make sense.

    1. Scotech

      Re: WYF!

      Ridiculous that this is still an issue. When I was in school 20 years ago, we had the opposite problem - tons of qualifications that amounted to little more than a certificate of competence in using MS Office and Google, but very little of substance for those wanting to pursue a proper career in IT. Now it seems the pendulum has swung back in the opposite direction. How hard is it to grasp that both approaches have their place? We need qualifications that are focused on practical applications of digital technology, with separate skills pathways that can support people to become end-users, administrators or developers.

      And our education system needs to stop treating studying emerging technologies as optional! Today's novelties quickly become tomorrow's essential tools in tech; the field moves at a pace few other school subjects can match, save perhaps for 'modern studies'. I get that it can be hard to figure out which things are fads and which will catch on, but it's surely possible to allow some space in the curriculum to let teachers lead a study on this stuff... Or is lab-based coursework no longer a thing?

      1. blu3b3rry

        Re: WYF!

        Pretty much my experience here (vocational qualification too). Most of our work was office stuff or website design (plus Flash!) using Macromedia software. Unsurprisingly, the majority of that is very obsolete almost 20 years later. Anything useful for an IT career like basic command line functions, diagnosing issues or even just how a PC -worked- was never covered. Probably didn't help that our IT "teachers" until my last year were dragged in from other departments as the school couldn't match the pay offered by businesses....

        1. Scotech

          Re: WYF!

          Snap! I did a L2 BTEC Certificate in Digital Applications course in Y10 & 11, and my school's head of IT also sneakily signed me up to do the GCSE alongside it, as my school offered both options. That BTEC was still the more useful of my two courses - although we used Dreamweaver for our Web design unit, I still got exposed to HTML, CSS, and crucially, JavaScript. Also, while yes, we were using MS Access for the unit on databases, the fundamentals were all there - keys, relationships, normalisation, queries, etc. And as a learning tool, Access wasn't that bad, certainly better than either of Microsoft's modern 'alternatives' they're pushing (Lists and Dataverse for Teams). The GCSE, on the other hand, opened with an essay comparing the features available in Google versus Windows Live Search, then moved on to an essay about how to use a spreadsheet to handle personal finances. I dropped it halfway through that exercise. I later tried to take an A-level course in IT too, and dropped that after it immediately started heading in the same direction. And our IT teachers were at least dedicated members of the IT department... shame they just weren't dedicated teachers too, besides a few good ones who left the faculty between my Y11 & 12.

          1. OldGeezer

            Re: WYF!

            Totally agree with the previous comment; just retired after more than 45 years as a developer. Never went to university, never did any courses, never sat any exams. Taught myself on Apple 2es and Atari 400s and then learnt as I went.

            Some people can naturally program and some can't. Teach them the basics of DBs and data management, Interwebs (HTML/CSS/JS), and teach them to learn (teach themselves). That is all they need because anything else will be out of date and/or obsolete before they have finished learning it.

            1. Martin Gregorie

              Re: WYF!

              Total agreement here as well.

              My computing background was a Chemistry degree, ending with a masters thesis that required me to use a Mössbauer spectrometer, whose only output was 8-track paper tape. This needed to be fed through a computer (an Elliott 503) to produce a usable spectrum plot. As a result, I spent part of the next summer learning Algol 60 so I could modify some of the programs I was using to display and analyse the Mössbauer output. This in turn was useful after graduation, when I found that chemistry jobs were few and far between, but computing ones were everywhere. I ended up joining ICL and learning first 1900 assembler and then COBOL and George 3 command language. After this I never looked back.

              1. ITMA Silver badge
                Devil

                Re: WYF!

                Did you have a hand in CESIL? That abortion that was foisted on many a youngling using the ICL CES books?

                LOL

      2. An_Old_Dog Silver badge
        Devil

        Today's Novelties

        Today's novelties -- and yesterday's novelties -- are too often just that; or, they are hype-driven, bandwagon-carried, poor-to-bad ways of doing things. The Prolog programming language (designed to be used for AI applications), expert systems, C++, Java, cryptocurrencies, non-fungible tokens, generative adversarial networks and large language models, and "AI PCs" are examples of these.

        I can hear the cries now: "But there are millions of lines of code written in C++ and Java! The whole world uses C++ and Java!"

        Yes, there are, and yes it pretty-much does. Yet popularity != merit. Octagonal wheels on a bus or a train might 'work', but I would not ride that bus, nor that train.

        I don't know what sort of theories are being taught in British computer classes, so I can't say how relevant or useful they are. It helps if theories are taught alongside useful applications of them (if any -- I loved learning about Turing machines, but I haven't seen any real-world applications for them), so students can connect the theories they're being taught with real-world consequences. It's useful for students to understand why doing an inner join on a relational database can take a very, very long time.
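
        As a toy illustration of that last point (in-memory 'tables' and Python purely for convenience, nothing to do with any particular database engine): joining without an index means comparing every row of one table against every row of the other, N x M work, while a keyed lookup is roughly N + M.

          import time

          orders    = [(i, i % 2000) for i in range(5000)]         # (order_id, customer_id)
          customers = [(i, f"customer {i}") for i in range(2000)]  # (customer_id, name)

          t0 = time.perf_counter()
          # unindexed join: every order is checked against every customer
          nested = [(o, c) for o in orders for c in customers if o[1] == c[0]]
          t1 = time.perf_counter()

          # what an index buys you: one dictionary lookup per order
          index = {c[0]: c for c in customers}
          keyed = [(o, index[o[1]]) for o in orders if o[1] in index]
          t2 = time.perf_counter()

          print(f"nested loop: {t1 - t0:.2f}s  keyed lookup: {t2 - t1:.2f}s  rows: {len(nested)}")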

        I have met too many Idiocracy-style, rote-trained, button-pushing, PC-operator-drones who believe data naturally resides in "apps" and in spreadsheets, rather than it properly residing in text files and databases, and who have no concept of data flow processing (Unix/Linux/MS-DOS "pipelines"), people who become confused and distressed if the icons on their PC desktops change color or position.

        1. RegGuy1 Silver badge
          Thumb Up

          Re: Today's Novelties

          In 1991 in the central library in Manchester I found a book, it was called The AWK Programming Language. Wow, that was an interesting read. And the skills it taught me have lasted a career. I've never needed to update my programming skills, although I've learned a few others. Microsoft? Fuck off. Get your data into text format and the world is your oyster. (It was for me.)

          1. keithpeter Silver badge
            Windows

            Re: Today's Novelties

            Recovering maths teacher (retired) here.

            'Data analysis' caught my eye in OA and yes, there is scope there for some nice solid conceptual teaching through real-world datasets that will catch the imagination. No chance of coursework no doubt.

            Link below to some past papers in Computer Science GCSE

            https://revisionworld.com/gcse-revision/ict/past-papers/aqa-gcse-computer-science-past-papers

            (Not picking on AQA, other exam boards will be similar as national curriculum)

          2. BinkyTheMagicPaperclip Silver badge

            Re: Today's Novelties

            I've upvoted you, but I wouldn't say I entirely agree. AWK is a decent language, and The AWK Programming Language a lovely book (I have a copy) but AWK is not without its issues.

            Programs often look like line noise (although possibly not to the extent of Perl); gawk is an awful lot more functional than awk; and it's not great at dealing with non-ASCII text, nor at coping with quoted delimiters. However it is powerful and self contained in one executable. Python and Perl on the other hand are an awkward morass of files that may not be already installed.
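
            (For anyone wondering what the quoted-delimiter problem looks like in practice: a field that itself contains the separator. Naive splitting gets it wrong; any CSV-aware parser gets it right. The sketch below uses Python's csv module purely because it's to hand - the point is the parsing, not the language.)

              import csv

              line = '1,"Smith, John","likes awk"'

              naive  = line.split(",")            # splits inside the quoted name
              proper = next(csv.reader([line]))   # respects the quoting

              print(len(naive), naive)    # 4 fields: ['1', '"Smith', ' John"', '"likes awk"']
              print(len(proper), proper)  # 3 fields: ['1', 'Smith, John', 'likes awk']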

            However I fairly recently converted some utilities I would normally write in AWK to Powershell, because it's significantly more accessible to colleagues, and generally Powershell is a viable alternative. It doesn't include NF, so that needs working around, but generally that isn't a huge issue. It'll handle Unicode and quoting with less hassle than AWK, regular expressions are supported, and Powershell is (usually) installed on most Windows systems so it's a decent common denominator.

            Powershell programs are mostly easier to understand (the use of -ne still annoys me, but that's just syntax), and because it's a .NET language it's possible to link in any .NET component you want, including SQLite which can trivially let you manipulate an in memory or on disk spreadsheet just by shipping one DLL[1]. Really the most annoying part of Powershell is that it tries to pretend it isn't .NET to the programmer, and if it was more open and integrated with .NET syntax it'd be easier to use for more complex scenarios.

            If you're targeting Windows systems, Powershell really is worth looking at.

            [1] Yes, I know in theory gawk can do this, and there are in fact some gawk SQLite extensions. However they're a bit of a pain to use, not immediately available, and the .NET SQLite ADO component is linked directly off the SQLite homepage.

            1. RegGuy1 Silver badge

              Re: Today's Novelties

              I agree mostly with your comments but I have managed to keep well away from Microsoft, and so avoided Powershell. However, while Awk does indeed have some limitations, it can shine when you use it with a decent shell, probably bash these days, though I used to use the Korn shell for a lot of my work. One of its biggest limitations is the single quote, but just defining a variable SQ="\x27" was a workaround, along with other tricks such as tr or sed.

              In the 30 years since 1991 I've rarely found the limitations of Awk a problem, and it is present on even basic installations, which I feel makes it essential.

            2. Bebu sa Ware
              Windows

              Re: Today's Novelties

              "[Awk] Programs often look like line noise"

              I wasn't a fan of the pattern {action} form of (n)awk so I have just placed all my C-like code in the BEGIN action (occasionally some in the END action), with the advantage that the Q&D code can be rapidly converted to C if speed is a problem.

              Unfortunately the book for me (in 1979) was K&R The C Programming Language 1/E, later followed by Software Tools in Pascal. ;)

              Fortunately I never really worked as a programmer as the attractions of defenestration beckoned.

              1. Grunchy Silver badge

                Re: Today's Novelties

                "[Awk] Programs often look like line noise"

                That’s regex for me (“regular expressions”).

                Any time I see something that looks like line noise, it’s time to dig out the regex handbook and start deciphering…
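
                For what it's worth, one way to decipher (or avoid writing) the line noise in Python is re.VERBOSE, which lets the same pattern be laid out with whitespace and comments. The postcode-ish pattern below is only a rough illustration, not a validated one.

                  import re

                  compact = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}$")

                  # the same pattern, laid out so it documents itself
                  readable = re.compile(r"""
                      ^
                      [A-Z]{1,2}     # area, e.g. "M" or "SW"
                      \d[A-Z\d]?     # district, e.g. "1" or "1A"
                      \s*
                      \d[A-Z]{2}     # sector and unit, e.g. "1AA"
                      $
                  """, re.VERBOSE)

                  for text in ("SW1A 1AA", "M1 1AE", "line noise"):
                      print(text, bool(compact.match(text)), bool(readable.match(text)))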

            3. Apocalypso - a cheery end to the world

              Re: Today's Novelties

              > nor at coping with quoted delimiters

              Brian Kernighan recently released an updated Awk that natively supports RFC 4180 compliant CSV files. Gawk was immediately updated to remain compatible :-)

      3. Ian Johnston Silver badge

        Re: WYF!

        And our education system needs to stop treating studying emerging technologies as optional! Today's novelties quickly become tomorrow's essential tools in tech

        Yes, schools should definitely have switched to teaching cryptocurrency, NFTs and blockchains, which have turned out to be so important.

        1. Scotech

          Re: WYF!

          Yes, schools should definitely have switched to teaching cryptocurrency, NFTs and blockchains, which have turned out to be so important.

          If students don't study it, how are they supposed to understand why they're not?

          1. Munchausen's proxy
            Pint

            Re: WYF!

            >> Yes, schools should definitely have switched to teaching cryptocurrency, NFTs and blockchains, which have turned out to be so important.

            > If students don't study it, how are they supposed to understand why they're not?

            I'm not sure how to teach it, but I think bullshit detection should come from the liberal arts courses - maybe philosophy, history, and language in some combination.

          2. doublelayer Silver badge

            Re: WYF!

            That works for nearly everything that isn't considered important enough. A good teacher who understands more things can come up with a quick answer for why they're not studying cryptocurrencies:

            "So far, and despite predictions, cryptocurrency is not used very frequently by average consumers or businesses as either a store of value or a medium of exchange. If you end up spending or buying cryptocurrency, you will need some additional information. If you end up building the software that implements cryptocurrencies, you will need a lot of information and practice."

            Or ancient Greek:

            "Most of the stuff that you could read by learning ancient Greek has been translated and is not necessary to read for most people. If you want to be a historian of any of the civilizations of the Mediterranean, you'll likely encounter it at least sometimes if not nearly always and should either learn it or work with someone who has, who you will assist with the ancient language you learned instead. If you're not planning to be a historian or in a related field, modern languages will be more useful than ancient ones."

            Or cliff diving:

            "We don't live near cliffs and cliff diving could kill you and I don't care if you like the sound of it."

            You don't need to study any of those things to learn why not. There are many skills that we have to leave off the curriculum, sometimes because they're not very useful, sometimes because they would be very useful but there is only so much time available and students need a general education, so studying them properly would not leave room for teaching all the rest of the things that could be useful. Mostly we have to balance that out and try to teach people the set of things that would be most useful and that they can actually learn with the time and resources available.

            1. Excused Boots Bronze badge
              Joke

              Re: WYF!

              You just know that someone is going to wade in with “well I have a Masters Degree in 'Cliff Diving’, and I’ve found it invaluable in job interviews”!

          3. Terry 6 Silver badge

            Re: WYF!

            I'd argue that stuff like cryptocurrency should be a unit in computing classes. But taught as an example of computer use, not as a tool for computer use. i.e. On a Wednesday afternoon in March there should be a lesson on this in year 9.

            Not too long ago, here on El Reg, people were arguing that the then ICT course needed to be replaced with coding. Others (like me) were arguing that coding is a nice thing to study - but of limited value for the vast majority of students - who really did need the skills of using computers in everyday life.

            A view I've since had reinforced by two things: 1) invigilating comp sci A levels, where the kids were totally flummoxed if something went wrong with the computers they were using for an exam, and 2) working as a volunteer "Digital Champion" with adults who have insufficient digital skills for everyday life, let alone keeping safe or avoiding scams.

            Most people need to leave school able to use common computer tools to create documents, enter and interpret data, illustrate and demonstrate information, keep their computers working efficiently and safely (if they use a proper computer that is), make informed choices about what type of technology to use, what kind and level to buy and so forth. A grounding in coding should be part of this- the principles far more than the specific language.

            When this gets to GCSE I'd argue that there needs to be a range of (ideally some optional) modules: coding in one of a limited number of languages, web design, graphics, computer hardware, Computers and Society, even History of Computing. Computing is a big subject, but no one area should be allowed to force out the others, whether for vocational or academic reasons. Much of what we teach in school is purely academic. History of Computing and Graphic Design modules are just as valid as learning History or Art and Design for GCSE.

          4. O'Reg Inalsin

            Re: WYF!

            Blockchain is a favorite of academics and an endless source of papers and specialists. I can't see why it's a good idea to promote that.

    2. Cruachan Bronze badge

      Re: WYF!

      Things must have changed (or are different in Scotland) because when I sat my Higher Computing in the 90s (GCSE equivalent) it was mostly programming (in Turbo Pascal IIRC) and when I went to Uni (Strathclyde) it was all programming and maths. There was actually a lot of backlash from the students because the vast majority of courses, despite their names, were just programming in different languages. E.g. the class on Compilers was just programming in C, Object-Oriented Programming was just coding in Eiffel rather than the concepts of OO programming (Java was new at the time and not the dominant language), and most classes used Scheme (a Lisp derivative demo language that was free to license for universities).

      I remember a lot of calls for more classes in computing fundamentals like networking and Operating Systems, so they might have taken those on board and gone too far the other way.

      1. Yet Another Anonymous coward Silver badge

        Re: WYF!

        80s O-level was history. I still know the difference between a chain and a drum line printer but am fuzzy on the precise distinction between a micro and a mini computer - although to be fair so was the teacher.

        Programming was BASIC cos that's all the BBC Micro had

        1. big_D

          Re: WYF!

          We learnt about ferrite core memory!

          1. vulture65537

            Re: WYF!

            It was in the ladybird book.

            1. big_D

              Re: WYF!

              It was in the CSE course book covering the history of computing, from Babbage through to punched cards and terminals...

        2. Torben Mogensen

          Re: WYF!

          "Programming was BASIC cos that's all the BBC.micro had"

          Not entirely accurate. It was all that the BBC came with when you bought it, but you could get Pascal, LISP, COMAL, and several other languages on ROM or loaded from disc.

          But BBC BASIC was not the worst language to use. It had features for structured programming (while loops, procedures, functions, etc.), which were not found in, for example, C64 BASIC or Sinclair BASIC.

          1. Excused Boots Bronze badge

            Re: WYF!

            "Not entirely accurate. It was all that the BBC came with when you bought it, but you could get Pascal, LISP, COMAL, and several other languages on ROM or loaded from disc.

            But BBC BASIC was not the worst language to use. It had features for structured programming (while loops, procedures, functions, etc.), which were not found in, for example, C64 BASIC or Sinclair BASIC.”

            All true, except that they all cost extra money and hence....

            You are right about BBC BASIC, it was considerably ‘more powerful’, i.e. feature-rich, compared with other flavours of BASIC of the time - yes, it could be ‘abused’ and produce sloppy, if working, code. But it was better than most, and fairly ‘understandable’. Personally I preferred PASCAL!

            1. Michael Strorm Silver badge

              COMAL

              We used COMAL on the BBC when I did Standard Grade Computing at school. In all honesty, minor syntactical differences aside (*) it wasn't much different to BBC BASIC.

              As far as I can tell, the original intention behind COMAL was that it was supposed to be BASIC-like, but with support for PASCAL-style structured programming features.

              This probably mattered when it was first created because most (all?) BASICs in the mid-70s only supported FOR..NEXT, GOSUB and (ugh) GOTO. (Something that continued to be the case with the smaller 8K BASICs later included with many early microcomputers).

              However, since BBC BASIC was one of the later- and better- implementations of BASIC it had already pinched those features from Pascal and COMAL was mostly redundant.

              (*) e.g. Pascal-style ":=" ("becomes equal to") for assignment (**) (rather than using "=" for both assignment and equality). And (*gasp!* for those of us used to using it liberally in BASIC), no "GOTO". Which in hindsight was a very good thing - because GOTO is bloody awful and only necessary if you don't have proper structured programming features. However, back then, having grown used to nonstructured ZX81 and Atari BASICs I'm not sure I *really* saw having it taken away as anything more than being forced to do things the boring educational way.

              (**) Something it's a shame that C didn't go for, since using "=" for assignment must have resulted in countless errors over the years when it was inadvertently used instead of "==" for equality (and generally legal in context so it doesn't get caught). IIRC even Dennis Ritchie later admitted that this was a design mistake. Far more so because so many other languages used C-style syntax (which was otherwise great) and thus replicated that bad design decision for compatibility reasons.
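
              For what it's worth, a minimal sketch of how a later language closes off that particular trap - Python simply refuses to compile the assignment-where-comparison-was-meant typo (the snippet below is illustrative only):

                # The classic C pitfall: "=" (assignment) typed where "==" (equality) was meant.
                typo = "if x = 1:\n    pass\n"
                try:
                    compile(typo, "<example>", "exec")
                except SyntaxError as err:
                    print("rejected at compile time:", err.msg)

                x = 1
                if x == 1:          # the comparison that was actually intended
                    print("x equals 1")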

              1. agurney

                Re: COMAL

                WOW, that brings back memories .. there's a fair chance that you used one of the BASIC and COMAL open learning courses that I wrote for TVEI way back when (1987-1988). The language and environment (compiler, interpreter, whatever) wasn't important, but the approach I took to logic and code structure was.

                Coincidentally I'm retiring today after several decades in the industry (real world, not Scottish education) with no regrets.

                1. Michael Strorm Silver badge

                  Re: COMAL

                  Sorry to say that I can't say from memory what specific course materials we used at school back then- to be fair it was 35 years ago(!)- but I hope you enjoy your retirement!

          2. vulture65537

            Re: WYF!

            6502 assembly could be programmed from the BASIC UI.

            1. Blitheringeejit
              Happy

              Re: WYF!

              That was my favourite feature (along with easy access to properly buffered hardware I/O) - and the assembly process was neatly integrated with the BASIC being interpreted, with variables passed seamlessly. A really neat way of getting snippets of fast execution out of slower machines while still writing the UI in BASIC - I'm surprised more home computers didn't follow suit. Memories of writing an 8-bit audio sampler in BASIC/assembler on a 2MHz 8-bit CPU - and you try telling the young people of today....

      2. Ian Johnston Silver badge

        Re: WYF!

        Highers are the equivalent of AS levels. If it was GCSE level it would have been a Standard Grade (now National 5, once O Grade)

        1. Cruachan Bronze badge

          Re: WYF!

          Shows how well I remember it!

          Think there were some fundamentals in Standard Grade, and the programming was in BASIC because as someone mentioned, it was on BBC Micros. Might have been some COMAL as well as there were a lot of similarities.

          1. Michael Strorm Silver badge

            Re: WYF!

            I read your post literally five minutes after posting this!

            > Might have been some COMAL

            I did my Standard Grade Computing in the late 80s/early 90s and we used COMAL, so if that rings a bell you probably did too?

            > there were a lot of similarities

            Indeed. One of the few things I remember about the BBC Micro version of COMAL was that- minor syntactical variations and loss of "GOTO" aside- there was very little difference from BBC BASIC. Most likely because BBC BASIC already included the structured programming features that were meant to differentiate COMAL from (earlier, non-structured) BASICs.

            1. Cruachan Bronze badge

              Re: WYF!

              Thinking about it I'm pretty sure it was COMAL at school, but I used BASIC at home where I also had a BBC Micro and in those days typing in games yourself from a book or a magazine was a thing. A very frustrating thing due to it being very hard sometimes to tell if the printed symbol was a comma or full stop etc and there being no internet to check the errata, but it was a thing.

      3. werdsmith Silver badge

        Re: WYF!

        Computing syllabus for GCSE now - they use Python, as it's a good deal more relevant than Pascal is today or ever was:

        3.1 Fundamentals of algorithms

        3.2 Programming

        3.3 Fundamentals of data representation

        3.4 Computer systems

        3.5 Fundamentals of computer networks

        3.6 Cyber security

        3.7 Relational databases and structured query language (SQL)

        3.8 Ethical, legal and environmental impacts of digital technology on wider society, including issues of privacy

        It's a useful syllabus and has little to do with history. The problem here is that it seems a bit heavy on pure computing and there is no place for general workplace IT - but surely this is covered by the SEPARATE ICT GCSE, whose syllabus is:

        1 Types and components of computer systems

        2 Input and output devices

        3 Storage devices and media

        4 Networks and the effects of using them

        5 The effects of using IT

        6 ICT applications

        7 The systems life cycle

        8 Safety and security

        9 Audience

        10 Communication

        11 File management

        12 Images

        13 Layout

        14 Styles

        15 Proofing

        16 Graphs and charts

        17 Document production

        18 Databases

        19 Presentations

        20 Spreadsheets

        21 Website authoring

      4. Bebu sa Ware
        Windows

        Re: WYF!

        "just coding in Eiffel" that must have been useful. :)

        Actually I was taken by the language Eiffel (Sather) and read all Meyer's books, but it never really took off. I recall a lot of language lawyering over contravariance v covariance and other nonsense on the newsgroups at the time, which last time I looked was being channelled by Rust.

        For us the OO language was Simula, but none of the programming assignments had a skerrick of OO, so it was effectively just Algol coding.

        I only really understood OO when I worked out how virtual functions in C++ were implemented, with hints from Brad Cox's "Object-Oriented Programming: An Evolutionary Approach", but I did grok generics from a Uni course that used Pascal Plus, given by one of the language's creators.

        Scheme was after my time but it's a decent teaching tool (I still retain my copy of Abelson and Sussman.)

      5. Anonymous Coward
        Anonymous Coward

        Re: WYF!

        > Things must have changed (or are different in Scotland)

        The educational system in Scotland (*) has always been separately run and distinct in terms of qualification and structure from the one in England. It doesn't follow that it's necessarily doing the same thing- or heading in the same direction- at the same time even if it sometimes does (e.g. GCSEs and Standard Grades addressing the overly academic/rote memorisation criticism of O Levels and O Grades respectively in the mid-to-late 80s).

        > when I sat my Higher Computing in the 90s (GCSE equivalent) it was mostly programming (in Turbo Pascal IIRC)

        When I did it circa 1991-2, my experience was that it seemed to be very memorisation and fact-heavy. Though that might have been a hangover from the old days (**) as my Computing teacher (who actually apologised for that) explained that they were overhauling the course from the following year (1992-93 on) to address that problem.

        (*) Aside from some private schools that use English qualifications like GCSEs, A-Levels and whatever.

        (**) In hindsight, I'm guessing that means we were the last year doing the non-"revised" Higher dating back to the O-Grade era. The Highers were modernised around that time, presumably for similar reasons to the replacement of O Grades with Standard Grades, and to align better with them. I'd done Standard Grade Computing, which would partly explain why it felt like such a change. (Though that being so, I don't get why they wouldn't have aligned the introduction of the old/new courses to avoid that).

    3. nematoad Silver badge

      Re: WYF!

      ... if your brain didn't work that way then it was never going to make sense.

      And my mind does not work that way and programming has never made sense to me.

      To concentrate on programming as the only thing that matters in IT is missing the mark.

      I got a post-graduate diploma in Systems Analysis and Design and then went on to a career in desktop support and systems administration. Some of the most awkward customers we had when doing desktop support were the programmers who worked next door. They seemed surprised when their fiddling with the systems sometimes caused the whole thing to fall over, and were resentful when they were charged our standard call-out rate to put things back together. They were good at what they did, i.e. programming, but did not have a clue as to how their PCs worked.

      As Oddball in Ryan's Heroes said "I don't know how they work, I only ride in them."

      Broaden things out. Man cannot live by programming alone in the IT world.

      1. nematoad Silver badge

        Re: WYF!

        As Oddball in Ryan's Heroes...

        I got that wrong of course: It should be Kelly's Heroes.

        Old age is taking its toll.

        1. collinsl Silver badge

          Re: WYF!

          I thought it was Ryan's 11

        2. The Oncoming Scorn Silver badge
          Joke

          Re: WYF!

          Just be glad it wasn't Hogan's Heroes.

      2. doublelayer Silver badge

        Re: WYF!

        I maintain that a good teacher and a student who is trying can do a lot to make programming make sense even if it didn't come instinctively. There are three interrelated reasons it doesn't happen when they try to expand it too far:

        1. They get teachers who don't program well and teachers who don't program at all but took a four-week intensive course over the summer to teach it. Many of those teachers are trying, but they don't understand their subject, so they can't teach it well. Teaching programming takes a lot of background knowledge, because you constantly need to get things back onto the rails. For example, a good teacher needs to understand compiler error messages, because students will do something weird and the compiler is going to complain about it. You may not bother teaching introductory students what l-value and r-value mean, especially as they're both sort of types but not types you create deliberately, but they'll need to understand that because compilers complain about them. The teacher needs to understand why and explain it in simpler forms so the student understands what is wrong and some basic level of why.

        2. Not all students are going to try. Of course, not all students are going to try at all the other subjects either. The problem is that programming properly requires learning a lot of little things and putting them all together, and if you miss a couple of them, nothing works. Students who don't want to learn might learn something else by accident, but they will not learn programming of any quality that way.

        3. People who want to teach programming to everyone sometimes also want to teach programming far too early without doing anything else. Without knowing how to drive the computer a little, programming isn't going to get you very far. You can probably teach a mostly unfamiliar student how to calculate some numbers, but they'll be missing all the rest of the reason that programming is powerful. That won't help them learn, it won't keep their interest, and it won't train them on using a computer by proxy.

    4. heyrick Silver badge
      Unhappy

      Re: WYF!

      When I was doing CS (in 1988), it mutated into IT because the new teacher wasn't qualified to teach CS. The utter pointlessness of the man trying to explain what a "database" was, while just the term before our project had been to write one from the ground up (BASIC, BBC Micro, etc).

      A few years later I did RSA IT at night school (needed the paperwork). It was basically teaching us how to use MS Works (Works, not Word). The teacher wasn't impressed when, after she had explained the process for renaming a file and wanted us to change the extension of a group of them, I dropped to DOS and entered something like "rename *.old *.txt" and the lot was done. So I got a mark for completing the assignment, which was then taken away for not doing it in the demonstrated manner.

      Oh, and for an interview with a recruitment agency, I did rather poorly with the typing test, because they weren't using real Word, they were using something with a weird name (Mavis Beacon?) which, critically, did not support keyboard shortcuts. So I had to stop typing, grab the mouse, click an icon, then restart typing. Too many context switches. I asked to use actual Word and my speed shot up. The interviewer seemed to think I was channelling some sort of magic.

      So this does not surprise me at all. For too long it's been "can you use <some Microsoft product>" rather than "can you process text / define what a string is / tell the difference between two dates / know obvious bollocks when you see it / etc". See icon.

      1. Peter Gathercole Silver badge

        Re: WYF!

        As I understand it, Mavis Beacon was a real person and a big name in the history of training what used to be called secretaries, people (normally women) who needed good typing and shorthand skills.

        Whilst there were several computer packages to teach typing using her name, they were really just courses aimed at future typists, not at people who used keyboards for other purposes. So of course it didn't teach anything about keyboard shortcuts. That was not the point of the packages. It was meant for people mainly typing plain prose on something that was more like a typewriter than a computer.

        Now of course, the world is a different place. I'd like to see a company nowadays that didn't expect their managers to type their own correspondence. There may be board-level managers who still have what are sometimes still described as secretaries, but in reality, they will be what were once called Personal Assistants, and typing, whilst still required, will be a less important feature of their work.

        Modern typing courses (and I admit I've not actually used any) are probably much more suitable for teaching a wide variety of keyboard skills more suited to more recent use. But teaching using keyboard shortcuts is still not really possible, as not all packages use the same shortcuts, so if you do, you get accused of just teaching individual packages, which is one of the most common complaints levied against a lot of IT training.

        And teaching to type prose is not going to be completely applicable to someone writing C++, Java or Python.

        I actually saw the BBC micro version of Mavis Beacon's typing tutor, and I have to admit that I thought it was just terrible. A reasonably competent typist could actually type much faster than the fastest that this package could run the exercises. I never really understood why, when the BBC micro was so fast for its time, this package was so slow. I guess it was just poorly written. Maybe I'll find it in my archive, and pull it apart sometime when I'm bored (it was written in Basic) to see why it was so slow.

        1. RegGuy1 Silver badge

          Re: WYF!

          Learn to touch type. It's not difficult and the skill will apply in so many areas (such as typing a response on el Reg). I like to live in vi, and my typing skills help lots.

          I learned when I was 11, on a device, I believe it was called a typewriter; longer documents could be tiring. Years later I discovered a package, Mavis Beacon Teaches Typing. It had a useful exercise where you had to type the text that appeared at the top of the screen. If you made a mistake a bug (fly) would appear squashed on the screen. The more errors you made the more the screen (which simulated the forward view from a moving car) would get filled with more and more squashed bugs. Quite a fun activity and helped to hone my typing skills again. (Of course by this time I'd moved on from Tipp-ex to just using the backspace!)

          1. JamesTGrant Bronze badge

            Re: WYF!

            You can tell the difference between a normie and a vi/emacs user’s keyboard…

            A normal person has a bit of wear on the letters on their keyboard but only in the middle.

            Vi/emacs have pristine letters but all the keys around the edge of the keyboard are worn away.

        2. BinkyTheMagicPaperclip Silver badge

          Re: WYF!

          Mavis Beacon is a marketing invention, not a real person. It's quite a decent typing package though. Its intention is to teach touch typing, not application proficiency.

          If you do think Mavis Beacon (on the PC, never tried it on home micros) is poor, rest assured all the alternatives are substantially worse!

          1. Michael Strorm Silver badge

            Re: WYF!

            Yeah, I learned to touch type using Mavis Beacon in the late 90s- though I could already hunt and peck at a decent speed- and it did a pretty good job at that. Would definitely have recommended it. (That was 25 years ago, so no idea how the most recent versions compare).

            > Mavis Beacon is a marketing invention, not a real person

            Apparently the original "Mavis Beacon" was just a stock photograph of a model who was paid $500 at the time (though AFAICT later versions swapped that out for another photo of someone else).

          2. Peter Gathercole Silver badge

            Re: WYF!

            Hmm. I stand corrected!

        3. Phones Sheridan Silver badge
        4. Terry 6 Silver badge

          Re: WYF!

          Specifically "Mavis Beacon Teaches Typing". Note "typing" not keyboard skills. Mavis had no interest in shortcut combinations, nor use of the fn keys, and so on.

          1. MachDiamond Silver badge

            Re: WYF!

            "Specifically "Mavis Beacon Teaches Typing". Note "typing" not keyboard skills. Mavis had no interest in shortcut combinations, nor use of the fn keys, and so on."

            I've never needed to learn to touch type function keys. Key combos I'm pretty good at but learning to type letters and numbers is the core that everybody can use so getting that down well is the most important thing.

      2. Doctor Syntax Silver badge

        Re: WYF!

        "The teacher wasn't impressed when, after explaining the process to rename a file and wanted us to change the extension of a group of them, I dropped to DOS and entered something like "rename *.old *.txt" and the lot was done. So I got a mark for completing the assignment, which was then taken away for not doing it in the demonstrated manner."

        That was the point where you should have demanded to have your course fees refunded.

      3. Yet Another Anonymous coward Silver badge

        Re: WYF!

        >because the new teacher wasn't qualified to teach CS

        NO CS teacher ever was. Ours got the job because he had seen a computer through a window on a maths course at poly

    5. big_D

      Re: WYF!

      When I did CSE Computer Studies, my teacher said on the first day, "I will be taking the exam with you!"

      He was new to the subject and knew some of the history, but very little about the practical side.

      2 of us had computers at home, luckily both Commodores, so we understood the PETs we had to use in class. I had already been programming my VIC-20 in BASIC and machine code for about a year before I started the CSE course; the other pupil in the class had a PET at home and knew even more.

      Whenever there was a technical question the teacher just pushed it off onto one of us!

      The worst was that we didn't use BASIC for the course work; we had to write programs in CESIL (Computer Education in Schools Instruction Language), via an interpreter written in BASIC that ran on the PET. It emulated punched card systems, so you wrote the code then lines and lines of data at the end, which was the input into the program as it ran.

      I actually managed to write an adventure game in it, but it was a bit silly, because you had to put in all of your commands and moves before you ran it, but it worked!

      When I got to college, I had to write a program to calculate change (minimum number of coins) for a given value. We had an hour and ten minutes; I was finished in about 10, so, since the college used the same model of PET, I played around with machine code to pretty it up. Drawing windows on the screen, input using an 8x8 character matrix to make large characters for the input, "piles" of coins for the output... The lecturer took one look at the screen and said, "I never knew you could do that with a computer!" DOH!
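
      (For anyone curious, the core of that exercise is only a few lines - here's a rough sketch in Python, since that's what the GCSE uses now. A greedy pass works for standard UK denominations, though it wouldn't for an arbitrary coin set.)

        def change(pence):
            # largest coin first; divmod gives how many of that coin fit and what's left over
            result = {}
            for coin in (200, 100, 50, 20, 10, 5, 2, 1):
                count, pence = divmod(pence, coin)
                if count:
                    result[coin] = count
            return result

        print(change(287))   # {200: 1, 50: 1, 20: 1, 10: 1, 5: 1, 2: 1}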

      I was supposed to be the one there to learn! Don't get me wrong, I'm not some hot games programmer or Linus Torvalds clone, I was a competent programmer, but nothing special, but my flabber was gasted at how little the teachers and lecturers actually knew about computing and especially programming!

    6. steviebuk Silver badge

      Re: WYF!

      True. Problem was, for me, in the late 80s when I started high school we had a room full of BBC computers but it was 2 boys to one BBC. If you were with a dick, you never got a go. Only lasted about 2 years and they all got removed. Then in college in the 90s we got taught old languages like COBOL, which we argued all the time I was there were pointless as no longer used. The only use it would have been for was being a consultant making lots of money where banks needed to move away from COBOL and there were no COBOL programmers left.

      1. big_D

        Re: WYF!

        I learned COBOL in the late 80s and thought it was dumb. Then in my first job, after sorting out a corporate reporting system written in BASIC by FORTRAN programmers and maintained by a COBOL programmer (who it seems never bothered to even read the manual - no For...Next, just:

        10 a = 1

        20 do something

        30 a = a + 1 : if a < 20 go to 20

        ), I was moved onto the corporate bookkeeping system, which was written in COBOL, then our personnel system and the ERP system we used from Powell Duffryn, then lent out to PD to work on their ERP system (Protos 2000), which was written in COBOL. Then back to base in the early and mid 90s to do the Y2K fixes to the bookkeeping system and Protos 2000.

        Whilst I did a lot of assembler, C, Pascal, 4th Dimension, VB, Cognos etc. programming during that time, I was constantly pulled back into teams to sort out some COBOL project or another. COBOL was actually a very nice language, at least you felt like you had accomplished something when you went home, because you had written 2 reams of code that day, as opposed to the C programmer who had probably only written 20 lines of code to do the same thing. :-P

  2. Mike 137 Silver badge

    A fundamental error

    "Much of the content, such as CPU functionality and fetch-decode-execute cycles, is abstract and challenging for students, overshadowing practical exposure to emerging technologies like AI"

    We already have a general population of "users" who rely on (presumably better informed) others to define and provide the technologies they make practical use of but understand little or nothing about. Unfortunately, it is from that population that successive echelons of the supposedly better informed are largely drawn, so the societal body of real expertise declines, and with it the quality and reliability of the technologies.
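
    (As an aside, the fetch-decode-execute material the quote calls abstract can be made fairly concrete. A toy sketch in Python - the four-instruction machine below is invented purely for illustration, it is not any real instruction set:)

      program = [
          ("LOAD", 5),     # put 5 in the accumulator
          ("ADD", 7),      # add 7 to it
          ("PRINT", None), # show the accumulator
          ("HALT", None),  # stop
      ]

      pc, acc, running = 0, 0, True
      while running:
          opcode, operand = program[pc]   # fetch (and decode, in this toy)
          pc += 1
          if opcode == "LOAD":            # execute
              acc = operand
          elif opcode == "ADD":
              acc += operand
          elif opcode == "PRINT":
              print(acc)                  # prints 12
          elif opcode == "HALT":
              running = False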

    "It also pointed out that it is possible to pass the GCSE Computer Science course while doing very little – if any – programming on a computer"

    Perpetuating the myth that programming is the primary computer expertise. But someone has to design the hardware, develop the new languages and protocols, ensure security and robustness, and a host of other essentials. A nation of coders who don't understand the systems they're coding on or for is at a huge disadvantage when it comes to developing systems with increasingly far reaching societal impact.

    So I concur there should be two pathways, but it would be a huge mistake to deprecate computer science in favour of mere "digital literacy", even if that includes user level practice on "AI".

    "The GCSE contained out-of-date content about networks and internet protocols that could be removed from the specification to make way for more exciting material, Adamson said"

    I'm not at all sure that "exciting material" is the best criterion for what constitutes sufficient baseline knowledge for potential practitioners in a highly technical subject.

    "a study [PDF] by King's College London, the Nuffield Foundation, and Reading University also recommended broadening the GCSE curriculum and better teacher training and professional development in the subject"

    So the current comp sci syllabus is inadequate[1] and the teachers aren't sufficiently competent in the subject. Does this explain, at least to some extent, why the students are avoiding or dropping out of the subject? Maybe those deficiencies are the first things we should fix.

    [1] I've taught on such courses and commonly found the (nationally ratified) syllabi patronisingly shallow and consequently boring to students. They generally needed deepening rather than broadening. We had to surreptitiously break the rules in order to impart useful knowledge and keep students' attention.

    1. Caver_Dave Silver badge
      Boffin

      Re: A fundamental error

      When I was studying my 2nd year of 'A' level Computer Science, I spent most of my lunch times teaching the Maths teacher who was teaching the 'O' level course!

      When I studied IT at (a now) University in the mid '80s, we did the full HND courses for Analogue Electronics (right down to the doping in transistors), Digital Electronics (including making our own simple controller processors from 74 series TTL), Telecommunications (including aerial design and fax protocols), and software (Z80 assembler, Pascal and software design) - a proper broad understanding.

      But the most useful course for me was Psychology. It gave me the skills to be able to elicit requirements from those who not only did not know what computers were capable of, but also did not know how their own manual systems worked well enough to describe them.

      I worked for a computer dealership immediately after the course and had a reputation throughout the local business community for first 'fixing' and documenting their manual systems and then selling them the correct computer solutions a couple of years later when they actually needed it. That may have initially hurt our bottom line, but after a couple of months my boss was being praised at all the local business events, raising many more leads than normal advertising would have done.

      Looking at all the failed large scale projects, I don't think that this level of analysis is performed any more.

  3. b0llchit Silver badge
    Holmes

    Well, if you haven't made blinky as (one of) your first project(s), then you will never understand that computers are just plain stupid calculators instructed to do many simple things very, very fast, so the machines can baffle the unknowing and appear intelligent to the uneducated.
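
    (No LED to hand? A desktop stand-in for blinky, sketched in Python - toggling state on a timer is all there is to it, which is rather the point:)

      import time

      led_on = False
      for _ in range(10):
          led_on = not led_on                 # toggle the "LED"
          print("LED ON " if led_on else "LED OFF", end="\r", flush=True)
          time.sleep(0.5)                     # one full on/off blink per second
      print()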

  4. Tony Gathercole ...

    Another "R" - suggestions

    I've long believed that "Digital Literacy" needs to be treated the same way as the three "Rs" - Reading, wRiting and 'Rithmetic.

    Much in the same way that (in my day at least) English Language O-level/CSE/GCSE was about practical communications skills and English Literature was more focused on the depths of the subject, so Digital Literacy needs to be about dealing with the everyday use of technology, and there can then be a separate, more in-depth study for Computer Science (or whatever you want to call it).

    These days it's barely possible to survive without the ability to work technology - and my guess is that there are few if any mainstream kids who can't - but formalising it and making sure that good practice is drilled into them seems to me to be totally necessary, as a subject as compulsory as using written and spoken English (your primary language may vary).

    So, any suggestions for what this fourth "R" could be called?

    1. Neil Barnes Silver badge

      Re: Another "R" - suggestions

      "Rats" - as in, "Rats, another fatal error!"

    2. Scotech

      Re: Another "R" - suggestions

      I'd actually expect it to be incorporated to a degree into all of those other subjects:

      Reading - this is an obvious candidate, as one of the best things about computers is their ability to help us organise, search, parse and reference vast amounts of information. When I was in school, we were taught to do things like scan a text for key terms to find relevant information quickly, use the table of contents, index, glossary and references, etc. All of these tasks can be done more quickly and efficiently with the aid of computers, and it could even be extended to include one of the very few things LLMs are actually good for - extracting and summarising relevant portions of text from a larger body - plus, you then get to teach the importance of critical reading and checking your sources...

      Writing - Typing skills and word processors, 'nuff said. Who writes anything by hand anymore? I regularly go for weeks at a time between hand-writing things, and I appreciate I might be an extreme case, and that it's still a vital skill to teach, but even when I was in school, the majority of writing was done on a PC, and that was two decades ago. And then there's the plethora of digital writing aids out there, including (whisper it, so we don't upset the cranks like me!) AI. Writing, these days, is an IT skill.

      Arithmetic - Do I even need to argue this one? We already had calculator exams well before I was in school, and perhaps there's space here for a segment looking specifically at the use of more advanced stuff like Python scripting to solve maths problems? Would seem a good avenue to expose kids to coding, at least.
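
      (The sort of thing I mean - a few lines of Python brute-forcing a result you'd normally be asked to prove algebraically, e.g. that the sum of the first n odd numbers is n squared:)

        for n in range(1, 11):
            total = sum(2 * k - 1 for k in range(1, n + 1))   # 1 + 3 + 5 + ... up to the nth odd number
            print(n, total, total == n ** 2)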

      The key thing here is that digital skills aren't really a separate thing. We need to teach kids to approach and think about everyday tasks and problems in a digital-first context, not keep treating digital as a separate class of skills bolted on top of classical approaches.

      1. Tony Gathercole ...

        Re: Another "R" - suggestions

        Yes, I have a lot of sympathy for this approach too - if done effectively and universally. Indeed, it has to be considered a fundamental building block for most if not all other subjects.

        However, are teachers being trained (and re-trained) to be successful in delivering it?

      2. Peter Gathercole Silver badge

        Re: Another "R" - suggestions @Scotech

        On your mention of calculator exams, when I went through school (and probably even more so for my brother), there was a thought that using calculators was actually going to damage the understanding of arithmetic, and tools like logarithms, which were taught as part of basic maths, but which then went on to become hugely important as tools for more advanced topics like calculus. So many maths courses actually prohibited the use of calculators when I started secondary school.

        I actually feel the same way about the various aspects of computing now. Too often I come across people who have absolutely no concept of the fundamentals of computing, but know how to use the latest IDE and management tools, and as such become programmers and administrators who can do what they need to do, but never learn how to do it efficiently.

        The vast majority of people using computing devices really don't need to understand that much, but we do need to have sufficient numbers of people who do know more than just using the things.

        One of the biggest frustrations about teaching a subject which you might call Computer Science is that beyond the fundamentals, it is actually a very fast-moving subject. This is the major reason why taught courses are generally regarded as poor, but this is because it can take at least 2-3 years to develop a syllabus, the teaching material and the assessment mechanisms, and that is before you start trying to train the trainers. It's really no wonder that by the time you start teaching something 'modern' it's already out of date. Add on to that the perpetual problem of keeping the trainers current, and the whole thing is a recipe for frustration. I mean, five years ago, could you really have predicted the sudden rise in subjects like LLM AI and tools like ChatGPT? Academically, these are very little like what would have been taught as AI a decade ago, which is where the skills to develop new courses come from.

        1. Scotech

          Re: Another "R" - suggestions @Scotech

          I completely understand the concern about people not understanding the fundamentals or underpinnings before moving on to abstractions. I came into an IT career the long way round - after becoming disillusioned in school due to the terrible way it was taught at the time, I went towards engineering, then fell out of love with that too, and took up a career in the voluntary sector, where my skills saw me slowly drifting back towards IT again, specifically Power Platform, MS's low-code stuff. I regularly encounter people who struggle to understand that these tools aren't built on thin air; there are logical underpinnings that you have to dig into and engage with in order to solve real problems, and if you don't understand them, you'll never understand the capabilities and limitations of the tools you're using.

          One thing I've been glad to see in my own area of specialism is that Microsoft are finally putting in the work to bridge the gap between the people they call 'Makers' (people who know how to push buttons in the low-code interface to make stuff happen, without any real understanding of how it works) and their 'Developers' (people who work in code and infrastructure and don't, or barely, touch low-code tools at all). This is a vital niche I've been glad to see them working to fill, but it's just one product area in one company, and I think we need to see more effort being put in across the board by companies to peel back the veneer and let people see underneath to what their products are actually made of if they really want quality candidates coming into their recruitment pipelines.

          Regarding the pace of changes, that's where I feel the role for coursework comes in. Part and parcel of IT as a field is keeping up with the constant changes coming at breakneck pace. Students should be expected to do an independent research assignment where they choose an emerging technology and discuss it in detail, or better yet, do a comparative analysis assignment where they discuss emerging technologies at the start of a course, then produce another analysis at the end in the context of the original, discussing how those technologies have evolved. In most fields, that wouldn't work, but in IT, 18 months is a long time - sufficient for something of note to have changed.

        2. Tony Gathercole ...

          Re: Another "R" - suggestions @Scotech

          A first I think! Responding to a comment by my brother responding to a comment referencing me!

          Four years before his time, calculators were barely a thing as I went through (the same) grammar school. I think there may have been some - but primarily used for the novelty effect (what can you spell with numerals upside-down). Anyway - totally excluded from any exams without question. Actually, thinking more about it, I do recall in 1975 (after an enforced "gap year") spending a large proportion of my savings on a high-end Texas calculator prior to going up to Leeds University.

          At University I studied "Computational Science" (don't ask, but I was very much better at the computing elements than the Computational bits - I still have flash-backs to the "Numerical Solution of Ordinary Differential Equations" exam in the summer of 1977, where my response really wasn't much more than my name at the top of the paper [but see below]). But I was heavily involved in running a Student Union campaign to establish a clear set of guidelines for using calculators in exams. I don't remember the exact details, but I think the key point for me was to establish a clear and consistent policy for where and when their use was allowed and when not.

          [from above] Numerical Solutions of ODEs: Failed the exam - but got a summer job with the department building on previous work to create a visualisation package that allowed users to define ODEs in a BASIC-like language, associate them with a library of solution methods and graphing facilities to allow students to see how the solutions worked. I believe it was used (and hated) by two or three generations of students until the Leeds DECsystem-10 was decommissioned. See, analysis of the requirements, design, compiler writing, exploitation of system facilities, (primitive) UI and IDE, documentation, user training etc. etc.

    3. OJay

      Re: Another "R" - suggestions

      cRitical Analysis - i.e. trust, but verify....

    4. xanadu42

      Re: Another "R" - suggestions

      Reasoning...

      Some may say "Critical Analysis"...

      Paraphrasing the various references I have read for the meaning of "Reasoning" it is basically the process of forming/drawing conclusions, judgements, or inferences from facts, evidence or premises

      Paraphrasing the various references I have read for the meaning of "Critical Analysis" it is basically the process of understanding the meaning of information and understanding the strengths and weaknesses of the information...

      That said it seems that "Critical Analysis" relies upon "Reasoning"

      So "Reasoning" would be a good "Fourth R"...

      And, if the Education System is working properly (??), the basics of teaching of Science, English and Mathematics should cover the Technology aspect...

      Eek - the STEM that everyone in governments around the world seem to talk about as being the "Basics of Education"

      Problem seems to be that, for many years (at least 2-3 decades, maybe more), children have not been properly taught all of the beauty of Science, English or Mathematics (due to cuts in spending on Education) and are, as a result, falling into the traps of the technology they use without any understanding of the pitfalls and think, in their ignorance, they are better-educated in "tech" than their parents...

      1. doublelayer Silver badge

        Re: Another "R" - suggestions

        The E in STEM is supposed to be engineering, not English. It's a set of related* subjects, not everything that people must learn.

        * The "related" is something that people who try to slide an A (art) into it should keep in mind.

    5. Terry 6 Silver badge

      Re: Another "R" - suggestions

      The 3 Rs were actually originally "Reading.......Reckoning". The "Rithmatic" was ascribed later in the same sort of way that "backronyms" are created.

      But reckoning was the common term before the 19th C.

    6. Bebu sa Ware
      Coat

      Re: Another "R" - suggestions

      So, any suggestions for what this fourth "R" could be called?

      Benny Hill's croft eels chap would proffer Ritelacy which wouldn't now pass muster.

    7. Apocalypso - a cheery end to the world
      Joke

      Re: Another "R" - suggestions

      > So, any suggestions for what this fourth "R" could be called?

      Reading, Writing, Regurgitation. Oh wait, that's just for the exams...

  5. mpi Silver badge
    Thumb Up

    Wow, so many surprises!

    So the next generations do NOT in fact automagically turn into "digital natives" with digital superpowers as was foretold? What an amazing surprise! What's that? Instead many of them struggle to understand even the most basic concepts, like file systems or programs, because they only ever got to experience the "digital world" through the confines of smartsomething devices that were designed to maximise the output of an attention economy?

    Well color me shocked. Flabbergasted even.

    And what's that? Curricula designed after computer SCIENCE (which is, in fact, a highly specialized scientific discipline) are NOT a good way to teach the general populace about how computers should be used? Amazing! Who would've thought that? Oh, btw. did you know that making people study Oceanography doesn't, in fact, get you more people who can swim? I know, right, it's almost as if there is a huge difference between these topics or something.

    1. Martin Summers

      Re: Wow, so many surprises!

      It gets worse. A number of generations of infant and primary school kids now only use iPads in their education rather than PCs. Can't wait until that severely fucks things up for employers.

  6. 42656e4d203239 Silver badge

    plus ça change...

    Now here is a topic I have thought on quite a lot over the 40 years since my Computer Science A level (which I understood the bare minimum of - got an E)...

    I distinctly remember griping at the time about how un-modern the course was and how useless it was going to be - why did I need to know about tape punches and teletypes? Who cares about how a computer stores a floating point number? Ah the innocence of callow youth....

    Looking back, with 20-20 hind-sight, I have come to the conclusion that you need both the whizzy new and exciting stuff, to engage those who as yet know nothing, and the actual mechanics of how things actually work/used to work as well to make a computing course interesting and relevant.

    In later years I have worked with people (budding/actual sysadmins) who don't understand the why but just know what button to click, which means they can't understand why a particular button won't do what they think it should and also look on incredulously as I get things working again because I can see the bigger picture.

    Trouble is that there are only a finite number of hours in which to teach a course, and finding the middle ground will be tricky to say the least.

    Good luck to those who have to try to sort it out.

  7. Pete 2 Silver badge

    Doing the impossible

    > Computer Science, which it says is "theoretical and demanding."

    When you look at the course content, that sounds like a fair criticism.

    Unit 1 - Fundamentals of algorithms

    Unit 2 - Programming

    Unit 3 - Fundamentals of data representation

    Unit 4 - Computer systems

    Unit 5 - Fundamentals of computer networks

    Unit 6 - Cyber security

    Unit 7 - Relational databases and structured query language (SQL)

    Unit 8 - Ethical, legal and environmental impact of digital technology on society

    They seem to have managed to turn what should be an interesting, rewarding and creative subject into a fusty, irrelevant and dull one.

    Something that I would have thought was impossible. But I suppose with years of practice, the course creators have managed it!

    1. Yet Another Anonymous coward Silver badge

      Re: Doing the impossible

      Wait till you see a Law curriculum.

      There is almost nothing on it about leaping out of a handbrake-turn Jag and shouting "you're nicked" to a bunch of blaggers in a Transit.

      1. Androgynous Cupboard Silver badge

        Re: Doing the impossible

        Presumably also no courses in sharp tailoring and how to win over a jury with an emotional summary? I can't believe TV has been lying to me all these years.

    2. tiggity Silver badge

      Re: Doing the impossible

      @Pete 2

      Plenty of opportunity to make some of those interesting e.g.

      Unit 1 - Fundamentals of algorithms

      No avoiding some maths, but you could let the kids know about the (often bad) effects of algorithms, from influencing what they see on social media to the ads they are served

      Unit 2 - Programming

      Creating something interesting is limited mainly by the programming skills and creativity of a pupil

      Unit 6 - Cyber security

      Create some honeypots, let them attack them. Turn the tables, they have to defend a system from attacks. Get them to look up some of the great content on real world red teaming

      Unit 7 - Relational databases and structured query language (SQL)

      Bobby Tables (unless you did it in 6!) - see the sketch at the end of this comment

      Unit 8 - Ethical, legal and environmental impact of digital technology on society

      All sorts of interesting avenues to explore, from cyber stalking, surveillance society, lack of privacy, dubious practices (ranging from child labour, unsafe conditions etc, through to environmental damage) involved in some raw materials used in electronics, could add in the latest tulip boom (AI) & potential environmental damage from extra power generation for all the extra processing (never mind initial training being processing (power) intensive)

      So a good teacher could add lots of interesting stuff, though there may well be the usual problem of timetabling, where the amount of course content to be delivered in the time available does not leave much leeway to expand beyond the dry facts.

      Although the exams have changed over time, the continual issue of timetable restrictions making it a struggle to teach even the bare bones of the curriculum tends to lead to dull teaching (it was the same in my day, the same when my daughter went to school, and I doubt it will change any time soon).
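
      A minimal sketch of the Unit 7 / Bobby Tables point above (my own illustration, with a made-up table name and sample input): paste user input into the SQL string and it can rewrite the query; pass it as a parameter and it is stored as plain data.

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("CREATE TABLE students (name TEXT)")

      user_input = "Robert'); DROP TABLE students;--"   # little Bobby Tables

      # Dangerous: string-pasting lets the input become part of the SQL itself.
      # (sqlite3 happens to refuse multi-statement strings, but many drivers don't.)
      unsafe = f"INSERT INTO students (name) VALUES ('{user_input}')"
      print("would run:", unsafe)

      # Safe: a parameterised query treats the input purely as data.
      conn.execute("INSERT INTO students (name) VALUES (?)", (user_input,))
      print(conn.execute("SELECT name FROM students").fetchall())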

      1. Caver_Dave Silver badge

        Re: Doing the impossible

        All of those interesting things are what teachers are trying to get STEM Ambassadors in to demonstrate, i.e. to do their jobs for them!

        It's why I've cut down to only attending STEM careers events.

    3. Martin Summers

      Re: Doing the impossible

      My HND in computer science consisted of project management and 'learning' Java, along with Visual Basic. None of those things floated my boat and I never understood Java or got the hang of VB. Two years of my life wasted. I didn't realise at the time I really should have just gone and got a job. University certainly isn't for everyone, and it certainly didn't teach me anything I needed for my current career in IT, which I'm also regretting taking as a path!

    4. BanburyBill

      Re: Doing the impossible

      Ah, but do you remember the old ICT A-level syllabus? Problem solved by doing all the documents in a mind-numbing once-through waterfall - stopping at implementation - and where using a programming language (apart from Excel) could result in you losing marks. At least you actually get to touch a computer these days. And we are talking, er, A-Level Computer Science here. Not A-Level Hyping Gen AI.

      This sounds suspiciously like a purveyor of courses - is the BCS still largely a certification business? - demanding the syllabus include New Shiny! That coincidentally fits their offerings.

      1. Anonymous Coward
        Anonymous Coward

        Re: Doing the impossible

        Ahh. Yes. A Level CS.

        Failed mine, despite knowing enough to actually teach some of the lessons. Still, managed to go on to develop a rather nice career in IT. Also went on to complete a few OU courses and pass those exams with no problem. I liked the old OU courses (back in the 90s, before they re-wrote them to be object-oriented).

  8. Howard Sway Silver badge

    taught to get the best out of tools like ChatGPT so they can succeed in life and modern careers

    Computing GCSE : Here is how to use ChatGPT to do everything for you

    All other GCSEs : 0/10 : Student used ChatGPT instead of doing the work themselves

    I suppose we can look forward to the first set of interview candidates in about 5 years time, whose answer to every question is "ask ChatGPT". That is if they're not sending along their "avatars" to do the interview instead of turning up themselves. Personally I would expect a bit more critical rigour from a professional computing organisation about the usefulness of these tools and whether it is wise to ditch fundamental knowledge in favour of becoming reliant on them. But fear of looking out of date is leading too many people to jump on this bandwagon without thinking things through properly.

    1. Scotech

      Re: taught to get the best out of tools like ChatGPT so they can succeed in life and modern careers

      To be fair, I think BCS have some of the best materials out there for teaching the fundamentals of AI, with a strong focus on the importance of understanding how it works under the hood, how to determine if it's an effective approach to solving a particular problem (and by extension, what kinds of problems it's a bad or unnecessary solution for) and the ethical considerations involved. They approach the subject with nuance, and don't shy away from the pitfalls. They also don't make the mistake of conflating AI and LLMs, and recognise that the field is broader than just a limited set of GPT models. I recommend checking them out: https://www.bcs.org/qualifications-and-certifications/certifications-for-professionals/artificial-intelligence-ai-certifications/

      1. Conor Stewart

        Re: taught to get the best out of tools like ChatGPT so they can succeed in life and modern careers

        They might well have good learning material on how AI works, but that isn't the point. They want a separate course that is just about how to use computers, not computer science, and there is a very good chance that it wouldn't go into how AI works, just how to use things like ChatGPT. So at best it would be surface-level knowledge of how to use particular models.

  9. heyrick Silver badge

    which it says is "theoretical and demanding."

    God help us. Yes, computer science is theoretical and demanding. It's programming. It's understanding protocols (and the RFCs that document them). It's knowing how the hardware works so you can get more performance from a floppy disc if the format is offset and interleaved. It's knowing not only that bubble sort is slow, but why, and how to make sorting faster. And so on.
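
    (To make the bubble sort point concrete, a throwaway Python sketch of my own, not anything from a syllabus: each pass only moves an element one place, so the work grows roughly with the square of the list length, which is why library sorts use n log n algorithms instead.)

    def bubble_sort(items):
        """Sort a list the slow way: repeatedly swap adjacent out-of-order pairs."""
        items = list(items)
        for unsorted_end in range(len(items) - 1, 0, -1):
            swapped = False
            for i in range(unsorted_end):
                if items[i] > items[i + 1]:
                    items[i], items[i + 1] = items[i + 1], items[i]
                    swapped = True
            if not swapped:        # no swaps on a full pass: already sorted
                break
        return items

    print(bubble_sort([5, 3, 8, 1, 2]))   # [1, 2, 3, 5, 8]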

    I think teaching stuff like Google-fu, ChatGPT, and spotting social media lies ought to be something related-but-different, like "Digital ethics" or something.

    1. wub
      Happy

      Re: which it says is "theoretical and demanding."

      Somewhere along my path, it finally struck me that good programming started with a proper understanding of the problem. Then you decide how best to solve it, given the tools available. Then you design and build your program, to teach the machine how to do it.

      Or - if you don't fully understand the problem, you aren't as likely to solve it as you are to turn it into a different problem. It may look like it works, at first, but over time the failures will force another attempt to solve it.

      I'm sure none of that matters, now that we have tools like ChatGPT to do all that for us, so we don't have to break a sweat trying to figure out how best to get the problem solved.

      /s

  10. Calimero

    Lost the plot... teach them math

    How ridiculous it sounds.... Computer science started from math. Teach the students discrete math and data structures with a pencil and paper - like in the book by Horowitz and Sahni! Stop treating the keyboard as an extension of the perpetual umbilical cord. Thinking should happen in the brain, not in the fingertips. I never start at the keyboard; I start with a piece of paper, write down pseudocode, and "compile by hand". Yeah, I must be a mastodon, but one of the first things my boss told me was: "do not think in a programming language" - I was fresh from math college. The best professional advice ever!

    1. Caver_Dave Silver badge

      Re: Lost the plot... teach them math

      I write a lot of specifications - API and HLR are always written to be OS and programming language agnostic.

      It seems to take up to 3 months (sometimes longer) for new employees to get the hang of this and not put Java/C/Python/Go/Rust function names and structures in these specifications.

      I despair with people who think that barely reverse engineered pseudocode counts as an LLR - we have a revolving door of people in a certain sub-continent who cannot understand that they need to understand the intent in order to write the intent as the LLR. (Yes, reverse engineering - I know!!! But we then independently check the reverse engineered LLR against the traditionally produced API and HLR - double direction traceability.)

      1. Doctor Syntax Silver badge

        Re: Lost the plot... teach them math

        I upvoted when I first read this and I've let it stand but on reflection I think you may be doing some of your new employees a disservice.

        People understand abstract concepts in different ways, especially where synaesthesia is involved. Remember Feynman writing that he saw mathematical expressions colour-coded or visualised some mathematical operation as a mechanical analogue?

        I'm not sure how you visualise what you're dealing with but obviously not as pseudocode. However something that one person will mentally see as a diagram someone else may see as structured text and someone else as a mechanism; what seems unlikely to me is that anyone would grasp something as a complete abstraction.

        What shape, in your mental model, is a year, and where's September? I doubt your answer would be the same as mine.

    2. Bluck Mutter

      Re: Lost the plot... teach them math

      The other thing is to prove the fundamental thing the application will do in a prototype, before layering anything else on top.

      Writing a music player... get the basic audio player working at the command line before writing the GUI/database.

      A classic example of this in a real-world enterprise situation was a MAJOR bank that spent 2 years rewriting a mainframe app in Java (with an Oracle backend) only to find that it wouldn't scale, and so the application was abandoned.

      Not hard to mock up some production-like transactions (after doing an initial database schema/data migration from DB2 to Oracle... which was my role) and see if the Java/Oracle interface could sustain the required throughput.

      I was called in after they had been doing the Java development for 1.5 years and now needed a production copy of the database for testing. This should have been the first thing done: prove the migration can be done, prove you can do it in the downtime available, and provide a real-world test bed for developers to use and to validate scalability, before wasting 2 years of work.

      Not related to the question at hand ... or is it? There is more to computing than pure technical skills... basic common sense is also relevant, and maybe that is in increasingly short supply.

      Do they have a course in common sense at school?

      Bluck

      1. Doctor Syntax Silver badge

        Re: Lost the plot... teach them math

        On the whole I agree but:

        "a production copy of the database for testing. This should have been the first thing done....prove the migration can be done, prove you can do it in the downtime available and provide a real world test bed for developers to use"

        Letting the developers access a copy of the production database with customers' data in it? I hope you had a few steps in between such as changing all customer details to dummy data.

    3. ChoHag Silver badge

      Re: Lost the plot... teach them math

      > ridiculous

      There it is! The fourth 'r' from the thread above.

  11. ColinPa Silver badge

    coding is only a small part of computer science

    Many people in IT never program a computer. They are system programmers or administrators. I did a presentation recently saying "the functional requirements take 1% of the time; the non-functional requirements take 99% of the time". This covered topics like: you need to encrypt fields in a database, you need to audit who references this field, you need MFA from the mobile phone to the server, and you need to be able to debug programs and systems. How do you know if your system has a problem (before you get a Twitter storm)? What happens if you have 1000 transactions a second? Having a one-hour lesson on topics like these would give people a more rounded view of IT in the real world.

  12. Doctor Syntax Silver badge

    a more "inclusive" digital literacy age-16 qualification

    Reading that, my instant thought was "Meeja Studies", which had the reputation for being the subject for pupils who were not going to get a pass in anything else back when my children were at school (not that they ever needed it).

    1. Julian Bradfield

      There used to be (maybe still is) a "General Studies" A-level. No reputable school allowed pupils to take it. I told my school I wanted to see what happened if somebody walked into the exam and took it without preparation, and they agreed, even though it meant me coming in during half-term. I confess I cracked the night before and looked at a past paper to see what the format of the exam was, but that was all. Yes, I got an A. Amused a lot of people:)

  13. bofh1961

    Computers? In schools?

    Would it even be legal to teach kids about big tech slurping all their data?

  14. gillburt

    It’s not difficult

    In secondary school, the application of software such as word processors, spreadsheets, video editing and image manipulation should sit within other subjects, such as English, maths, art, geography, history and so on. Computer science should be about programming, software design, apps, databases, hardware design and control - all very closely, but not quite, aligned with design technology. I would expect a small class size. Having been, albeit briefly, a CS teacher, those with the best aptitude are massively held back by the curriculum, and those with the least aptitude have no clue what they are doing or motivation to develop, as it's just not for them.

    1. Doctor Syntax Silver badge

      Re: It’s not difficult

      Aptitude. That's a word that I think too many teachers in my day didn't understand. I can think of one or two who presumably were very good at their subjects but had no inkling of what their pupils might find difficult and hence no idea about how to change pace so that those bits got more time and explanation whilst the glaringly obvious things got done to death.

      1. Scotech

        Re: It’s not difficult

        No idea what it's like now, but I can't see that having changed a bit. Especially when schools are often buying teaching materials alongside the qualifications they're offering - all of them do this but AQA in particular are egregious for selling bundled textbooks, course materials and sample lesson plans - that I'm sure a time-pressed teacher would never just use verbatim in a class whose average ability is off-base. Teachers who have record class sizes, no time during work hours to plan lessons, and who've been beaten down and demoralised by the stress of the job and the relative lack of support available, can't be expected to tailor every lesson to reach every student in every class they teach.

        Ironically, this is a place where greater use of technology could actually be useful. I remember seeing an excellent interview a few years back looking at how Khan Academy tried to do just this - by digitising the learning materials, and adopting continuous assessment, it's possible to track course progress more closely and accurately, and flex the pace and complexity of learning to fit student aptitude. This should (in theory at least, I'm not aware of any hard numbers on this) result in better learning outcomes, reduce student boredom and frustration, and help to ensure grades are more closely aligned to measured ability, rather than being artificially limited by 'banding' strategies like foundation / intermediate / higher exam papers.

  15. Doctor Syntax Silver badge

    ISTM that successful use of a computer at any level is to have a mental model of what's happening at that level (maybe a level or two below would be useful) and then issue instructions at that level. That applies equally to clicking on buttons on an app or designing the next sub-nanometre scale multi-pipelined processor.

    It follows that the fundamental skill to teach is forming those models, broadening them and realising when there's another level beneath that might be useful. A good many of the skills gaps which other comments have raised seem to stem from models which are too narrow, too shallow or both.

    1. doublelayer Silver badge

      I think this is a good summary. I've seen people who try to fix this by insisting on covering something complex. They start teaching programming by going to the individual instructions at the CPU. This doesn't work for two independent reasons.

      First, people want to actually accomplish things, and there are a lot of instructions. Pretty much every language abstracts some of this, like finding where the variable you referred to is and bringing it into registers so you can work on it, but why not make people who have never seen this before do that manually. In other courses, we can teach them chemistry by manufacturing all of their reagents from precursors they mined themselves. Going too deep doesn't help, even if you started at that level.

      Second, this makes it a lot harder to find someone who knows how to teach this properly. An introductory course that tries to teach that many levels needs a teacher skilled in all of them and enough time to handle them all separately. That's possible, but it generally doesn't happen often. Trying to do it when you can't is like teaching someone to drive too quickly; they might look like they know what they're doing, but it's likely to go badly later unless they train on their own.

  16. IGotOut Silver badge

    Are BCS now just click bait?

    "Pupils should acquire digital skills and be taught to get the best out of tools like ChatGPT"

    So in two or three years everything they learned will be useless and irrelevant.

    I'm sure 3 years ago they would have said

    "Pupils should acquire digital skills and be taught to get the best out of tools like the Metaverse and Blockchain"

    And having two kids sat through the shit system that is UK education (ok not as bad as US), having to learn yet more things, taught in a way that bores the shit out of them, while actively destroying any creativity and individuality, this is more crap they don't need.

  17. neoaliphant

    They ought to teach word processing, email etiquette and dangers, backing up, malware, scammers, online dangers, searching, cloud, privacy.....up to age 14, then anything for GCSE ought to be vocational....

    I've got clients with 16-year-olds who can code a website they have no interest in, but give out their info to scammers and meet up with strangers they met on TikTok....

    But teaching AI, NFTs, crypto.....wtf!

    The only computing teaching I've ever used is how to touch type, what cc was, and Ctrl X, C, V

  18. Dostoevsky Bronze badge

    Computers are hard.

    The people dropping out were only interested because of money or video games anyway—soooo sad to see them go.

  19. C. P. Cosgrove

    Something is definitely wrong !

    A couple of weeks ago I had a young lad - early 20's ? - on my doorstep signing people up for BT/EE broadband ( a bloody awful job !). Since I was thinking hard about making this switch I invited him in and during the conversation he started rabbiting on about download speeds and I asked what are we talking here - Megabytes or Megabits ? To which he replied in all seriousness 'Why, they are the same aren't they ?'.
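
    (For anyone else on the doorstep: a byte is eight bits, so an advertised line speed in megabits per second is roughly an eighth of that in megabytes per second, before any protocol overhead. A one-line sanity check, using a made-up 67 Mbit/s package as the example:)

    advertised_mbit_per_s = 67                 # hypothetical headline figure
    approx_mbyte_per_s = advertised_mbit_per_s / 8
    print(approx_mbyte_per_s)                  # ~8.4 MB/s, ignoring overhead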

    And I still signed up for BT but then I have somewhat more faith in their engineers than in their sales people.

    Chris Cosgrove

    1. Grunchy Silver badge

      Re: Something is definitely wrong !

      “I asked what are we talking here - Megabytes or Megabits ? To which he replied in all seriousness 'Why, they are the same aren't they ?'”

      This was me, trying to figure out “baud”.

      Because 300 baud is “approximately” 30 cps (characters per second), yet each character is 8 bits (or maybe 7 bits, or maybe 5 bits).

      It’s due to this peculiar thing they called “8b/10b encoding,” in which telecommunication requires 10 bits to transmit 8 bits of data. Well, there might be other overhead such as parity bits, stop bits, etc. But anyway let’s suppose 1 is distinguished by “a change,” and 0 by “no change.” How many consecutive zeroes can you telecommunicate before you lose track of the number, or suspect the line has gone dead? And yet computer science LOVES to transmit megs and megs of sequential zeroes. So if your telecommunication can, at best, only (reliably) distinguish between 1 zero and 2 zeroes, then the only way to telecommunicate arbitrary digital data is with this intermediary encoding such as “8b/10b”. This is what the UART chip does.

      (I had a similar issue figuring out GCR encoding on a Commodore 1541 disk surface. I never did figure out tape drive storage, which is supposedly like the modem, except the Commodore never actually had a UART? As I said: I have only a “partial understanding” of what “baud” means.)
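
      (A quick sanity check of the 300-baud-is-roughly-30-cps figure above, assuming the plain 8N1 framing a UART actually uses: one start bit, eight data bits and one stop bit, so ten bits on the wire per character. 8b/10b proper is a separate line code used on links like SATA and PCIe; the arithmetic below is only the start/stop-bit version.)

      baud = 300                         # symbols per second; on a simple UART one symbol carries one bit
      bits_per_character = 1 + 8 + 1     # start bit + 8 data bits + 1 stop bit (8N1)
      print(baud / bits_per_character)   # 30.0 characters per second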

    2. Excused Boots Bronze badge
      Facepalm

      Re: Something is definitely wrong !

      "And I still signed up for BT but then I have somewhat more faith in their engineers than in their sales people.”

      Oh dear!

  20. Bebu sa Ware
    Coat

    No waiting required

    "Can't wait until that severely fucks things up for employers." It already is.

    A whole generation whose only digital literacy is their smartphone and its garbage tip of installed applications is completely flummoxed by text-based business systems written decades ago that require navigating between various screens using function keys or CUA key combinations.

    I doubt most of the poor diddums could actually punch the Bundy, as the concept is far too difficult to grasp (and anyway far too violent).

  21. Joe W Silver badge

    F**** Office

    CS is not Office.

    CS is lots of maths. Unless you enjoy that, CS is not for you. CS is not programming.

    Programming is something you can teach yourself. Most of us did. It was expected that you can program when you are studying physics (even decades ago). Not everybody needs to know how to program. I can fix older cars, at least some issues, but it's not a necessary skill.

    Do we need classes teaching people how to use office? I don't think so. Not a whole year, at least, or a semester, three days should be enough. They are simple enough to use for most things you need in an office job. And I mean both Microsoft and Libre / Free / OpenOffice.

    Teaching basic concepts makes sense, but you need some fun exercises, and show how things are connected.

    1. werdsmith Silver badge

      Re: F**** Office

      There is no Office in GCSE CS. There is barely any in GCSE ICT so there is no need for you to waste a pair of Tenas about it.

  22. cageordie

    Teach them typing

    The most useful thing I could have known 40 years ago was typing. I type OK, mostly, now. But I am not as fast as my wife, because she was actually taught. It would have saved me so much time.

    As to the rest: this isn't anything to do with realtime software engineering, my field. So I wish them luck. Enshittification is making the world a sad place right now.

  23. Anonymous Coward
    Anonymous Coward

    "94 percent of girls and 79 percent of boys in England drop computing as soon as they can"

    That's because as a society we have spent decades promoting science as nerdy and uncool. Because our leaders and media are computer illiterate and muddled by it. It's destructive for the nation. We cannot compete in manually intensive work and pay well, so we bloody well need to be best in technology, but through this nonsense we have thrown that away. China did the opposite. Real national wealth comes from making stuff more efficiently or from natural resources.

  24. Locomotion69 Bronze badge
    Facepalm

    I can understand why it is dropped.

    Not being familiar with the program, I googled it and found I could take a sample test - so I tried, and got some insight into the content and the awareness that needs to be raised here.

    Why would you teach binary/hex representations to children??? That has zero value in "normal" life.

    The program needs to change. One should teach kids what is what in the digital world, how to use it, and - very important - how NOT to use it.

    1. werdsmith Silver badge

      For the same reason we teach algebra to children.

      Or Shakespeare.

      Or about shifting cultivators.

      It's not about real life. It's education.

      1. doublelayer Silver badge

        It depends what you want students to gain from their studies. If you only have a little time to teach someone to program, you want to give them enough information that they can decide whether they're interested in learning more and ideally enough information that they can do some useful stuff. If you get to have a programming course every year, you can cover lots of useful things, but not everything is or is considered important enough to get that much of a student's limited time. If it ends up being a half-year or even a single-year course, some things have to be postponed to the next ones. Various bases generally get covered briefly in mathematics. Covering it again in introductory programming isn't very useful if you spend a lot of time on it, because almost all introductory programs and a large number of more complex ones don't require the programmer to directly use either of those. It will be covered later when they're using it directly and you won't get a computer science degree without knowing it, but that doesn't mean it needs to be done first.

        1. werdsmith Silver badge

          Students taking GCSE CS are not learning to be programmers any more than students taking GCSE English Lit are learning to be poets.

          They are getting a grounding in the subject in order to move to the next level or not. Which may be A Level or undergrad.

  25. Torben Mogensen

    Computational thinking

    I think the most important thing to teach school kids related to IT is computational thinking: thinking about how to solve problems systematically. First, understand the problem (which in many cases can be achieved by playing around with it), then decide how to know when you have solved it, then break the solution process into smaller steps, then implement these steps, and finally check if your proposed solution lives up to the criteria you decided before you started. Repeat as necessary. This is not far from Polya's "How to Solve It" method, though that is mostly targeted at maths.

    Note that you don't need a computer to do or teach computational thinking -- you can perform the solution by hand and describe it in text or using diagrams (such as flow charts). For example, you can give the kids a shelf of books and ask them to sort the books alphabetically by author given certain constraints such as not taking out more than two books at a time. Another problem is the classic Tower of Hanoi puzzle, which you can use stacking cups (found in every toy store) for.

    Only when the kids are familiar with solving such problems by manipulating physical objects by hand do you point them to a computer and ask them to implement the methods using abstract values such as numbers or strings.
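
    Once the kids have done it with the cups, the same breakdown can be written out almost verbatim. A minimal Python sketch of the Tower of Hanoi steps (the peg names are arbitrary, my own illustration):

    def hanoi(n, source, spare, target):
        """Print the moves that shift n discs from the source peg to the target peg."""
        if n == 0:
            return
        hanoi(n - 1, source, target, spare)        # park the smaller pile on the spare peg
        print(f"move disc {n} from {source} to {target}")
        hanoi(n - 1, spare, source, target)        # bring the smaller pile back on top

    hanoi(3, "A", "B", "C")   # prints the 7 (2**3 - 1) moves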

  26. trevorde Silver badge

    Knew everything except how to program

    Worked with a CompSci intern in Cambridge. He had aced his course and came with huge expectations. Turns out his course essentially covered being able to build a computer from scratch - literally ie if he was ever stranded on a desert island, he'd be able to build a fully functioning computer starting with nothing but the beach sand. Needless to say, he couldn't even do a simple 'Hello, world!' program.

    1. werdsmith Silver badge

      Re: Knew everything except how to program

      I think this is the ultimate bullshit never happened comment ever on The Register.

      Not even the smallest attempt to make it believable.

      Do you realise that there will be actual Comp Sci grads reading your comment?

      1. The Oncoming Scorn Silver badge
        Facepalm

        Re: Knew everything except how to program

        I can see Trev's POV. We have a guy here, new blood for the plant I'm at, very skilled in all sorts of engineering, electronics and IT stuff he was taught at university, but the one thing his course didn't teach him was any basics of networking infrastructure (we gave him a crash course). Two years down the line he's laying out the plans for the refurbishment of half the facility's floors, including network cabling and new data ports.

        1. werdsmith Silver badge

          Re: Knew everything except how to program

          There will have been a module he could have chosen to do networks. He probably went for something more interesting. Like functional programming using Haskell.

  27. BobbyTables

    Please do not return to the noughties.

    I was doing my GCSEs in the late ‘00s, a couple of years before the computer science GCSE was introduced. They were a complete joke and I really hope they don’t return to them.

    For the coursework, we had to produce a children’s book (okay?), in the natural tool for the job, PowerPoint. When I immediately questioned this (we even had access to Publisher, which I had used as far back as Year 6), I was simply told to shut up and do it in PowerPoint, which I did. My coursework (60% of the grade) was on track for an E and I had no idea how to improve it, as the tasks were so fundamentally broken. I just wanted to get out of school, so I asked my teacher “If I get every question right in the exam, will I get a C?”. He said “Yes, but don’t count on it.”

    The exam was all questions about dated hardware like floppy disks (I was already years into building my own PCs at this stage, so the hardware side of stuff was child's play). I did it in half an hour - half the allotted time - and got my C in the end.

    The IT GCSE was actually far less in-depth than the IT skills I had done in KS2. (We were doing Excel formulas back then, almost programming!) I still use those skills; they're worth teaching in order to build familiarity with general computer tasks. However, it's not enough to really understand what is going on, just as primary school maths isn't considered enough to get you into any job.

    I accept the world is different now. A friend worked in IT at a school until a couple of years ago and he said as time went on, more and more kids lacked basic knowledge of computers and had to be taught things like how to use a mouse. They had all just had touchscreens only their entire lives and it wasn’t until secondary school that they sat in front of a desktop pc.

    Not that it is an insurmountable problem, the first computers I used also had no mice, because they were BBC Micros.

    I think the answer is to teach the basic user skills early, but by GCSE you should really be learning how these things actually work and how to write your own software.

  28. Grunchy Silver badge

    I remember “self-teaching” C++ and I got up to the object destructor and that’s when I threw in the towel.

    I had no problem with anything in BASIC, Fortran, machine language. C itself I considered as much more approachable than a pure assembler.

    Funny thing, I had an opportunity to participate in a “full stack” class a few years back for C#, we breezed right through objects, constructors, destructors… and that was it!

    It wasn’t such a big deal, after all.

    1. Apocalypso - a cheery end to the world
      Joke

      > I remember “self-teaching” C++ and I got up to the object destructor and that’s when I threw in the towel.

      I remember reading the Stroustrup book when it first came out and thinking "this is overly complicated".

      How utterly wrong I was: over the following 30-odd years, C++ has increased in complexity to a level I wouldn't have been able to even dream of back then!

  29. Argyll Coder

    The need for nerds

    I feel old, perhaps because I am. I started programming in the late 1960s while still at school. Algol 60 in those days. I took a first degree in mathematics but didn't start a proper career in programming until the late 1970s. It was programming all the way from then on, at least until it became 'software engineering'. Languages came and went, and achieving at least competence, if not more, was expected. I think I may have lived through the golden age of coding in the 1980s and 1990s, and (mostly) left IT in order to earn more in another profession, albeit one with a strong component of technology. I'm sorry to see the article describe traditional software skills as too abstract and demanding. For some, perhaps. However, I recognise the need for digital literacy and I support the BCS in its assessment. I wonder, though, will most or all of software development be given over to AI? The need for nerds will continue – someone has to do it, or at least have the ability to check what AI is up to.

  30. theeggmaster

    Take iPads out of Primary Schools

    A huge number of pupils are arriving at Secondary School having only used iPads and other touchscreen devices at primary school. Year 7 is teaching them how to use a keyboard and mouse because they don't get this basic level of use in primary. Take iPads out as they are a huge cost versus the actual academic achievement/outcome.

  31. nautica Silver badge
    Meh

    From a preeminent light of the 'computer science' / programming world, and the teaching of it...

    "Besides a mathematical inclination, an exceptionally good mastery of one's native tongue is the most vital asset of a competent programmer...

    "...we must be very careful when we give advice to younger people; sometimes they follow it...

    "...competent programming, as I view it as an intellectual possibility, will be too difficult for "the average programmer" — you must not fall into the trap of rejecting a surgical technique because it is beyond the capabilities of the barber in his shop around the corner.

    "...the topic became...prematurely known as 'computer science' – which, actually, is like referring to surgery as 'knife science'..."

    "Programming is one of the most difficult branches of applied mathematics; the poorer mathematicians had better remain pure mathematicians.

    "It is not the task of the University to offer what society asks for, but to give what society needs.

    "The required techniques of effective reasoning are pretty formal, but as long as programming is done by people that don't master them, the software crisis will remain with us and will be considered an incurable disease. And you know what incurable diseases do: they invite the quacks and charlatans in, who in this case take the form of Software Engineering gurus.

    --Edsger W. Dijkstra

  32. Ken Moorhouse Silver badge

    My first lessons on programming...

    My first lessons on programming were at South Bank Poly. I remember one incident very vividly and it has stood me in good stead ever since. When we'd written our programs we'd submit them as two Hollerith decks: 1. the program and 2. the data (showing my age).

    So we all got to eventually get clean results (a week or so later @ one run a day, if you were lucky), then the lecturer asked us to submit some data he had prepared, and we were aghast.

    The program he had asked us to submit implemented the customary quadratic formula, (-b +- sqrt(b^2 - 4ac)) / 2a

    We had all used sensible values for a, b and c. But he wanted us to submit a=0 (for example) (I bet the computer operators all loved us).

    We all know what happens when the tyres are not properly kicked prior to putting code into production (ask Microsoft about buffer overflows, for example). But far too many coders are guilty of coding to run cleanly with no thought to rogue input at all. Nowadays coders use try/except constructs to bat away erroneous input, but this can have unintended consequences if not properly thought out, and I've seen far too many examples of swallowed exceptions to think that is not a commonplace 'solution' for anything that can't be immediately solved.
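
    By way of illustration (my own sketch, not the Poly's original), the same lesson in Python: handle the rogue inputs explicitly instead of letting the formula divide by zero or take the square root of a negative number.

    import math

    def solve_quadratic(a, b, c):
        """Real roots of ax^2 + bx + c = 0, with the awkward inputs handled explicitly."""
        if a == 0:
            if b == 0:
                raise ValueError("a and b are both zero: nothing to solve")
            return [-c / b]                      # degenerate (linear) case, e.g. a=0
        disc = b * b - 4 * a * c
        if disc < 0:
            return []                            # no real roots
        root = math.sqrt(disc)
        return [(-b + root) / (2 * a), (-b - root) / (2 * a)]

    print(solve_quadratic(1, -3, 2))   # [2.0, 1.0]
    print(solve_quadratic(0, 2, -4))   # [2.0] -- the lecturer's a=0 deck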

    When I worked for A Big Organisation my boss would test his programs on the test system with his fingers crossed. If it 'held up' for a 'decent' length of time then 'it was fine'. If it crashed he would restart the system and assume it was a minor glitch. Only if it did that a couple of times would he investigate. Me? If the system fell over once I would immediately get a core dump and go through it until I found the culprit.

    This is another thing: Log files imho are completely useless these days. I worked with a company that had a Silverlight application installed. My $Deity, please spare me from users reciting Silverlight error codes to me. Even the designer didn't want to know what they were.

    Anyone who touched my programs would know of my love of the circular buffer for anything my code eyeballed, in the order it was to be dealt with. Conversations about race conditions were easily settled by a look at the core dump (this was a real-time messaging-driven system btw).

    And that's another thing: Do students still learn about things like circular buffers?
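
    For anyone who hasn't met one: a circular (ring) buffer keeps the last N items in a fixed block of memory by letting the write position wrap around, which is exactly why it's so convenient to eyeball in a core dump. A toy Python version (a real-time system would do this lock-free in C, and the class name here is mine):

    class RingBuffer:
        """Fixed-size buffer that silently overwrites the oldest entry when full."""
        def __init__(self, size):
            self.slots = [None] * size
            self.write_pos = 0
            self.count = 0

        def push(self, item):
            self.slots[self.write_pos] = item
            self.write_pos = (self.write_pos + 1) % len(self.slots)   # wrap around
            self.count = min(self.count + 1, len(self.slots))

        def snapshot(self):
            """Contents in arrival order, oldest first."""
            start = (self.write_pos - self.count) % len(self.slots)
            return [self.slots[(start + i) % len(self.slots)] for i in range(self.count)]

    rb = RingBuffer(3)
    for msg in ["m1", "m2", "m3", "m4"]:
        rb.push(msg)
    print(rb.snapshot())   # ['m2', 'm3', 'm4'] -- the oldest message has been overwritten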

  33. Anonymous Coward
    Anonymous Coward

    Puzzled....

    It is probably hard to understand, but there is actually no single way of "understanding IT". Some examples:

    (1) Assembler: Procedural: tell the CPU what to do, step by step...

    (2) C: Procedural: tell the compiler what to do, step by step...

    (3) Prolog: Declarative: tell the compiler about the logic

    (4) SQL: Declarative: tell the database what to do with the data

    (5) Smalltalk: Object Oriented: tell the interpreter about objects, methods and messages

    And, of course, there's data:

    (6) C structs

    (7) ISAM

    (8) SQL (again)

    (9) SGML

    (10) HTML

    (11) XML

    And, of course, there are algorithms:

    (12) See Donald E Knuth (and many others)

    Puzzled.....because ALL OF THE ABOVE and more would be needed for a rounded view of "understanding IT"....

  34. Tron Silver badge

    Amusing.

    At school in the 80s, we knew more about computing than our teachers (and we had more powerful machines at home). Today kids have better ChatGPT skills than their teachers, as is reflected in their homework.

    Yes, there needs to be a basic computer literacy course for all kids from primary school upwards, but there is no need for it to be examinable. Just teach them the skills. A more intense course for those wanting to program should retain the exam.

    1. werdsmith Silver badge

      Re: Amusing.

      At school in the 80s, we knew more about computing than our teachers (and we had more powerful machines at home)

      At school in the 80s I got my start in computing because of the science teacher who brought in his personal computer and let us loose on it at lunchtimes.

  35. Ken Moorhouse Silver badge

    The books that inspired me to pursue I.T.

    (1) Mullard's Electronic Counting book. The frustration with that book though was that it omitted "debounce" circuitry essential to make things count without skipping.

    (2) How A Computer Works by W.L.B. Nixon, publisher: University of London, Institute of Computer Science

    https://www.computinghistory.org.uk/cgi/archive.pl?type=Books&author=W.L.B.%20Nixon

    Typewritten and mass-produced using a Gestetner machine (eh? what's a gestetner machine? Oh Lord, how we suffered without laser printers. Now listen, you've got a browser, go google it). Seriously though, extremely well written, with exactly the right depth to it.

    Bought from the bargain basement of Pooles, Charing Cross Road. My 2nd edition copy stays with me until they wheel me out in a wooden box.

    (3) Texas Instruments orange hardback guide to the SN7400 TTL range of IC's.

    When I was at South Bank Poly, no matter how good your labwork was, you only got 9/10 max. Even going down to the British Library to seek out papers by Turner on his method of measuring inductance got me no better. When we did a lab on TTL though I went the extra five Irish miles and I got a 10! (No that is not a factorial sign, please be realistic).

    (4) Motorola M6800 Microprocessor manuals. IIRC one of the breakthroughs Motorola had with the M6800 was their PIA and ACIA peripheral interface chips which got rid of the need for any Input/Output instructions, and you could have as many of these chips as could be addressed by the bus.

    (5) Software Tools/Software Tools in Pascal by Kernighan and Plauger. A few here, I think, have alluded to these.

    (6) The course book we used for digital design was Douglas Lewin's Logical Design of Switching Circuits (I still have a 2nd Edition copy, with dust jacket!). One of the interesting projects outlined in this book is the semi-automated method of optimising Karnaugh maps devised by McCluskey.

    I may add to this list.

    Maybe some of us commentards should have a discussion about providing some kind of legacy resource. In the coming years this kind of knowledge will be lost forever. Google is on a trajectory to forget material it thinks is no longer relevant. Who determines that, then? This is not just I.T. I've come across people who are amazed that Plate Tectonics is a recent advancement in science. It is, I feel, essential that the evolution of such technology is not lost. It highlights how dogmatic and arrogant experts were, and how inspirational it was for someone to gradually change everyone's minds until now, we can't think of anything different.

    1. Roo
      Windows

      Re: The books that inspired me to pursue I.T.

      I have several bookshelves groaning under the weight of documentation about old machines - a Digital Equipment Corporation RM03 Printset for example, but realistically there is very little that I think *needs* to be kept for our descendants. If I had to pick 3 of the items from my bookshelves, I'd go with:

      1) "The Art of Electronics" by Horowitz & Hill

      2) "The Transputer Databook 1992" - because it covers parallel programming theory very well - in addition to packing in a lot of info about some very well designed bits of silicon.

      3) "The Design and Implementation of the 4.4BSD Operating System" - still relevant today given the influence of BSD, explains why things are the way they are in UNIX land, and teaches the kids a few new-old tricks along the way.

      From that lot I reckon you've got enough material to tackle most problems you're likely to meet in the real world - and maybe create a few more interesting problems on the way. ;)

      1. werdsmith Silver badge

        Re: The books that inspired me to pursue I.T.

        The Art of Electronics by Horowitz and Hill. Absolutely brilliant. I have an early one and I also have the latest update with the gold cover.

        I also have been educated by a series of Babani books. Penfold, Parr and Bebbington.

        1. The Oncoming Scorn Silver badge
          Pint

          Re: The books that inspired me to pursue I.T.

          The Art of Electronics by Horowitz and Hill - Had it back in the 80's.

  36. Conor Stewart

    "The GCSE contained out-of-date content about networks and internet protocols that could be removed from the specification"

    What out of date content would this be? The Internet and other networks still work very similarly to how they did when they were invented, most of the content should still be relevant especially content taught at GCSE level.

    Or is this out of date just because most people don't need to know this? Instead they just use a library and it handles everything?

  37. nautica Silver badge
    Holmes

    BuzzFeed

    "Older People Are Sharing What Scares Them Most About Younger Generations", by Hannah Loewentheil, Updated Sat, November 30, 2024, 7:18 AM CST·7 min read

    ...

    2."They CANNOT use a computer. They can surf the web but cannot do anything useful. Many of my students are worse than my parents at doing simple things like attaching documents to emails or understanding the file path."

  38. MachDiamond Silver badge

    It does not pay a prophet to be too specific.

    There's a danger of trying to teach narrow software packages over a more generalized approach. Using ChatGPT today will look very different from what will be on the market when somebody around 16 is prepping for big exams. By the time they finish college, if they go to college, that's years into the future. I see knowing how to knock up a quick web page as a good thing, but not just using one program and never being guided around what's under the hood. It's also fun to mess with the code that gets output to see what happens.

    When I was working in aerospace, we had interns 3-4 sessions each year and many of them were learning something like FEA via one program but not how to evaluate whether they were getting valid results or not. They'd learn one CAD program and not an approach that would let them quickly learn most CAD programs in a short time. Same for CAM, PCB software, etc. It was disappointing to knock down something they'd put a lot of effort into by having to point out an error emanating from a shifted decimal point somewhere at the beginning of the work. I would have to give myself a real kicking if I was off on something by an order of magnitude and didn't see that straight off. I had a very good maths teacher in high school who taught us how to make quick estimates of what an answer should be in rough terms. He was older and worked with a slide rule, which forces some of that on you to make sure the decimal point didn't fall off and land in the carpet. It was useful in chemistry as well since mixing up a potion with too much of an ingredient can get really spicy and test how good the lab bench tops are. Since most of the time we didn't have a big jar of some reagent, errors in calculations would be obvious. On the other end of the scale, we didn't have scales that were accurate at very small quantities. Maybe it was cheating to assume that a lab assignment that was laid out before class would have what should be necessary plus a little and the equipment set out or unlocked would be what was needed. There were balances that were quite good at measuring small quantities, but those were locked up in the equipment locker unless needed.
