ALGOL 60 at 60: The greatest computer language you've never used and grandaddy of the programming family tree

2020 marks 60 years since ALGOL 60 laid the groundwork for a multitude of computer languages. The Register spoke to The National Museum of Computing's Peter Onion and Andrew Herbert to learn a bit more about the good old days of punch tapes. ALGOL 60 was the successor to ALGOL 58, which debuted in 1958. ALGOL 58 had …

  1. Anonymous Coward
    Anonymous Coward

    No love for CORAL 66?

    Back in the day, in order to get a military application running, I had to take the assembler output of a Coral 66 program and basically rewrite it completely in proper assembler, because the compiler wasn't merely not optimised, it seemed positively designed to waste as many CPU cycles as possible. It used the 9989 microprocessor, and completely ignored the 9989's register windowing system, instead creating boilerplate functions every single time.

    Kids today...I'm the one feeding the punched tape repeatedly through the reader to get the floating point libraries to work. Arguments over bracket syntax, tabs versus space and to semicolon or not are really a sign that nowadays there is very little to complain about.

    1. AndyMulhearn

      Re: No love for CORAL 66?

      Spookily my first programming experience was of Coral 66 running on a PDP 11/44 so way more congenial than yours. Those were the days.

      1. Persona

        Re: No love for CORAL 66?

        On my first day of employment as a graduate I was handed a Coral 66 manual and told to learn it. We used it to compile code for Intel 8080 processors. It was a truly horrible language so I transitioned us to using "C" on Zilog Z80's in ~1981.

      2. Anonymous Coward
        Thumb Up

        Re: No love for CORAL 66?

        Spookily my first programming experience was of Coral 66 running on a PDP 11/44 so way more congenial than yours. Those were the days.

        'COMMENT' My first job was Coral 66 on a PDP 11 too. I thought it was quite a good language - far better than the Fortran IV I had used at university. The only thing I really hated was that all the comments had to be done like this;

        1. Anonymous Coward
          Anonymous Coward

          Re: No love for CORAL 66?

          DEC's CORAL 66 was a very minimal language. I learned it in about two days - which was necessary as I had to teach a class about it the following week.

          No sooner had I more or less got a grip of CORAL 66 than I was sent off to Washington DC to learn Ada - which was big and complicated.

          Still, even today there is a great deal to be said for Ada if you want software to work reliably and consistently. I always though kindly of its designers every time I was on an airliner that took off or landed successfully.

    2. JacobZ

      Re: No love for CORAL 66?

      We used to call that the Pessimizing Compiler

    3. Graham Cobb Silver badge

      Re: No love for CORAL 66?

      My second professional programming job was in Coral 66 (actually, PO Coral). If I remember correctly, I had to write the code on a George III system (doing my editing on a teletype as the team only had one VDU and as the most junior I was never allowed to use it). It was then compiled a while later by a batch job and I had to walk to the computer centre after a while to collect my tape to load into the SystemX prototype I was working on (and often had to cajole the operators into loading a tape, which they had ignored for the last hour, so the job could finish and I could collect my tape to take to the lab).

      When I fixed the trivial bug I would find in my testing, I had to do the whole process again. About two iterations a day was fairly typical.

      (My first professional programming job was in APL - that was quite fun).

      1. ICL1900-G3

        Re: No love for CORAL 66?

        The great advantage of slow turnaround was more time in the pub.

    4. big_D Silver badge

      Re: No love for CORAL 66?

      DEC's FORTRAN compiler was much more efficient..

      I've told this here before, but we had a data centre full of VAXen and we were looking at possibly replacing some of them with a mainframe from a well known manufacturer. They actually delivered a test machine for us to try out, a whole room-full of big boxes!

      The salesman gave us a tape with a FORTRAN program on it, which would "run for about a week" on the mainframe and several weeks on our test VAX. We should "load it, compile it, run it and call me in about a week, when the mainframe finishes its run".

      He then left us to it. The ops put the FORTRAN tape through both machines, compiled the program and hit run simultaneously on both machines...

      The salesman had a message to call us back, by the time he had returned to his office. The VAX had finished in about a second, his mainframe was still chundering away.

      It turned out the FORTRAN compiler on the VAX was actually quite clever. It analysed the code: no input -> create a huge array -> fill array with random numbers -> no output. The compiler decided if there was no input and no output, there was no point running the bit in between, so it made an empty program stub. The mainframe compiler was not so clever...
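      A rough Python sketch of the trick that VAX compiler pulled (the function name and the toy "IR" are invented for illustration): a backwards liveness pass that drops any operation whose result never reaches an output.

```python
# Toy dead-code elimination: drop operations whose results never feed
# an output. Each op is (dest, sources); outputs lists names that must
# survive. Walking backwards and tracking liveness mirrors, crudely,
# what the VAX FORTRAN compiler did to the salesman's benchmark.

def eliminate_dead_code(ops, outputs):
    live = set(outputs)
    kept = []
    for dest, sources in reversed(ops):
        if dest in live:                 # result is needed downstream
            kept.append((dest, sources))
            live.update(sources)         # so its inputs are needed too
    return list(reversed(kept))

# The benchmark as a toy IR: fill a huge array, compute on it, emit nothing.
benchmark = [
    ("seed", []),
    ("huge_array", ["seed"]),
    ("checksum", ["huge_array"]),
]
print(eliminate_dead_code(benchmark, outputs=[]))   # -> [] : empty program
```

      With no input and no output, everything is dead and the "program" compiles down to nothing, which is exactly what the ops saw.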

      1. John Sturdy
        Thumb Up

        Re: No love for CORAL 66?

        I heard rumours when I was a student (early 80s) that there was an optimizing FORTRAN compiler that produced code that ran so fast it took two STOP statements to stop it! (Or maybe there was a bug in its implementation of STOP.)

        1. Colin Wilson 2

          Re: No love for CORAL 66?

          I heard that too. The second 'STOP' had a comment by it - "In case it skids!"

          1. oldcoder

            Re: No love for CORAL 66?

            No, that was an older IBM OS/360.

            Part of the assembly for the STOP was missing, so it took a second one to get it to work.

            The DEC compilers would recognize the second one and report it as an error: "code cannot be reached".

        2. oldcoder

          Re: No love for CORAL 66?

          No, that was an older compiler on IBM OS/360.

          The assembly for the actual exit was partially missing, so some people would put two STOP statements together to force it to exit.

      2. smudge

        Re: No love for CORAL 66?

        It turned out the FORTRAN compiler on the VAX was actually quite clever.

        Unfortunately, their CORAL-66 compiler thought it was clever too.

        The first professional job I ever did was to help track down a bug in a sonar simulation system, written in CORAL-66 using the MASCOT methodology (defence bods will remember that). It always eventually crashed with a stack overflow, though the amount of run-time before the crash varied considerably.

        Turned out that inside a double, nested FOR-loop was an IF statement of the form IF <x> AND <y> THEN... , where x and y were complex expressions.

        Of course, if x turns out to be false, you don't have to evaluate y. So the optimising compiler added a jump to the next thing, which happened to be outside of the nested loops. But it left the temporary result for x on the stack. So eventually there would be an overflow.

        The number of iterations of the loops and the value of x depended on the input data, but also on how much time the scheduler gave to each of the "concurrent" processes in its simulation of a real-time system. So even if you fed it the same input data, it would run for different times before crashing.

        I remember going to a DECUS meeting to tell everyone about this. But it turned out that we were the last to know :(
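        A toy Python stack machine makes the bug easy to see (the function and names are invented for illustration): the short-circuit jump returns without popping the temporary for x, so the stack grows on every false iteration.

```python
# Simulates the CORAL-66 optimiser's bug: for "IF x AND y", the jump
# taken when x is false left x's temporary result on the stack.

def run_if_and(stack, x, y, buggy=True):
    stack.append(x)          # evaluate x, push the temporary
    if not x:                # short-circuit: no need to evaluate y
        if not buggy:
            stack.pop()      # a correct compiler pops the temporary first
        return False
    stack.pop()              # normal path consumes the temporary
    return y

stack = []
for _ in range(1000):        # stand-in for the nested FOR loops
    run_if_and(stack, x=False, y=True)
print(len(stack))            # 1000 leaked temporaries -> eventual overflow
```

        How fast the stack fills depends on how often x comes out false, which is why the run time before the crash varied with the input data and the scheduling.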

      3. Anonymous Coward
        Anonymous Coward

        Re: No love for CORAL 66?

        Reminds me of the first chance I had to try out a DEC Alpha machine - back in about 1991 or 1992.

        Of course it looked exactly like a VAX, and the terminal was the same. I typed in the command to run one of our standard benchmarks and hit Return. Nothing - I was just shown the dollar prompt again. I tried this a few times before the penny dropped: the Alpha was finishing the benchmark faster than the terminal could display the prompt.

        "You're going to need a bigger benchmark".

        1. matthewdjb

          Re: No love for CORAL 66?

          In memory processing. We loaded our entire billing system into an alpha with a whole 2GB of memory. The billing run on the vax would take two hours. On the alpha, two minutes. (and then it still took five hours to print out the bills).

    5. Primus Secundus Tertius

      Re: No love for CORAL 66?

      In my experience there were very many poor compilers for Coral 66. It seemed to me that the Defence Ministry (British) thought it was the job of the computer industry to develop compilers, whereas the computer industry thought the MOD should spend the money if it wanted people to use a cranky language that nobody else used. There was a front end, to analyse Coral source code, available from MOD; but the back end, to generate object code, would be a software company job.

      I remember using Coral 66 on a project running on Intel 8086 hardware. When a procedure (subroutine, if Fortran is your thing) was called, the calling code put parameters onto the stack in one order, and then the procedure code picked up those parameters and re-stacked them in reverse order. That was the result of the front-end / back-end situation I mention above. It slowed down the code, of course; not a good idea in a real-time system.

      The company management was not interested in technical matters like that (and the company no longer exists). So they did not follow my advice to get the compiler fixed. Instead they encouraged the over-use of procedures coded in assembler to avoid the speed problem. There were many errors in that assembler code.

    6. Blackjack Silver badge

      Re: No love for CORAL 66?

      And then came the eighties and those shiny new 286s were running MS-DOS 2.0 and BASIC... in the accounting department.

      If you wanted to store something digitally in the eighties it was madness: tapes, cartridges that didn't work in your NES, hard disks and floppy disks.

    7. Ken Moorhouse Silver badge

      Re: No love for CORAL 66?

      My first serious programs were written in CORAL 66, running on GEC4080 series minis. Located at the Cobourg Street control centre, these were used to monitor the Northern and Victoria lines and provide train descriptions for the lines to the public via the dot matrix signs now seen everywhere.

      1. chrisw67

        Re: No love for CORAL 66?

        Late 1980s, GEC 4000 machines with OS4000 and CORAL-66 cross-compiling for an airborne real-time acoustic processor (AQS-901 a version of the Marconi-Elliott 920 ATC). The good ol' days indeed.

    8. IceC0ld

      Re: No love for CORAL 66?

      Arguments over bracket syntax, tabs versus space and to semicolon or not are really a sign that nowadays there is very little to complain about.


      if you think a little thing like that is going to stop ME from complaining :o)

    9. Mips

      Hey; thought this was about ALGOL


      At college in ‘68: Elliott 803B. Wow! 8k memory; compilers on tape deck. Ran Elliott Autocode FORTRAN and the beloved ALGOL.

      Yes, it was that good in them days.

  2. John Thorn

    .. never used .. ?

    I wrote my first Algol program in 1965. I can't tell you what for as I think I'm still bound by the Official Secrets Act.

    On the same theme I recall a cartoon when PL/I (remember that?) was launched. Mummy COBOL and Daddy Fortran are crooning over their new child PL/I. Driving off in the background is the ALGOL milkman.

    1. Simon Harris

      Re: .. never used .. ?

      When I was studying Electronic Engineering in the early 1980s, ALGOL was the first language we were formally taught - I remember the ALGOL-68R language guide was a Ministry of Defence book.

      1. Redigloo

        Re: .. never used .. ?

        I still have my copy of the Algol68-R Users Guide. It cost £1.

        1. Arthur the cat Silver badge

          Re: .. never used .. ?

          I still have both the yellow 1971 first edition and green 1974 second edition of the users guide, plus a library manual circa 1974/5, both much used(*). They're surprisingly small compared to modern language manuals.

          I also (mostly) remember an evening's drinking with Ian Currie at a conference.

          (*) To seriously get into greybeard one-upmanship, I also have a draft copy of the blue Smalltalk 80 book printed by Adele Goldberg on a Parc laser printer.

          1. Anonymous Coward
            Anonymous Coward

            Re: .. never used .. ?

            "They're surprisingly small compared to modern language manuals."

            Reminds me of the BCPL book, smaller than K&R but still includes the source code to a compiler for the language itself.

        2. ICPurvis47

          Re: .. never used .. ?

          So do I. It was issued to me (free) in 1968 by my apprenticeship employers, Ford Motor Company, so that I could learn to program the Elliott 803 at our college (Rugby College of Engineering Technology). The next year the college upgraded to an Elliott 1603, with a massive 16k of RAM, and replaced the card punches with tape punch teletypes. We students were never allowed anywhere near either of the computers: it was all done on a batch system called George, and the two (later four) acolytes were locked into the computer room with just a small hatch to communicate with us.

          In my final year as an undergrad, we were treated to a Fortran IV upgrade, which I used until I finished my postgrad studies, by which time we were part of Lanchester Polytechnic, which is now Coventry University.

          In parallel with all this, our Apprentice Training department at Ford's taught us Timesharing Basic for use on the remote teletypes at various Ford locations, which stood me in good stead when I joined GEC Industrial Controls, as I had a running start over other new employees when it came to using that same system at GEC.

          1. Richard Plinston

            Re: .. never used .. ?

            > a batch system called George

            GEORGE (GEneral ORGanization Environment) was an ICL 1900 batch operating system with various versions from 1S to 4. I doubt that it had anything to do with Elliott machines.

            (I actually joined I.C.T., one of the companies that formed ICL).

          2. PeterO

            Re: .. never used .. ?

            "The next year the college upgraded to an Elliott 1603, with a massive 16k of RAM,"

            Maybe you mean an ICL 1903 , because Elliotts never made a machine called a "1603".

            1. spold Silver badge

              Re: .. never used .. ?

              Yes would have been an ICL 1902 or 1903.

              I used it on the 1902.

      2. Andy A

        Re: .. never used .. ?

        Yes, we had to learn a programming language too. In those days it was Algol 60, on an ICL4130. Got hooked, and changed courses.

        A whole working lifetime in computers, and many more pints, later...

    2. Peter Gathercole Silver badge

      Re: .. never used .. ?

      Strictly speaking, it was PL/1 ("Pea El One"), although the 1 was often written as an "I", as in the Roman numeral. But I get a bit upset when someone pronounces it as Pea El Eye, which people are prone to do.

      But yes, it tried to be all things to all people: a scientific language, a business language, a control language and, in some of its incarnations (like PL/C, which I used when learning PL/1 as a formal language in 1978), a teaching language.

      It had many unusual features. The one I found most interesting was implied loops in I/O statements, which allowed whole or even part arrays to be written out in a single PUT statement.

      The other language I was formally taught was APL (literally A Programming Language) of which I used to say (somewhat repetitively) "It's all Greek to me!"

      Neither of them helped me with my first job, which was as an RPG2 programmer! Thank goodness I had taught myself C while at University. And I had no problem teaching myself Pascal at my second job.

      1. Phil O'Sophical Silver badge

        Re: .. never used .. ?

        implied loops in I/O statements that allowed whole or even part arrays to be written out in a single PUT statement.

        Borrowed from Fortran, I remember using those!

      2. Anonymous Coward
        Anonymous Coward

        Re: .. never used .. ?

        "[...] I had taught myself C while at University. And I had no problem teaching myself Pascal [...]"

        Someone once made an observation along these lines:

        "When a Pascal program finally compiles it works as expected - a "C" program always compiles."

        IIRC Pascal was the language where a FOR loop's code was always executed the first time through - no matter what the criteria for the number of loops.

        1. oldcoder

          Re: .. never used .. ?

          That was the FORTRAN DO loop, not Pascal, which would skip the body entirely if the terminating condition was already met.
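          The difference is easy to show side by side in a Python sketch (both functions are invented here): the pre-test shape Pascal used against the bottom-test shape of old FORTRAN 66 DO loops.

```python
# Pascal's FOR (like modern loops) tests at the top: zero iterations is
# possible. FORTRAN 66's DO tested at the bottom: the body always ran
# at least once, even with the bounds already exhausted.

def pretest_loop(start, stop):      # Pascal-style: FOR i := start TO stop
    count, i = 0, start
    while i <= stop:
        count += 1
        i += 1
    return count

def posttest_loop(start, stop):     # FORTRAN 66-style DO
    count, i = 0, start
    while True:
        count += 1                  # body runs before any test
        i += 1
        if i > stop:
            break
    return count

print(pretest_loop(5, 1))    # 0 -- body skipped entirely
print(posttest_loop(5, 1))   # 1 -- body executed once regardless
```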

      3. Fustbariclation

        Re: .. never used .. ?

        APL is a wonderful language - I even, for a time, had an APL terminal and keyboard in my garage for my HP 3000 machine that had an interpreter.

        I think it makes the ideal teaching language, because it is so intuitive.

        There's a brilliant on-line site where you can try out APL:

        1. ssieler

          Re: .. never used .. ?

          HP3000 APL was wonderful, it even had a keyword version (if you lacked an HP2641)

          It was, however, the slowest APL I've ever seen.

          Allegedly, HP kept it alive until a lawsuit over its unsuitability was closed, then they dropped it.

          I'd love to get an HP2641 terminal!

          If anyone is interested in getting HP3000 APL up and running on the Classic 3000 simulator, let me know. I'm easily findable.

        2. Graham Cobb Silver badge

          Re: .. never used .. ?

          I was using APL at IBM: two or maybe three different versions on different systems. Mostly it was on a 5100 (which also ran BASIC but we only used the APL mode) and APL/SV on a timesharing system. I think I also used APL2 on a VM/360. But it was a long time ago!

          Interestingly enough - this job was nothing to do with the academic and engineering computing APL excelled at (with the inbuilt matrix and vector operations). My job was as a programmer in a sales office selling typewriters and photocopiers. The products this office sold had nothing to do with APL or even computers at all - I was employed to write programs that could be used to analyse sales statistics, create reports and create letters to send to customers with special offers.

          It would probably have been better to write most of these in PL/1 or RPG II. But learning and using APL was great fun. I even played with j (as a hobbyist) for a while later on to try to recreate that time.

    3. spold Silver badge

      Re: .. never used .. ?

      Algol 60 on an ICL 1902 around 1980 here. How many freaking errors did you get because of missing semicolons?;

      As for PL/1, IBM also had its own extended versions (confidential for some reason), used for internal mainframe code development, called PL/AS and PL/DS.

      p.s. ADA anyone?

      1. JohnGrantNineTiles

        Re: .. never used .. ?

        One of the good features of CPL was that the semicolons were optional.

    4. Kobus Botes

      Re: .. never used .. ?

      @ John Thorn

      ...PL/I (remember that?)...

      That takes me back a while. The University where I studied had a Univac 1110 and the second high level language we learned (BASIC being the first) was PL/1 (I still have my PL/1 manual as well as an operating manual for the Univac).

      After that came Pascal (the neat thing about it was that the whole syntax fitted on a landscape A4 page).

      Assignments meant all-nighters (especially if you left it to the last week) as we only had 16 (I think) punch card machines available. The hierarchy for access to said machines ran from post-graduates (Master's degrees) all the way down to the lowest level, namely first year students.

      Then you had to queue for the hopper and wait an hour or more for your job to be printed. Most embarrassing was when you had a loop that did not break out, as you would have to phone down to the computer room and ask for your job to be stopped, as it produced page upon page of errors and that IBM printer was really fast.

      Later on we progressed to a mini computer (ISTR it was a PDP-11). At least you had a keyboard and (amber) monitor to enter your code (writing a compiler in machine code at the time). Problem was that the machine had eight terminals and if all of them were occupied it could take up to a minute before whatever key you pressed made it to the display; not good if you used ed (or edlin? - memory is fading with age).

      The best of times, the worst of times.

      (Icon for the lack of keyboards we wished we could have, rather than the punchcard machines)

      1. Kobus Botes

        Re: .. never used .. ?

        Replying to my own post:

        Some of my maligned memory cells have taken umbrage at my slanderous suggestion that they are fading as I was just reliably informed that the mini computer was in fact a DEC VAX (brand new at the time).

        Second correction: we wrote in assembly, not machine code (although we had to write a couple of small programs in machine code).

      2. Stoneshop

        Re: .. never used .. ?

        not good if you used ed (or edlin? - memory is fading with age).

        Ed. Edlin was an MS-DOS text file mangler.

    5. Bob Carter

      Re: .. never used .. ?

      I was taught Algol at Uni in 1975, and I still have a copy of my project on paper tape in the attic. I have been meaning to create an Arduino based paper tape reader just to see what is on it!

    6. Anonymous Coward

      Re: .. never used .. ?

      I remember learning ALGOL, APL, PL/1, FORTRAN, COBOL, and 360 Assembler in '69-'73 at my (US) University (after Autocoder for the 1401 and BASIC in high school).

      Autocoder was end of life before I learned it.

      BASIC was incredibly limited.

      PL/1 was just awful.

      APL was fascinating and powerful but unusable.

      ALGOL was elegant but not as useful as FORTRAN.

      FORTRAN was more easily written and maintained than ALGOL.

      COBOL was a reliable pack mule and about as pretty as one.

      360 Assembler was fantastic for understanding computers but not much else.

      Once I got a job, the only languages I was ever paid to work on were FORTRAN (a bit), COBOL (a bunch), and 360 Assembler (for core dumps).

      1. davebarnes

        Re: .. never used .. ?

        Let us not forget the APL keyboard.

        1. Antron Argaiv Silver badge

          Re: .. never used .. ?

          When I was at Data General in the 80s, I modified their D200 terminal firmware to handle and display the overstrike characters required by APL. To my knowledge, this was the first non-storage tube display to be able to do that.

          I think they might have sold 10 of them. APL was never more than a niche product for DG.

          1. Juanca

            Re: .. never used .. ?

            I made something similar with a DEC VT 100.

            1. oldcoder

              Re: .. never used .. ?

              VT100 became the ANSI standard terminal. Mostly because everybody tried to replicate it.

              And still try - most terminal emulators include a VT100 emulation.

          2. Waryofbigbro

            Re: .. never used .. ?

            That D200 with the APL PROM and APL keyboard was great. I worked for DG and was responsible for marketing APL and other DG languages in Australia. I met Ken Iverson at an APL conference and he said that DG's APL interpreter was the finest implementation he had seen.

            Back to Algol - before the Nova even had a disk-based OS there was SOS - the stand-alone operating system. SOS supported the 8K paper-tape Algol compiler which was probably the only other high-level language other than BASIC on that machine in the late 60s.

    7. bombastic bob Silver badge

      Re: .. never used .. ?

      Back in the (very) late 70's at a university, my PDP-11 assembly language class was taught by a professor that LOVED ALGOL. Brilliantly, he required all assembly language assignments to be accompanied with a "high level language" pseudo-program. Of course, it had to at least LOOK like ALGOL or you'd get a LOT of uncomplimentary comments when he returned it to you. I resisted but it was futile. So I started handing in the ALGOL-like pseudo code along with the homework assignment [which always worked].

      But the focus of that class wasn't assembly language per se; rather, it was on structures and lists and other things that languages _like_ ALGOL (and also C) are PERFECT for. I do not recall what kind of structure and/or pointer support you had in ALGOL, but I think there was at least _something_. 'C' of course took this concept, ran with it, and basically INVENTED how it's done from that point forward.

      NOTE: I've also had to work with FORTRAN 'EQUIVALENCE' and 'COMMON' blocks which indirectly accomplish the same thing, in a 'hobbled' kind of way. Pure assembly is NEARLY as 'hobbled' though. On the PDP-11, and with good x86 assemblers, you can specify an array index as a register offset and 'sort of' look like an index, and maybe 'struct.member' as an offset as well. But old assemblers had name size limits, and you had to get very creative with names....

      1. Anonymous Coward
        Anonymous Coward

        Re: .. never used .. ?

        "[...] pointer support you had in ALGOL, but I think there was at least _something_. 'C' of course took this concept, ran with it, and basically INVENTED how it's done from that point forward."

        A "C" programmer proudly showed me how he used pointers in "doped arrays" to make techniques like array sorting faster. Ah - the light dawned - the same technique used in assembler on KDF9 in the early 1960s.
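        The same doped-array idea sketched in Python (the record contents are made up): sort a small array of indices into the records instead of shuffling the big records themselves.

```python
# "Doped array" sorting: rearrange cheap references into the records,
# not the records themselves, so the large payloads never move.

records = [
    {"name": "KDF9", "payload": "x" * 1000},
    {"name": "VAX",  "payload": "x" * 1000},
    {"name": "8086", "payload": "x" * 1000},
]

# Only this small index array is rearranged by the sort.
order = sorted(range(len(records)), key=lambda k: records[k]["name"])
print([records[k]["name"] for k in order])   # ['8086', 'KDF9', 'VAX']
```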

    8. Stoneshop

      Re: .. never used .. ?

      First computer programming course in University (1977) was in ALGOL-60, the next one in Pascal. That one stuck for quite a while, with Borland's TP being available for PCs (and their 'treat it like a book; you can make multiple copies, but use only one at a time' license) so offering you a cheap-ish, personal environment. Started using Turbo C a couple of years later, but kept going back to TP for bigger programs, although all that were just hobby projects. Professionally it was mostly VMS command files, Unix/Linux shell scripts and more recently some Python.

      1. Fustbariclation

        Re: .. never used .. ?

        Turbo Pascal was brilliant - particularly its object-orientated add-ons.

        I wrote a simulator for an X.25 network using it, and it made the job really easy - I just made each switching node an object and you could construct large networks with ease, simply by creating a new node object and linking it to its neighbours. Brilliantly easy.

        It worked so well that the figures I produced for new nodes being rolled out on the network we managed were so accurate that, when one wasn't performing within the time my model predicted, they called the Bundespost PTT to complain, and had the line fixed until it matched my figure, rather than complaining, as I'd expected, to me, that my numbers were wrong.

        It was all a mistake, really. At a meeting our customer asked if we could produce projected performance figures, and, without thinking, we foolishly agreed - both my colleague and I thinking the other had a solution.

        We should really have gone back and explained that predicting terminal performance was impossible - the maths boffins assured me that it was, because queuing theory wasn't good enough for a multiple node network. Still, I had my newly purchased Turbo Pascal, and thought it worth a try. It was hard work, but I had the figures to them within ten days.

        1. Anonymous Coward
          Anonymous Coward

          Re: .. never used .. ?

          "[...] the maths boffins assured me that it was, because queuing theory wasn't good enough for a multiple node network. "

          I remember a debate in the Cross Keys one lunch time. Our guru Tom explained that when queueing theory became relevant in our comms front end processor - then it was time to upgrade to a more powerful machine.

  3. John Thorn

    There was one particular feature of Algol that made for some interesting programming..

    If a procedure was called with an expression as one or more of the parameters the expression was re-evaluated every time the parameter was referenced in the procedure (not just as the procedure was called). Nest that down a few levels and debugging was a nightmare.

    1. coconuthead

      "Call by name".

    2. BobAllen

      That’s the 'pass by name' feature; however, it’s optional—pass by value is also available.

      1. John Thorn

        No, it's something different....

        Call Proc(A+B)

        Print C ..... ALGOL answer will be 4 - in other languages it will be 3

        1. Ken Hagan Gold badge

          I think OpenSCAD still has that feature. It sucks. I think it is a bug arising from someone not really thinking through the possible failure modes of a two-pass interpreter.

          1. Yes Me Silver badge


            It wasn't a bug in Algol. It was meant to work that way. If you read up on why Niklaus Wirth designed Algol-W and Pascal the way they were, I'm sure you'll find pithy comments on this.

            Even more fun in Algol 60 was passing expressions including functions by name as parameters to other functions. Like A := proc1(proc2(A+B), proc2(A)). That wasn't liable to cause unintended side effects; it was side effects.

            Few compilers supported that feature, because it was really interpretative. Burroughs Algol created run-time entities called "thunks" to handle it, which I guess were mini-interpreters built into your compiled code, with hardware assist according to the only Google reference I could find.
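            A thunk can be sketched in Python as a parameterless closure that the procedure re-evaluates at every reference (the names env and proc are invented for illustration):

```python
# Call-by-name via a thunk: the argument expression A+B is wrapped in
# a closure and re-evaluated each time the parameter is referenced.

env = {"A": 1, "B": 2}

def proc(p):
    env["A"] = 10       # side effect inside the procedure...
    return p()          # ...changes what the by-name parameter now yields

result = proc(lambda: env["A"] + env["B"])   # pass A+B "by name"
print(result)           # 12, not 3: A+B re-evaluated after A changed
```

            The hidden procedure call on every reference is exactly why thunks were expensive at run time.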

            1. ssieler

              Re: Wirthless

              Yes, Burroughs ALGOL supported call-by-name (and call-by-value and call-by-reference).

              The "thunks" weren't interpreted...they were real machine code ... but the accompanying cost of (effectively) a procedure call/procedure exit made such code relatively expensive, probably why the compiler writer(s) drew attention to it for you.

              I learned Burroughs ALGOL in 1970, on a B6500 (which became a B6700 shortly thereafter) ...

              absolutely wonderful language. It had features we still lack in "modern" languages ... much of what Burroughs invented (and, later, HP with the HP 3000) is now lost, sadly.

              1. HailHenry

                Re: Wirthless

                Not completely lost! Burroughs (now Unisys) still provides mainframe platforms running the MCP operating system, which is itself written in an ALGOL dialect. As is most of the environmental software and elements of many of the applications running on MCP platforms. You can request a download of a free, not for business use, full copy of the MCP platform environment (known as MCP Express) here - MCP nowadays runs on Intel Xeon-based platforms (proprietary hardware platforms haven't been available for some time now), and MCP Express will run on a variety of Windows desktop OSs.

                1. Ian Joyner Bronze badge

                  Re: Wirthless

                  Yes, Burroughs system languages beat C hands down. C looks like the toy it is in comparison. And MCP makes Unix look like a toy.

                  Burroughs machines are secure – they enforce boundary protection at the smallest level: none of the pointer-overrun or index-out-of-bounds junk of the weak C language. Bounds protection is done at the hardware level, so it cannot be subverted by C or assembler as on other systems.

      2. Mike 16


        The problem with any code that one inherits is that the "options" were exercised by the original programmer (or the boss's nephew who "improved" the code). They can only be changed by the current maintainer when the ramifications are completely understood. Good luck with that.

        I have written very little Algol 60, but it was the subject of a compiler class, early 1970, and the experience of implementing Call-by-name left me with some useful techniques, and some scars.

        As for crap-code from compilers, a friend and I implemented a "code cleaner" for the assembly language output of a then considered very good C compiler, known for producing fast code. Sometimes not so fast, and I'm not sure it was a good trade-off against _correct_ code.

        1. Anonymous Coward
          Anonymous Coward

          Re: Optional?

          "Sometimes not so fast, and I'm not sure it was a good trade-off against _correct_ code".

          Fast - good - cheap - pick any two (if you're lucky - sometimes you don't get any).

    3. rafff

      Pass by name

      Nowadays that is called a lambda and is supposed to be the dog's wotsits.

      Tinkering with a component of a parameter inside the called procedure was called Jensen's Device and could be very useful - if kept under control.

      e.g. thingy(a[i], i)

      if 'i' is altered inside 'thingy' then you get a different array element. Happy days.
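      Jensen's Device can be mimicked the same way in a modern language by passing both the index assignment and the subscripted expression as closures. An illustrative Python sketch, not Algol; `sum_by_name` and the mutable cell `i` are invented stand-ins:

```python
# Jensen's Device sketched with closures: 'term' is re-evaluated for each
# value of the index, so summing a[i] over i works "by name".
def sum_by_name(set_i, lo, hi, term):
    total = 0
    for v in range(lo, hi + 1):
        set_i(v)         # altering 'i' inside the called procedure...
        total += term()  # ...makes the expression denote a new element
    return total

a = [0, 10, 20, 30]  # a[1..3] = 10, 20, 30
i = {"v": 0}         # mutable cell standing in for the Algol variable i

total = sum_by_name(lambda v: i.update(v=v), 1, 3, lambda: a[i["v"]])
print(total)  # prints 60
```

      The controlled mutation of `i` inside the callee is the whole trick: the same expression names a different array element on each mention.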

      I also wrote a Coral 66 compiler. It really was a godawful language. Not only was I/O undefined, arithmetic was too.

      1. Anonymous Coward
        Anonymous Coward

        Re: Pass by name

          Aren't variables from the enclosing scopes supposed to be immutable in lambdas? Or is that implementation dependent?

    4. gnasher729 Silver badge

      That feature is available in Swift as “Auto closure”. However, you are supposed to evaluate such a parameter 0 or 1 times. Used to implement && and || as functions in the standard library, not in the compiler. Very useful for assertions which are not evaluated in a release version. And supported by a clever compiler that can inline everything.
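      The 0-or-1 evaluation pattern behind such auto closures can be sketched outside Swift too; in this Python illustration, `and_then` is an invented stand-in for a library `&&`:

```python
# Short-circuit AND as a plain function: the second operand arrives as a
# thunk (an "auto closure") and is evaluated zero or one times.
def and_then(a, b_thunk):
    return b_thunk() if a else False

calls = []

def expensive():
    calls.append(1)  # record that the thunk actually ran
    return True

r1 = and_then(False, lambda: expensive())  # thunk never evaluated
r2 = and_then(True, lambda: expensive())   # thunk evaluated exactly once
print(r1, r2, len(calls))                  # prints False True 1
```

      The same shape lets assertions cost nothing in a release build: the thunk is simply never called.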

  4. dbayly

    ALGOL lives !

    I learnt ALGOL 60 on an Elliott 503 in Oz in the 60s, and moved on from there to a long career of systems programming in ALGOL variants on the Burroughs machines, then Unisys A series. The OS is still written in an ALGOL variant; all the compilers are ALGOL programs, including the ALGOL compiler.

    And the I/O conundrum was solved handily by Knuth, who wrote an I/O library (largely in ALGOL) quite early on.

    ALGOL lives still, though is a rare skill these days

    1. Anonymous Coward
      Anonymous Coward

      Re: ALGOL lives !

      "ALGOL lives still, though is a rare skill these days".

      Rather like classical Greek and Latin.

      1. FarnworthexPat

        Re: ALGOL lives !

        Better not tell BoJo or he'll be spouting Algol at PMQs

  5. Anonymous Coward
    Anonymous Coward

    "notions like beauty and elegance in mind for the language"

    You can still see that difference in languages from Europe compared to languages from US...

    1. Warm Braw

      Re: "notions like beauty and elegance in mind for the language"

      Sometimes those minds can be too twisted for their own good! [PDF]

      Still, it could be worse.

      1. Scott Wheeler

        Re: "notions like beauty and elegance in mind for the language"

        Chef is potentially useful. I sometimes draw simple Gantt charts when I'm cooking multiple things, and they could perhaps be produced automatically from Chef specifications. Even add in a bit of blur so that you're not unloading five pots and pans at the same time.

    2. aberglas

      Re: "notions like beauty and elegance in mind for the language"

      Which is why Algol 68 was replaced by ... C :(

    3. Anonymous Coward
      Anonymous Coward

      Re: "notions like beauty and elegance in mind for the language"

      Ditto - OSI v TCP-IP

  6. Anonymous South African Coward Bronze badge

    Started to program in Turbo Pascal in the early 1990's - and did Turbo C, and Assembly Language as well.

    Then I switched over to a Network Engineer course, and am now a BOFH.

    Sadly, the programming skillz are not much anymore. Maybe it is time for a refresher and revival...

  7. steamnut

    It started my career move...

    We had an Elliott 803 at Rugby College (later absorbed into Lanchester Poly). Although I was studying Applied Physics, the computer part of the course fired me up and I went into computing from that point on. I remember the console speaker that, although abused to play music, did give you a sense of what your program was doing. Later on, in my 6800/8008/8080 hobby days, I used a transistor radio for the same thing.

    There were two no-no's: The first was to remember how little memory you had to play with so large arrays were not possible and, if attempted, resulted in a subscript overflow message (iirc - SUBSCROFLO). The second was to make sure your plotter programmes completed. The plotter involved an extra paper tape load for the operator. If your plot failed then the whole machine had to be restarted. The plotting code was probably an early form of overlay.

    Before we had magnetic tape installed the paper tape reader was something to behold. The output from the reader had to be caught in a basket as the speed was so high. The computer operator was a very smart young lady too.... Happy days. ;-)

    1. PeterO

      Re: It started my career move...

      The Algol Plotter Package (Library Tape P104) was a precompiled tape which contained the output of pass one of the compiler (called Own Code) and various bits of compiler state. It was a binary dump of the Own Code, so it loaded faster than running pass one on the corresponding source code. Your source was then read in and its Own Code appended to that already in store. Pass two then read the combined Own Code to produce the executable in core.

    2. YAAC

      Re: It started my career move...

      I used that machine in 1970. We lowly engineering students weren't allowed anywhere near the machine so we prepared our stuff in the teletype room next door. Submit the tape, come back tomorrow, fix the errors - rinse and repeat.

    3. Lars Silver badge

      Re: It started my career move...

      Hello steamnut,

      So did I, in 1968 - what about you? I still have my diploma; happy days indeed.

      And the girl, the one with the gorgeous legs I presume.

    4. CABVolunteer

      Re: It started my career move...

      "Before we had magnetic tape installed the paper tape reader was something to behold. The output from the reader had to be caught in a basket as the speed was so high."

      My boss may have designed that paper tape reader!

      When I joined the internal management consultancy division of a large multinational, my boss retired in 1981 and I had to weed his many filing cabinets. Many papers went to the organisation's international archives since they documented the implementation of computing in the subsidiary companies worldwide, but I also found some of his personal memorabilia amongst which were the engineering drawings for paper-tape readers and punches - it transpired that he'd been the chief design engineer for peripherals for Ferranti. From my recollection of the project documentation, once the switch from mechanical to optical sensing on paper tape readers had been made, the primary problem was the dynamics of paper tape as it was fed from the reel through the sensor head and out to a collection device (bin or take-up spool).

      In that department, I also had other senior colleagues retiring in the early 1980s. One had been in the core team which conceived and implemented the first commercial use of computers in the UK at Lyons Corner Houses. Another had been a Post Office engineer who described to me how he had built a computer from telephone switching relays - you can join the dots about where he worked when you realise he was in his twenties in the 1940s.....

      So many of the senior IT management back in the eighties were hardware engineers - nowadays, it seems to be all software.

      1. PeterO

        Re: It started my career move...

        The Elliott 500 and 1000 char/sec tape reader design originally came from Cambridge University Mathematical Laboratory's EDSAC team.

        See page 60.

      2. oldcoder

        Re: It started my career move...

        Frequently, the problem with such readers was that they were faster than the ability to transfer the byte just read to the host, and the result caused data overrun errors in the line.

        I had to slow one down to 10 CPS or the read would fail.

    5. JohnLH

      Re: It started my career move...

      I also learned ALGOL programming on that 803 as a 5th form school student in Rugby. That would have been in 1965/6 probably. Apart from tunes on the console speaker I remember being told that the shortest valid Algol program was a single semicolon, but it compiled to many yards of paper tape so we were warned against it.

      Surely the teletype tape reader on Colossus was using optical reading in WW2? After all, optical sound tracks were ubiquitous on talkie films before the war so the technology was well known.

      I also remember looking at the Algol 68 User Guide edited by the great Philip Woodward and published I think by the Royal Radar Establishment. IIRC it starts "Begin comment Chapter 1...", and is presented in the form of comments to a program that calculates the date of Easter for any year.

    6. ICPurvis47

      Re: It started my career move...

      Hi Steamnut, you must have been at RCET at about the same time as I was, my nickname there was "Prof". Did we meet?

  8. anthonyhegedus Silver badge

    I've used Algol-60 at school

    I did my AO-Level Computer Studies in 1983 or 1982 (can't remember!) and I did my project in Algol-60. Because that's what we ran on our Research Machines 380Zs. It was a program to solve the N-Queens problem, using the process of making different permutations of the numbers 1-N, and I did it using recursion (because I could). Little did I realise at the time that it was really out of date, and there's no reason we couldn't have done it in Pascal or C. But that's all we had.
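    The approach described (recursively generating permutations of the columns and rejecting any placement that shares a diagonal) can be sketched in Python; this is a reconstruction of the idea, not the original Algol 60 program:

```python
# N-Queens via recursive permutation building: a permutation of columns
# guarantees no two queens share a row or column, so only the diagonals
# need checking as each queen is placed.
def queens(n):
    solutions = []

    def place(cols):
        row = len(cols)
        if row == n:
            solutions.append(tuple(cols))
            return
        for col in range(n):
            if col in cols:
                continue  # column already used
            if any(abs(col - c) == row - r for r, c in enumerate(cols)):
                continue  # shares a diagonal with an earlier queen
            place(cols + [col])

    place([])
    return solutions

print(len(queens(6)))  # prints 4 (and n=8 gives the classic 92)
```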

    1. Alan J. Wylie

      Re: I've used Algol-60 at school

      Me too.

      Probably 1973/4. We wrote our programs out on coding forms, making sure that our 0/O and 1/I were distinguished (I can't remember any more which one had a slash through it). It was sent off to, IIRC, Lancaster Uni, where it was typed onto punched cards and put through the batch system. A week or two later, we got the output. It took a long time to get anything that would compile. I think mine did run eventually, busily calculating digits of pi until it hit its CPU limit.

  9. Antron Argaiv Silver badge
    Thumb Up

    Ugh! PASCAL

    We used it in school (UMASS/Amherst) as a teaching language. Too "wordy" for me, especially the IO statements. But it did get the job done.

    When I started working, it was at Data General, who had an Algol-like language called DG/L, which I absolutely loved. It was my language of choice for little utility programs on our AOS and AOS/VS systems.

    Then, of course, came the Sun workstations, with UNIX and C.

    Game over.

    Bonus for the CALCOMP drum plotters. I worked part time at the computer center in school; we had one (3 or 4 feet wide). The number of aborted plots due to pen failure was astounding, as was the pen budget!

    1. Phil O'Sophical Silver badge

      Re: Ugh! PASCAL

      Bonus for the CALCOMP drum plotters

      I remember a friend at uni mapping out the Adventure Colossal Cave and having it plotted on one of them. It was on the wall of the terminal room for quite a while.

    2. Anonymous Coward
      Anonymous Coward

      Re: Ugh! PASCAL

      "It was my language of choice for little utility programs on our AOS and AOS/VS systems."

      In support on KDF9 EGDON 3 I had a choice of two languages to write ad hoc utility tools. Assembler took too long to get working - it was surprising how you could bend Fortran to that task.

    3. Ian Joyner Bronze badge

      Re: Ugh! PASCAL

      C – game over? If you like weak toys. C has too many flaws. It is not a solid language like ALGOL.

      1. oldcoder

        Re: Ugh! PASCAL

        C is a VERY solid language.

        Which is why it is used to implement most operating systems now.

  10. coconuthead

    Algol 68 is not ALGOL 60

    Algol 68 is basically a different language from Algol 60, and the Algol 60 designers, most notably Dijkstra, were less than impressed by it; along with others on the committee they issued a "minority report" disowning Algol 68. Syntax aside, and leaving out the object-oriented stuff, Algol 68 actually bears a distinct resemblance to C++. Algol 60, on the other hand, begat Pascal. So really they are different languages sharing part of a name.

    I never got to use Algol 68 because it was difficult to write a compiler for and nowhere I worked had one. Burroughs Algol (a variant of ALGOL 60), on the other hand, was available. Time pressure, and perhaps a desire not to be seen to know too much about the "old iron", meant I never did write any Burroughs Algol. That was perhaps my loss. I don't think I missed anything by not writing Algol 68.

    BTW Burroughs Algol was implemented in hardware. It was a stack machine with display registers and hardware support for resizing arrays.

    1. JohnGrantNineTiles

      Re: Algol 68 is not ALGOL 60

      The problem with Algol68 was that Aad van Wijngaarden treated it as an academic exercise, valuing elegance over practicality. He regarded Algol68-R as unclean, it having compromised on some of the more esoteric features to make it actually usable for real work.

      People who used it told me one great feature was that most bugs were detected at compile time, i.e. once you got your program to compile it usually worked (unlike C).

      1. MarkMLl

        Re: Algol 68 is not ALGOL 60

        The problem with ALGOL-68 was that when Wirth's ALGOL-W was rejected he threw his toys out of the pram and resigned from the committee. If he'd stayed but voted against the proposed standard then it's likely that a majority of the committee would have followed his lead... it was basically Wirth+Hoare vs van Wijngaarden and a number of his students whom he'd coopted.

        And what exactly McCarthy was doing there is unclear, since he'd already abandoned ALGOL for Lisp.


      2. Anonymous Coward
        Anonymous Coward

        Re: Algol 68 is not ALGOL 60

        "The more I ponder the principles of language design, and the techniques that put them into practice, the more is my amazement at and admiration of ALGOL 60. Here is a language so far ahead of its time that it was not only an improvement on its predecessors but also on nearly all its successors".

        - C.A.R. Hoare, "Hints on Programming Language Design", 1973

        1. Bitsminer Silver badge

          Re: Algol 68 is not ALGOL 60

          I recall a visitor to the university lecturing about an Algol-68 compiler for the IBM 360. This was the mid 1970s.

          He wrote on the chalkboard:

          A = B + C

          And pointed out that it would normally require 12 runtime checks to succeed.

          End of interest.

    2. Roland6 Silver badge

      Re: Algol 68 is not ALGOL 60

      >I never got to use Algol 68 because it was difficult to write a compiler for

      Yes, the really powerful and useful languages are a bugger to write a compiler for.

      Aho and Ullman's "Principles of Compiler Design" was the must-have textbook on the subject (still got my first edition) and explains why. Although it was very helpful in writing a Pascal compiler, for languages requiring bottom-up LR parsing, such as C and Ada, there was much left to the reader...
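      The top-down style that makes Pascal-like grammars the easy case can be seen in a minimal recursive-descent evaluator; this Python sketch (names invented for illustration) handles +, * and parentheses with one function per grammar rule, whereas left-recursive C-style expression grammars push you towards LR techniques or grammar rewriting:

```python
import re

# Minimal recursive-descent (top-down, LL) expression evaluator:
# one function per grammar rule, lookahead of a single token.
def evaluate(src):
    tokens = re.findall(r"\d+|[()+*]", src)
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def eat():
        nonlocal pos
        pos += 1
        return tokens[pos - 1]

    def factor():  # factor ::= NUMBER | '(' expr ')'
        if peek() == "(":
            eat()
            v = expr()
            eat()  # consume the closing ')'
            return v
        return int(eat())

    def term():    # term ::= factor { '*' factor }
        v = factor()
        while peek() == "*":
            eat()
            v *= factor()
        return v

    def expr():    # expr ::= term { '+' term }
        v = term()
        while peek() == "+":
            eat()
            v += term()
        return v

    return expr()

print(evaluate("2+3*(4+1)"))  # prints 17
```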

      1. swm

        Re: Algol 68 is not ALGOL 60

        I wrote a compiler for most of ALGOL 68 about 50 years ago and it was not that difficult. (Not as hard as C or C++.) I didn't do all of transput though. ALGOL 68 had a garbage collector specified which I also implemented (about 1-2 pages of assembly for a compacting GC). Actually, it was quite a logical language. I could do everything with a top-down parser. First pass - find the operators etc. Second pass - do the actual compilation.

        I was at the final meeting of TC2 WG 2.1 (I think) in December 1968 where the language was approved. It was a contentious meeting. Everyone was friends until they got into the meeting room.

        The syntax of the language was generated by a meta-syntax making it hard to understand. Eventually they extended the syntax/meta syntax to even check if identifiers were declared etc.

      2. matthewdjb

        Re: Algol 68 is not ALGOL 60

        I remember attempting a C compiler using that book. Did enough to pass the course but never finished it. It was hard!

        1. Roland6 Silver badge

          Re: Algol 68 is not ALGOL 60

          >I remember attempting a C compiler ... Did enough to pass the course but never finished it. It was hard!

          Don't be too hard on yourself; there was a valid (technical) reason why Borland didn't release Turbo-C :)

          BTW it took a full-time team of 3 a year or so to produce a halfway decent C compiler and code generator that was marketable.

      3. oldcoder

        Re: Algol 68 is not ALGOL 60

        The problem Algol 68 had was that it was defined with a two-level (van Wijngaarden) grammar: in effect, a different grammar generated depending on the context in which each construct was used. Not something easy to do.

        1. oldcoder

          Re: Algol 68 is not ALGOL 60

          That is the same problem C++ has.

    3. Jason Bloomberg Silver badge

      Re: Algol 68 is not ALGOL 60

      I never got to use Algol 68 because it was difficult to write a compiler for and nowhere I worked had one.

      If you have a Raspberry Pi: "sudo apt install algol68g", compile using "a68g filename"

      1. dajames

        Re: Algol 68 is not ALGOL 60

        If you have a Raspberry Pi: "sudo apt install algol68g", compile using "a68g filename"

        Indeed. The same works on any Debian-based distro.

        Algol68g is also available for other systems, including Windows ...

        It's an interpreter, not a full compiler, but it is a pretty full implementation of the language of the Revised Report.

    4. dajames

      Re: Algol 68 is not ALGOL 60

      ... Algol 68 actually bears a distinct resemblance to C++. Algol 60, on the other hand, begat Pascal.

      Algol 60 begat both Algol 68 and Pascal ... and quite a few other languages along the way (Simula, anyone?). There's no "on the other hand" about it!

      C++ was certainly influenced by Algol 68 -- Bjarne Stroustrup says that he would have liked to write an "Algol 68 with classes" rather than a "C with classes" but he realized that if he wanted his language to gain widespread adoption it would have to be based on C -- but that was around 20 years later.

      1. MacroRodent

        Re: Algol 68 is not ALGOL 60

        > Algol 60 begat both Algol 68 and Pascal ... and quite a few other languages along the way (Simula, anyone?).

        I never used Algol 60, but I did do an exercise in SIMULA-67 at the Helsinki University of Technology. It had the same syntax as Algol 60, but added classes with dynamically allocated objects, a garbage collector (like Java decades later...), strings, and a sensible I/O library.

        It actually felt a far more practical language than the Pascal used in earlier courses. Pascal at the time omitted too many real-world features. It was impossible to make a portable program that processed a named file. In fact, making a portable program that read a string from the terminal and printed something in response was impossible, because the INPUT stream was defined to work in a way that only a theorist would love. Every implementation had a different workaround for this, or just redefined the I/O semantics, as Turbo Pascal did.

      2. ccx004

        Re: Algol 68 is not ALGOL 60

        I never used Algol 68 but I spent many happy hours writing in XAlgol on a Burroughs 5700 in CandE (Timesharing). Better still I spent around 5 years (from some time in 1979 to the end of 1984) writing Simula programs on DECSystem-20s running TOPS-20. I still have a copy of "Simula Begin" which I treat as a sort of Bible of Programming. I remember that time very fondly but I suppose the people starting their programming careers now will, in 50 years time, look back fondly on C#, .NET and Visual Studio. Such is life.

    5. martinusher Silver badge

      Re: Algol 68 is not ALGOL 60

      68 was essentially impossible to write a compiler for (because of features like being able to define procedures inside a procedure) but that didn't stop a useful subset of the language, Algol 68-R, being implemented on ICL 1900 series machines. I think the 'R' stood for the Royal Radar Establishment, the group that developed the compiler. It was actually a pretty useful language; I used it to write a program that simulated the operation of a pipeline and found it, by 1970s standards, a very easy language to use. I still have a copy of the language definition (the green book, "Report on the Algorithmic Language....").

      1. MacroRodent

        Re: Algol 68 is not ALGOL 60

        Defining procedures inside procedures was already in Algol 60 and in most languages descended from it. I recall using it enthusiastically in my beginner Pascal programs. In fact, lack of this feature in C and C++ is rather the exception.

        1. Roland6 Silver badge

          Re: Algol 68 is not ALGOL 60

          >Defining procedures inside procedures was already in Algol 60 and in most languages descended from it.

          In the 1990s I used the ability of Algol 68 to handle recursion and pass functions as parameters as an interview question. It quickly weeded out those whose only experience of programming was MS VB et al., as these people just didn't know what you were talking about.
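          For the record, the idiom in question (a recursive function, defined locally, handed to another routine as a parameter) looks like this in Python; the names are invented for illustration:

```python
# A higher-order routine: applies whatever function it is handed.
def tabulate(f, xs):
    return [f(x) for x in xs]

def make_factorial():
    def fact(n):  # nested definition, and recursive
        return 1 if n <= 1 else n * fact(n - 1)
    return fact

# Pass the nested recursive function as a parameter.
values = tabulate(make_factorial(), [1, 2, 3, 4, 5])
print(values)  # prints [1, 2, 6, 24, 120]
```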

  11. Anonymous Coward
    Anonymous Coward

    I used it.

    Once. 45 years ago.

    1. Anonymous Coward

      Re: I used it.

      "God (s)he's old", I thought. Then I remembered - I used it once as well, and it was also 45 years ago. What happened to the years?

  12. dirtygreen

    ALGOL 60 was the first language I learned, at school in the Computer Club. Turn round was a bit longer than ten minutes. We used to write programs on coding sheets, which were then taken to the nearby university and typed up by data prep ladies, run by the operators on the ICL1909, and the coding sheets, pack of cards and printout were returned to us a week later. That did make you concentrate fairly hard on program correctness; my first program calculated primes and I've still got the output somewhere. After a while a friend and I learned that we could get off the bus on the way home, walk up to the university and punch our own cards on IBM 029 punches, and then watch while the operators ran the program. That made things a lot faster and meant we could write bigger programs - my favourite was a linear regression program for the results of our physics experiments. It made them look much more 'official' :)

    And then we discovered the university had a free access PDP-8 so we learned BASIC on DECtape and the joys of typing into an ASR-33 and Friden Flexowriters. And then ...

    ... the university got a copy of the POP-2 compiler. Still my favourite language ever! Programs to synthesise English using Chomsky's grammars etc etc. Machine Intelligence 1, 2, 3 ...

    1. keithpeter Silver badge

      Haskell's grandad?

  13. Stuart 22

    The Curse of the Semi-colon

    Was it Algol that started it?

    Computers were run by the Maths department, who were only interested in getting them to do things in FORTRAN IV. It was quite clear that the end of a Hollerith card was the end of a statement, so why all the bother with semicolons?

    Then the department spawned one of the first UK computing departments who created the MSc on the Theory of Programming Languages and in 1970 Laski & Turner published Program Linguistics. Laski was in love with reverse Polish notation - that's when I decided that a fascination in computer languages for their own sake was not for me.

    I even deserted FORTRAN for Dartmouth BASIC (Nope, nothing to do with the other curse of the amateur programmer - VB). Oooh I can hear the downvoters marching towards me ...

    1. swm

      Re: The Curse of the Semi-colon

      Dartmouth also implemented a version of ALGOL-60 on the LGP-30 (4096 words of 31 bits). I remember the excitement when they successfully passed a procedure to another procedure and successfully called it. I think the runtime was about a second.

      1. Ron Martin

        Re: The Curse of the Semi-colon

        I was a student at Dartmouth from 1963 to 1967. I worked on the time-sharing system based on the GE-225/235 and the GE DataNet 30, and on two succeeding time-sharing systems based on the GE-635 and the GE DataNet 30. The 225 system was the first host of a BASIC compiler, written by John G. Kemeny with guidance from Thomas E. Kurtz. Most of the code I wrote was communications front-end code for the DataNet 30, but I also wrote quite a bit of 235 code. In the summer of 1964, Steve Garland, an earlier Dartmouth grad, returned to campus to write an Algol 60 compiler for the 235 time-sharing system. I wrote the runtime math library, i.e. the transcendental functions, square root, etc. and suggested some formatting options for the non-standardized terminal output facility.

        After the summer, Steve Garland departed the scene, to return again sometime after I graduated. Sarr Blumson was the next to take responsibility for our Algol 60 compiler.

        I wrote a fair amount of code in Algol 60, the most sophisticated of which analyzed the radiation field of an Ernst Radiation Applicator containing an adjustable number and configuration of radium needles, and related the radiation to key feature locations taken from an in-situ x-ray of the applicator. The results were presented as a plot of isodose lines on a teletype printout. The key feature locations were included in the plot, which was to scale such that the x-ray could be overlaid on the printout to aid in interpretation of the results. I used Algol 60 for this project because the code would not have fit in a BASIC program. BASIC programs at that time were limited to 250 or perhaps 256 lines. Algol 60 source code was much more compact and concise than BASIC source code.

        I don't expect many people to ever read this due to the timing, but I decided to add my two cents to the record, anyway.

        1. Ken Moorhouse Silver badge

          Re: I don't expect many people to ever read this due to the timing

          Everyone who contributed to the thread will have it flagged up on their post list. It is always interesting to hear these stories.

  14. disgruntled yank

    Military uses

    For a long time, the US Navy used JOVIAL, otherwise Jules's Own Version of the International Algorithmic Language. It may still, but I haven't looked at job ads in a while.

    1. Anonymous Coward
      Anonymous Coward

      Re: Military uses

      This inspired me to rename the real time kernel I once devised for the 8086 BOVRIL - Berenger's Own Version of a Realtime Industrial Language.

      It was a bit temperamental, make a mistake and your machinery could find itself in the brown stuff, and widely adopted it was not. In fact, nobody else was willing to use it. With hindsight, this was a correct decision.

    2. Primus Secundus Tertius

      Re: Military uses

      I remember hearing a military project manager in America moaning, in about 1996, that the young chaps did not seem interested in learning Jovial.

    3. swm

      Re: Military uses

      Lots of $.

  15. smudge

    My first computer language...

    ... was Algol W, at university in the mid-70s. Sequence, selection and iteration - what more do you need? :) It set the way for how I think about programming, and I found it difficult to think in other ways. For example, we later covered the functional programming language SASL (a precursor of Miranda, which in turn preceded Haskell), and I found it almost impossible to think that way, until one day the penny simply dropped, and I had no problems after that.

    For Raymond, up above, I still have my copy of the MoD "Blue Book" standard for CORAL 66. Used that at university on a CTL mini, and in professional life on Ferranti Argus and CTL/ITL minis.

    My real love was Pascal, and I remember how chuffed I was the day I discovered that although I had been using it for many months, I hadn't realised that our implementation of it did not include GOTO statements! I still have a free implementation of it on my home PC - very useful for occasional puzzle solving.

    Others I used included Fortran IV and BCPL at university, and ASM-86, PL/M-86, and PL/I in professional life.

    But then I became a consultant :)

    1. JohnGrantNineTiles

      Re: My first computer language...

      The KDF9 had two Algol compilers: W, written by a group at Whetstone, ran fast but didn't optimise, used for testing and teaching, while K, from Kidsgrove, was an optimising compiler used for production code.

  16. Doctor Syntax Silver badge

    "When your edit, compile, edit, compile cycle starts to get above about 10 minutes, you start to pay an awful lot of attention to your source code…"

    10 minutes? Luxury. Punched card jobs run in batches. 2 hours turn-round, max 3 runs a day with the compiler losing track after the first error and rejecting every subsequent line. Then you really paid attention to your source code.

    1. Steve 114

      Those of us with nighttime access to the ops in the 'computer room' would collect some chads from the cardpunch hopper, put them into the errant holes in the card pack with a quick wipe of polystyrene cement for security, then try another run.

    2. Anonymous Coward
      Anonymous Coward

      The mainframe operators soon learned that when a System 4 (IBM 360) Assembler run produced thousands of errors after the first few statements it just needed a statement adding and rerunning. IIRC something like a USING directive or a BALR 3,0 establishing the addressing base register.

      The punched card data-prep women also became quite competent at spotting common mistakes. Hence one compiler test source compiled cleanly - when it was supposed to test those error messages.

      On an official training course to learn the System 4 Assembler there was a desk of us who had already had some hands-on practice. The lecturer in Hut K was kept on his toes by our questions. When the set program task was returned from the computer run - he gleefully gave us our failed run listings. We looked at them - then pointed out he had forgotten to include the macro expansion pass. Oops! He then remembered that he always left it out on the first submission to save the expense on his machine time budget. He didn't expect clean compilations from the students.

      1. swm

        I once did an assembler run of a small program and forgot to include the character set card. Caused 2-3 errors per line.

  17. jonesthechip

    Early Uni Computing

    Back in the early 70s Bangor University Computing Department kept what was the previous workhorse 803 in a back room, after migrating to an ICL 4130. The computing department manager would oversee a rigorous test on the 803 operating procedures before allowing any member of the great unwashed student body unsupervised access to said machine. I've still got my copy of the Algol A104 compiler tape in a drawer somewhere. As an introduction into the wide world of programming this was an ideal education in small memory management and code optimisation, useful on my next processor, an Intel 4040. How did the compile procedure go? Power supply to 'on', tape in reader, press the control button to 'read', hit the 'operate' bar, press the control button back to 'normal', hit the 'operate' bar... And listen to the soothing chatter of the loudspeaker connected to the overflow bit... Kids these days, they don't believe you!

    1. PeterO

      Re: Early Uni Computing

      You'll soon be able to relive it all in stunning 3D :-)


      1. jonesthechip
        Thumb Up

        Re: Early Uni Computing

        I'm sure I've got an Elliott Autocode Manual somewhere in the garage... (Wife: When are you getting rid of that junk? Me: You mean my 'resources'?) Didn't understand it at the time, but 45 years of 'real programmer' assembler might help now! And not forgetting 803B 'on-line Algol' sessions which ran for 5-10 minutes - if lucky...

  18. djdel

    Not just Elliotts...

    The National Museum of Computing's ICL 2966, running George 3 and Maximop, also has Algol 60 and Algol 68-R compilers. The Noughts and Crosses and Game of Life demonstration programs are both written in Algol 60.

    ICL's VME operating system is written in S3, an in house language designed specifically for systems programming, and very closely based on Algol 68.

  19. shawn.grinter

    OK, that's it.....

    OK, so I'm officially "old" - I wrote my first programs in Algol 60!

    These days I just use an abacus.

    1. --CELKO--

      Re: OK, that's it.....

      Georgia Tech originally had a Burroughs 5000 machine for its computer department. Since the ALGOL compiler for the Burroughs machines was written in ALGOL, the standard senior project was to extend the compiler to give us something we called GTL (Georgia Tech Language). It was a while before someone got around to actually trying to optimize all of these extensions.

      It's funny you should mention an abacus. Before we had good calculators, I used a Chinese abacus (two beads in the upper section and five beads in the lower section of each column) which could represent hexadecimal numbers.
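
      The arithmetic checks out. A throwaway sketch (in modern Python, obviously not period hardware):

```python
# One suanpan column: two beads above the beam worth 5 each,
# five beads below the beam worth 1 each.
upper_beads, upper_value = 2, 5
lower_beads, lower_value = 5, 1

column_max = upper_beads * upper_value + lower_beads * lower_value
print(column_max)  # 15 -- so each column can show 0..15, one hexadecimal digit
```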

  20. Anonymous Coward
    Anonymous Coward

    "[...] but for those doing science work, the ability to churn out some code to solve a problem and then simply move on to the next was appealing."

    Except that various mainframe models had different numbers of bits in their arithmetic data "words". Consequently floating point precision varied - which could be very important for some calculations. IIRC the EELM KDF9 had 48 bits - or possibly even 96 bits - for floating point operands. Which made it good for scientific users.

    1. Primus Secundus Tertius

      The "bad for science" machine was the IBM 360, where the floating point exponent represented 16**n rather than 2**n. As a result the floating point mantissa sometimes lost three bits. The result was that the single precision floating point was good to only about six decimal digits. Hence the proliferation of double precision floating point on IBM. It was not needed on ICL 190x nor Elliott 803.
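
      The arithmetic behind that six-digit figure is easy to check (a back-of-envelope sketch, not a formal error analysis): the single-precision fraction was 24 bits, but hexadecimal normalisation only guarantees a non-zero leading hex digit, so up to three of those bits can be wasted as leading zeros.

```python
import math

FRACTION_BITS = 24  # S/360 single-precision fraction width
WOBBLE = 3          # hex normalisation can leave up to 3 leading zero bits

best = FRACTION_BITS * math.log10(2)              # ~7.2 decimal digits
worst = (FRACTION_BITS - WOBBLE) * math.log10(2)  # ~6.3 decimal digits
print(round(best, 1), round(worst, 1))
```

      Hence "good to only about six decimal digits" in the worst case.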

      IBM were mainly interested in commercial arithmetic from COBOL compilers. This used binary coded decimal (BCD) arithmetic, which could handle billions of dollars to the nearest cent. COBOL type computational defaulted to BCD, I believe. I was once trying to explain floating point data to a database salesman. I finally got through to him with the phrase computational-type-3.
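
      The packed-decimal (COMP-3) layout itself is simple enough to sketch - two digits per byte, sign nibble last (C for plus, D for minus). A Python toy (the helper name is mine):

```python
def pack_comp3(n: int) -> bytes:
    """Pack an integer as IBM-style packed decimal (COBOL COMP-3):
    one decimal digit per nibble, sign nibble (C=+, D=-) last."""
    sign = 0xC if n >= 0 else 0xD
    digits = str(abs(n))
    if len(digits) % 2 == 0:
        digits = "0" + digits  # pad so digits plus sign fill whole bytes
    nibbles = [int(d) for d in digits] + [sign]
    return bytes((nibbles[i] << 4) | nibbles[i + 1]
                 for i in range(0, len(nibbles), 2))

print(pack_comp3(1234567).hex())  # 1234567c
```

      Exact decimal digits all the way down, which is why billions of dollars to the nearest cent was no problem.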

  21. Andy A

    ALGOL 60 was the first language I learned, in 1973. At Warwick Uni, they reckoned that virtually any science undergrad would need computer skills, so this was a compulsory course.

    They ran an ICL 4130 and a 4120 at the time - designed by Elliott, so presumably somebody at Elliott was an ALGOL fan.

    Never used it in any commercial setting, but it laid the basis for the structure used in any number of languages since. When writing COBOL, I often sketched things out using ALGOL to check things made sense before getting the coding sheets dirty.

    Who remembers Backus-Naur ?

    1. lsces

      Punched card rather than paper tape ... it was a step up from the ICL1900 machine code I'd been using before going to Warwick ...

      1. Andy A

        First actual job was with 1900 kit. Ended up supporting our very own version of George 2.

        So do you remember the significance of 7036875 ?

        1. Richard Plinston

          > So do you remember the significance of 7036875 ?

          No, but 8,388,608 still haunts me!

          Hint: PIC S1(23) SYNC RIGHT.

          1. Andy A

            The official in-all-the-training-manuals method of converting an integer from binary to decimal for printing started by dividing the integer by 1-followed-by-as-many-zeros-as-you-wanted-digits in the result. So if you wanted to print a 4-digit number you would divide by decimal 10,000.

            That gave you a "binary fraction" as a result.

            You then went through the CBD (convert binary to decimal) instruction the relevant number of times - 4 in the example.

            However, the biggest integer in one word was +8,388,607. If you wanted to print that you needed to do double-length division.

            Now some clever sod worked out that if you multiplied an integer by 7,036,875, you ended up with the same answer as if you had divided by 10 million. What's more, ordinary multiplications were faster than ordinary divisions, so if you wanted a 4-digit output, it was actually faster to multiply by the "AMAGIC" constant and then throw away the first 3 digits.
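
            The trick still checks out: 7,036,875 is almost exactly 2**46 / 10**7, so the double-length product of a 24-bit integer and the magic constant, read as a binary fraction, equals the integer divided by ten million. A sketch in modern Python (arbitrary-precision ints stand in for the double-length accumulator; the function name is mine):

```python
MAGIC = 7_036_875  # ~= 2**46 / 10**7, and it fits in a 24-bit word
POINT = 46         # position of the binary point in the double-length product

def to_digits(n, count):
    """Last `count` of the 7 decimal digits of n (n <= 8,388,607),
    using one multiply instead of a double-length division."""
    frac = n * MAGIC                  # binary fraction ~= n / 10**7
    digits = []
    for _ in range(7):                # scaled by 10**7, so 7 digits in total
        frac *= 10
        digits.append(frac >> POINT)  # integer part is the next decimal digit
        frac &= (1 << POINT) - 1      # keep only the fractional part
    return digits[7 - count:]         # "throw away the first few digits"

print(to_digits(8388607, 7))  # [8, 3, 8, 8, 6, 0, 7]
```

            The constant slightly over-estimates 2**46 / 10**7, but the error stays below one unit in the last place for every integer up to 8,388,607, so the digits always come out right.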

    2. CABVolunteer

      "Who remembers Backus-Naur ?"

      I do! Back in the early-1980s, working at the in-house consultancy arm of a multinational, I was on-site doing some tedious task when I was joined by a colleague from our Dutch office. He had even less patience than me, so less than 30 minutes in, he resorted to giving me a formal lecture on Backus-Naur notation for the rest of the morning (and returned to Rotterdam in the afternoon). (When the main Board closed us down the following year, he returned to the University of Leiden. Thank you, Joss - I'll never forget BNF.)

      1. Ian Joyner Bronze badge

        BNF or EBNF is still used to better define language syntax, or you end up with non-context-free messes like C (luckily C is simple, or it would not be predictable).

        Other techniques like denotational and axiomatic semantics are used for semantically defining the effects of language constructs.

        BNF can be used not just for languages but any system designs.

        I even use EBNF to present topic and lecture organisation to students.

        BNF has a somewhat arcane syntax that was cleaned up, simplified and extended by EBNF.

    3. Fustbariclation


      Isn't BNF still useful for representing languages?

      1. Torben Mogensen

        Re: BNF

        BNF is essentially just a notation for context-free grammars, which originated in Noam Chomsky's work on linguistics. While compiler books and parser generators use somewhat different notations, they are still BNF at heart. EBNF (extended Backus-Naur form) extended the notation with elements from regular expressions (but using a different notation), including repetition ({...}), optional parts ([...]) and local choice (...|...).
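
        To make the difference concrete, here is the same toy rule in both notations (my own example, not from any particular standard):

```
<digits> ::= <digit> | <digit> <digits>      BNF: repetition spelt as recursion
digits   = digit , { digit } ;               EBNF: { ... } means "zero or more"
```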

      2. martinusher Silver badge

        Re: BNF

        >Isn't BNF still useful for representing languages?

        Apparently it's not the 'modern' thing to do, at least that was what I was told by some clever so-and-so who was implementing a form of scripting used on a piece of embedded hardware. Judging by the dog's breakfast that resulted -- and the relative ease of specifying a grammar that I needed for a test tool -- I reckon that BNF isn't obsolete, it's as relevant as ever.

  22. Beeblebrox

    Not quite how I remember it

    "you basically march up to the machine, the machine's got the ALGOL system loaded, you run your programme, it produces gibberish"

    When I was in the sixth form, one of my maths teachers had a contact at the local Poly. We were low priority, so we got to leave our precious punched tapes with the operator, who would run them for us in time for us to pick up the gibberish the next day.

    Needless to say I thought computers were complete bollocks.

    A couple of years later when I was at University, and they had these lineprinter style terminals, I started to think wow, maybe there's something in this; soon after I got access to a green screen terminal.

    The rest is history.

  23. RobThBay

    The edit/compile/run cycle

    "When your edit, compile, edit, compile cycle starts to get above about 10 minutes, you start to pay an awful lot of attention to your source code..."

    I remember those days, except I was using punch cards instead of paper tape. Those long turn-arounds forced you to desk check your code and spend time debugging it properly.

  24. IoTrickle

    Fall 1965: Gunn High School in Palo Alto CA... the school district's first 'computer' class (Perhaps one of the first HS classes anywhere?)

    They only had a keypunch machine at the school... and at the first class they admitted they were still looking for a computer to use, as the District had quashed plans to allow its own in-house computer to be used by [gasp] students!

    Next class, they announced they had FOUND a computer very nearby:

    The Burroughs B5500 mainframe at SLAC!!! (Stanford Linear Accelerator Center).

    No problem for students there!!! We ran our class exercises in B5500 ALGOL.

  25. Joe Gurman

    And yet....

    ....FORTRAN marches on, particularly for the physical sciences HPC crowd while ALGOL.... needs an article like this to remind people that it ever existed.

    1. Fustbariclation

      Algol marches on

      Algol marches on in Pascal, Ada, C, C++, Java, JavaScript... pretty well every current language - about the only ones it didn't influence are APL, Lisp, Prolog, RPG, and other less mainstream languages.

      1. MacroRodent

        Re: Algol marches on

        Besides, modern FORTRAN has about as much resemblance to the original 1950s FORTRAN as Algol 60 has to C. It is in practice an entirely different language.

  26. Just A Quick Comment

    Old languages

    Yes, not only do we need to remember the old languages just to prove they existed but, also, to learn from them that sometimes the old ways can have merit.

    1. Ian Joyner Bronze badge

      Re: Old languages

      Not just merit – frequently better than current widespread languages.

  27. sturdy1234

    Early Algol68 Compilers

    In the early 1980s UCLA had an active group of graduate students working on a compiler for Algol 68 for the IBM 360/91 under Robert Ugalus. The book "Informal Introduction to ALGOL 68" was the Bible for the language semantics. The book created its own set of specialized words, like propositity and lapsity, to describe in precise words the language semantics. The underlying memory model was the stack retention model, with a variant called the cactus stack for its threaded nature. The language semantics, referred to in previous posts as European sensibilities, were quite well done. It lacked a machine description in favor of very precise descriptions of the language semantics - very worthwhile reading for those who like reading compiler semantics by jumping into the wayback machine. As Joe Bob Briggs used to say, "check it out".

  28. Mud5hark

    Algol 68 on an ICL 4100

    When at Warwick University in 1974 I was using Algol 68 for my computing project. My maze solver did eventually work, though as mentioned the bug cycle was more like one day, as you had to post your punch card stack into the pigeon hole of the computer suite and wait for them to process it. I found by trial and error that the reason my maze solver never ran correctly on Friday afternoons was that they used to take half of the magnetic memory offline to clean it or something. The largest maze I could create on a Friday was 3 x 3!

  29. Anonymous Coward
    Anonymous Coward

    The Invasion

    The bit that always worried me was how it would cause computers to catch fire and shut down if you gave it an insoluble program, as demonstrated in Doctor Who's 'The Invasion'.

    ...what do you mean, that wasn't real?!

  30. Don Casey

    50 years ago...

    ... as a struggling Computer Science major at The University of California (Berkeley): in one class we were tasked with writing a simple compiler (using SNOBOL) to spit out COMPASS (assembler code for the CDC 6x00-series) code for a set of statements.

    I graduated, and passed the course (somehow), but never got the damn thing to work. After graduation moved into the commercial world and never saw Fortran, Algol, SNOBOL, or COMPASS ever again.

  31. Prophet Heisenberg Uncertainty Principle
    Thumb Up

    First learned programming in ALGOL 60 on an Elliott 803 installed at RMIT in the mid 1960s in Melbourne, Australia. By correspondence; sent in code to be punched into paper tape by RMIT; certainly was a slow turnaround for fixing typos/bugs.

  32. Fustbariclation

    Algol 60 on the HP 2100A

    Well, I've used it -- it was the best language to use on the HP 2100A, because you needed only one paper tape, because Algol 60 was designed to work with a single pass compiler. Apart from it being a better language anyway.

    FORTRAN IV, as it was then, needed two paper tapes. So, even for a little FORTRAN program, you had to load the first-pass compiler tape (then wind it up on its spool), then put your program tape through and get the intermediate tape. You'd then load the second compiler paper tape and feed it the intermediate tape. It would then produce the binary. If anything went wrong, you had to start again.

    Even then, proper design made a huge difference. Declaring your variables first, and declaring functions before you use them, makes good sense anyway, and has the excellent side-effect of enabling one-pass compilation.

  33. timjh

    Algol 60 at school

    In about 1964 a very enlightened teacher at Bedford School devoted a couple of lessons to teaching us the rudiments of Algol-60. Another teacher built a two-bit (literally?) computer from post-office relays; I had to fire up the AC to DC motor-generator (normally used to power the school's ancient film projector in the Great Hall) to meet its power requirements so that I could add 2 to 2 and watch it get 4. All that started me on a career through Fortran, Cobol, PL/I, Assembler, C etc and now, in retirement, PHP, Python, CSS and Javascript.

  34. PeterO

    Rugby College course notes

    For those who have mentioned learning Algol 60 at Rugby this may be interesting :

  35. Torben Mogensen

    The influence of ALGOL 60

    You can see the influence of ALGOL 60 on modern languages most clearly if you compare programs written in 1960 versions of ALGOL, FORTRAN, COBOL, and LISP (which were the most widespread languages at the time). The ALGOL 60 program will (for the most part) be readily readable by someone who has learned C, Java, or C# and nothing else. Understanding the FORTRAN, COBOL, or (in particular) LISP programs would require a good deal of explanation, but understanding the ALGOL 60 program would mainly be realising that begin, end, and := correspond to curly braces and = in C, Java, and C#. Look, for example, at the Absmax procedure at

    FORTRAN and COBOL continued to evolve into something completely unlike their 1960 forms while still retaining their names -- even up to today. ALGOL mainly evolved into languages with different names, such as Pascal, C, CPL, Simula and many others. So ALGOL is not really more of a dead language than FORTRAN II and COBOL 60. There was a computer scientist in the late 60s that was asked "What will programming languages look like in 2000?". He answered "I don't know, but I'm pretty sure one of them will be called FORTRAN". This was a pretty good prediction, as Fortran (only name change is dropping the all caps) still exists, but looks nothing like FORTRAN 66, which was the dominant version at the time. You can argue that the modern versions of FORTRAN and COBOL owe more to ALGOL 60 than they do to the 1960 versions of themselves.

    1. martinusher Silver badge

      Re: The influence of ALGOL 60

      I thought that the notation of Algol predated computers as such - it wasn't so much invented for the computer as an adaptation of an existing mathematical notation to describe operations inside a computer. Algol itself, as you say, isn't dead; it's the base for all block-structured languages and its forms are recognizable in them today (even though the signature feature of a 'new' language is that it doesn't require semicolons or braces or similar lexical quirks).

      I've always felt that there really are only two computer languages -- ALGOL and LISP -- with everything else being based on them. There are also line by line interpreters that pretend they're languages and might even be useful for programming but if the language wants to grow it has to adopt features from either (or both) of these two original languages.

  36. William1940

    My first programming language was Burroughs Extended ALGOL 60 on a B5000 in 1965. Burroughs always extended the basic language. The beauty of the Burroughs extensions is that their ALGOL 60 was wedded to the hardware design, and their compilers were all written in the language they compiled. At that time it was ALGOL, COBOL and ESPOL, which was a superset of ALGOL used to program the MCP. I did in fact program the MCP. In ESPOL one had wide-open access to the hardware itself.

    The B5000 hardware had an interesting limitation: code was generated into program segment strings which were limited to 1024 words. A word was 48 bits. Why? The program segment string S register was 10 bits. If, say, a BEGIN code END; block exceeded 1024 words, then the compiler yielded a "program segment string exceeds 1024" error. This of course nailed me in my first program. Easy enough to repair.

    The B5000/B6000 series also had virtual memory. Program segment strings need not be memory resident, and the 1024-word segments could be viewed as VM. Each program segment string terminated in a descriptor that referenced the program reference table (PRT). This referenced location could be a disk address to access the next segment. Etc.

    The B6000 series - I worked on the B6500 - generalized the PRT into what was called the cactus stack. This enabled the extended version of ALGOL 60 to have multi-dimensional arrays of tasks. Super feature. An array element had the parent node that could link to family member tasks. The first time I used the feature I had a "death in the family" error. This meant that a member of the family hierarchy had crashed.

    I can go on and on about this amazing hardware/ALGOL fusion. I have to say it's unforgettable, since I'm soon to be 80 years old. I've written software in some 20 or so languages during my career as a computer scientist, and Burroughs Extended ALGOL 60 is definitely the midwife of almost all of them.


  37. Guimar

    I wrote a Star Trek game in Algol 60 on a PDP-10

    I used Algol 60 when I was in high school in the '70s.

    It was a nice procedural language. Better than BASIC for writing large programs.

    The Algols eventually led to Ada.

    In college I was exposed to PL/M and C.

    C and C++ are where I spent most of my career coding.

  38. Ray Foulkes

    Algol was OK, but all wasn't plain sailing.

    I programmed a lot in Algol 60 between 1967 and 1971 at the University of Salford on the English Electric KDF9. I was doing research into computer-aided circuit design using Y matrices. When I started, paper tape was in use, created and edited on Friden Flexowriters. My program was fairly large for the time and made a relatively large roll of tape. There were two ways of changing the program - put the paper tape in the Flexowriter and set it to copy up to the line that you wanted to change, type in the changes, re-sync the rest of the tape and let it copy the rest, OR get out the little tape splicing machine and hack out a section, gluing your amendment in. Compiling and running was effectively twice per day, since you had to write out a job slip and put your tape in a box for the computer operators to load, compile and execute. Then - REVOLUTION - the KDF9 was front-ended with a PDP-8 running a system called COTAN (I faintly recall, from more than half a century ago). This had a disc where you could keep your program and edit it using KSR33 teleprinters (if you were patient). Then submit your job electronically, no less.

    Unfortunately the COTAN system had a sort-of anti-Algol design - it was based on 80-column card images. Students were allocated a certain (insufficient) maximum number of cards. Algol 60 was not Fortran, so my beautifully indented layout of the "begin"s and "end"s on the paper tape, which used quite a lot of blank characters, didn't fit in the allocated card space. Answer - eliminate most of the comments and the blanks. This resulted in a more or less "rectangular" program of solid text 80 characters wide - not that the Algol compilers cared, but the very devil to read, edit, and debug.

    I still have some printouts (somewhere) of the resulting mess together with some of the little boxes originally used to submit the paper tape jobs.

    I don't hunger after the "old days" of computing.

  39. Greum

    I remember learning ALGOL in the late 60s, although I never used it for a real-life task. The only thing I can recall was that, as well as regular quote marks, there was a pair of very large opening and closing quotes which our tutor referred to as George Woodcocks, after a trade union leader who had extremely bushy eyebrows.

  40. Ian Joyner Bronze badge

    Andy Herbert

    I met Andy Herbert a few times at ISO Open Distributed Processing (ODP) meetings. I doubt he’d remember me.

  41. Ian Joyner Bronze badge

    A lot of people program in ALGOL 60

    There is a lot of programming done in ALGOL 60 on Burroughs machines (now Unisys MCP machines). Burroughs extended the language with IO, some of it an extension of FORTRAN-style IO, but most programmers did their own direct formatting, since the FORTRAN IO was interpreted and slow.

    I believe Burroughs ALGOL was based on Elliott ALGOL. I later worked with another Burroughs guy who developed a language based on Elliott ALGOL for the Apple II. This was at a company called Netcomm in Australia.

    Actually Don Knuth wrote one of the first Burroughs ALGOL compilers on a summer break as a student. It was on the B200 (from memory), but it predated the B5000 ALGOL compiler.

    Burroughs ALGOL is a really heavy-duty systems language that makes C look like the toy that it is. It is a shame that C has effectively killed language development. If ALGOL or its next generation (ALGOL-68, ALGOL-W, Pascal, CPL, etc) had continued, we’d probably have pretty solid languages by now, rather than the rather flimsy C.

  42. Ian Joyner Bronze badge

    Not a language of card and paper tape

    While the videos sort of tie ALGOL to paper tape and cards, ALGOL is far in advance of these long-gone technologies.

    Burroughs systems were much more disk-based, like modern systems. When Edsger Dijkstra visited the plant in Pasadena he was amazed that his ALGOL program compiled in seconds. He wanted several Burroughs systems for Europe, but Ray Macdonald, then CEO of Burroughs, did not want to sell systems to Europe at that time.

  43. Ian Joyner Bronze badge


    Burroughs ALGOL originally implemented define … # from a suggestion by Donald Knuth. This was adopted in C as #define (with # introducing the define rather than terminating it). Like many copied things in this industry, the original is usually better, and Burroughs defines are much better than C defines.

  44. spold Silver badge


    I was just going through some old boxes and found my "A course on programming in Algol 60" by Reeves and Wells! (1970 print edition). I wrote Algol code (on paper-tape) for an ICL 1902 - second programming language I learned! ... the first being Basic, ok a smattering of Cecil (the ICL was old and had been donated to my school by United Bread - now toast).

  45. Shindles


    Having done my degree at Woolwich Polytechnic (Tommy Flowers was there long before me!) from 1965 onwards, while working as a rocket scientist at the MoD, I was one of the early programmers who never actually went into the computer room where the program was run. I simply submitted my Algol programs on coding sheets, which would be run and sent back. I did occasionally see the computer through a window!

    I then taught Algol at Braintree college on an Elliott 803.

    Nothing compares with MIRFAC, though, probably the most sophisticated language ever, running on the MoD's COSMOS computer. Simon Lavington reviews this machine.

  46. Shindles

    Re: Facebook simply would not exist today if not for Bletchley Park

    Apart from Colossus Bletchley Park had nothing to do with electronic digital computers. All data analysis was done on Hollerith-type card machines.

    Actually Britain was far ahead of the US in developing commercial computers. Just look at the history of LEO. Britain also pioneered practical computers such as EDSAC, which is the forerunner of all today's computers and can be seen at The National Museum of Computing, located on the Bletchley Park Estate. Spend 3 hours there and see the Harwell Dekatron - the world's oldest original working computer.

    While there see a working Enigma/Bombe and the Lorenz/Colossus.

    My advice is to see TNMoC first, then pop down to Bletchley Park for a brief tour to see the Mansion and other interesting facts about their role in the Second World War.

  47. oldcoder

    Computers got MUCH easier to use after DEC created some timesharing systems.

    Even the old PDP-8 could do timesharing (as primitive as it was).

    And CP/M followed their design.
