RIP: Software design pioneer and Pascal creator Niklaus Wirth

Swiss computer scientist Professor Niklaus Wirth died on New Year's Day, roughly six weeks before what would have been his 90th birthday. Wirth is justly celebrated as the creator of the Pascal programming language, but that was only one step in a series of important languages and research projects. Both asteroid 21655 and a …

  1. stiine Silver badge
    Pint

    Life's no longer Wirth living...

  2. Gene Cash Silver badge

    Good research!

    I thought I knew a lot about him, but I discovered a ton of stuff I didn't know in this article.

    Thanks!

    1. Liam Proven (Written by Reg staff) Silver badge

      Re: Good research!

      [Author here]

      > I discovered a ton of stuff I didn't know in this article.

      That was my plan. I am glad it worked out. Thank you!

  3. UCAP Silver badge

    Pascal was the first programming language I learnt at University, and while I have not programmed in it since (nearly 40 years!) it still has my highest regard for its ease of use.

    RIP Niklaus Wirth. You've finally hit the END statement.

    1. HuBo Silver badge
      Headmaster

      Same here (after machine and assembly), with an Apple IIe (and some UCSD p-code/Pascal card) at home, and dumb terminals into a PDP 11 at school. What I love though (beyond the clarity of the structured programming approach fostered by Wirth) is the language. For example, in his 1995 "A Plea for Lean Software" (linked in Liam's very nice and extensive obit), one reads:

      "Software's girth has surpassed its functionality"

      Wow! Great prose from a great mind -- he'll be greatly missed!

    2. ChrisC Silver badge

      I was in the last year to be taught Pascal at uni, and at the time I was a bit miffed at missing out on any formal C teaching (knowing even then how much more useful that would be for my career to come). Some years later, however, when I got my hands on whichever version of Delphi first made it onto a coverdisc and realised just how insanely good it was for creating desktop tools compared with the versions of VC++ and VB I'd dabbled in up till then, I became rather more thankful for having been given that grounding in the underlying language. And despite the best efforts of VC# to displace it, I still find myself using it from time to time even now.

      So whilst Pascal is the only one of his creations I've had any hands-on exposure to, it ended up having such a large impact on my life that I genuinely feel saddened at reading this news.

    3. Michael Wojcik Silver badge

      When I started with Pascal — using Turbo Pascal 3.0 and Borland's great Turbo Tutor book — I'd already written programs in BASIC, COBOL, and assembly for a handful of ISAs. I'd looked at samples of code in a number of other languages, including LISP, FORTRAN, PL/I, and even an inscrutable APL example.[1]

      Pascal was an epiphany. Oh, yes, of course that's how a programming language ought to work.

      Not many years later I had something of a similar reaction when I learned LISP properly, and again with OCaml. But never as profoundly as when moving from a primarily-BASIC mindset to Pascal.

      During my CS degree one of my projects was working on a Modula-2 compiler, so I got to play with that language a bit as well. Never used Oberon.

      [1] Many years later I got an interactive APL environment and learned the language well enough to write some small programs that actually did something useful. It's still inscrutable. I look at screenshots of code I wrote and I can't remember what half the operators do, or why in the world they'd do that.

      1. fg_swe Silver badge

        LISA, HP MPE

        Both highly influential OSs, both implemented in a Pascal flavour!

      2. TotallyInfo

        APL: The write-only language!

        "Many years later I got an interactive APL environment and learned the language well enough to write some small programs that actually did something useful. It's still inscrutable. I look at screenshots of code I wrote and I can't remember what half the operators do, or why in the world they'd do that."

        I really wouldn't worry about that. At one point I wrote a lot of APL and if you could work out what you'd done a month later you were doing really well! :-)

        But an amazing language if you needed some complex multi-dimensional/matrix math.

        The only language I've come across where you wouldn't be concerned about focusing on a single line of code for a week. And a really complex program might only be a few lines (of utterly impenetrable) code.

    4. An_Old_Dog Silver badge
      Headmaster

      Being Picky

      Hmm ... the last END statement in a correct Pascal program must be followed by a full stop ('.')
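
      To make the pedantry concrete, here is a minimal complete program in the Turbo/Free Pascal style (my own illustrative sketch, nothing to do with Jensen and Wirth's examples):

        program Farewell;
        begin
          writeln('RIP, Professor Wirth')
        end.   { the final END takes a full stop, not a semicolon }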

      I wonder what Kathleen Jensen and Urs Ammann have been up to these past years (as I understand it, they were grad students working under him). Ms. Jensen was the co-author with Professor Wirth on Revised Pascal User Manual and Report, and wrote some of the runtime routines for Pascal-6000 (which ran on CDC's 6X00-series supercomputers). Mr. Ammann coded a Pascal compiler, which was written in (a subset of) Standard Pascal. I was not impressed with his one-character indentations, but he was working with a 72-column limit (columns 73-80 were used for card sequence numbers).

    5. itzumee
      Pint

      Pascal Pedant here

      You mean he finally hit the END. statement

  4. MacroRodent

    A lovely obit

    Thanks!

    Pascal was the second programming language I encountered, after BASIC. The introductory programming course at HUT in 1981 used Wirth's "Algorithms+Data structures = Programs" as the text book.

    I also dabbled at one point with Modula-2 (but not as much as a more clever friend of mine, who used it to implement a universal diskette reader for a CP/M machine he had built from a kit).

    1. MiguelC Silver badge

      Re: A lovely obit

      I had a course subject entirely based on that book. Unsurprisingly, the course was called "Algorithms & Data Structures"

      I also got my only 100% mark on a project, one written in Pascal - although the teacher admonished me for my snarky comments on the code (like a header comment stating the code had been written for the admiration of others... ah, the delusions of youth)

      R.I.P. Niklaus Wirth, you were a luminary

    2. cnsnnts
      Angel

      Re: A lovely obit

      > "Algorithms+Data structures = Programs"

      By the time I picked up Wirth's text book I had already read a number of books about programming. What immediately struck me about "Algorithms+Data structures = Programs" was how next to the description of shell-sort was the calculus to determine the efficiency of the algorithm, similarly for quick sort and so on. I now realised that this wasn't just a computer book but a Computer Science book.

  5. Jay 2

    Ah Modula-2, that was the base language for the start of my degree back in 1992. On pretty much the first day one of the lecturers stood up and said "We could teach you something more commercially viable like Ada, but we're not into that". The next year the base language was indeed changed to Ada...

    If I mention Modula-2 then I usually refer to it as "Son of Pascal" to give people an idea of what it is, as they may not have heard of it. Though to be honest back then I had a hard time figuring out the move from BASIC/COMAL on a BBC to Modula-2 (and all the other languages we had to mess with; Ada, Lisp, C to name a few) on a Sun box, which probably explains why I'm a sys admin now! Though I think after all these years I have figured out procedures, functions and libraries!

  6. cornetman Silver badge

    > contributing a strong "Closing Word" to the November 1968 Algol Bulletin 29

    As I read that interesting and insightful document, I could only think of C++ and the impenetrable monster it has become.

  7. This post has been deleted by its author

  8. Androgynous Cow Herd

    USCD Berkeley?

    What the hell is that?

    It's "UC Berkeley"

    or "University of California, Berkeley"

    or, colloquially, "Cal"

    If you're talking about the place that put the "B" in BSD Unix

    USC is "University of Southern California" and it is far away in Orange County

    Colloquially, "University of Spoiled Children" for it's student body predominately made up of Buffys, Tanners and Chads.

    GO BEARS!

    1. Rob Menke

      Re: USCD Berkeley?

      Methinks the author got confused. Perhaps he mixed up Cal Berkeley with UCSD Pascal, which was an implementation of the language from the University of California, San Diego.

      1. ldo Silver badge

        Re: USCD

        UCSD Pascal was more than just a language dialect, it was the core of the “p-system”, which tried to abstract away from the plethora of incompatible machine architectures by having all compiled code run on an abstract “p-machine”. So just about the only actual machine code that had to be written for a new architecture was the p-machine interpreter. And maybe some very-low-level drivers. Then everything written in UCSD Pascal (including most of the OS) would port straight across.

        This was the first popular implementation of the “write once, run anywhere” concept. And yes, it paid the price in speed of code execution. But some people felt it was worth it.

        1. Michael Wojcik Silver badge

          Re: USCD

          p-system was an early interpreted-intermediate-representation implementation. I'm not sure it was first. UCSD Pascal came out in 1977, which is a year after Micro Focus was founded, and I thought the MF COBOL INT format went back to their first compiler implementation.

          IBM's System/38 was commercially released a year later, and it used an intermediate-representation abstraction as well; when that was carried over into the AS/400, it permitted architecture-independent binaries which enabled the 400's move from a CISC to a RISC ISA. And, of course, the S/360 was founded on the idea of keeping the same ISA across different hardware implementations, though that was "run anywhere as long as that anywhere is some sort of System/360".

          But for all of that, p-system may well have been the first attempt at a truly architecture-neutral compiled format with the intent that it be executable on many target systems.

          1. C R Mudgeon

            Re: USCD

            "p-system may well have been the first attempt at a truly architecture-neutral compiled format with the intent that it be executable on many target systems."

            Is BCPL's O-code similar enough to count? That dates back to 1967 IIUC.

            1. prandeamus

              Re: USCD

              Is BCPL's O-code similar enough to count?

              Close but no cigar. Ocode successfully separated the front ends of the compiler (lexical, syntax analysis, and all that) from the code generator. The front ends outputted Ocode, and up to that point pretty much all BCPL compilers were the same. Some variations existed, e.g. the Acorn BCPL compiler was structured as a set of overlays or whatever because of the extreme RAM limit on a BBC Model B. The Ocode was fed to the code generator stage, which could vary wildly between implementations. So it was not originally conceived as a run-time environment.

              Moving on a little, M. Richards did produce INTCODE, a simple virtual machine designed for interpretation, to help with bootstrapping. You could cross-compile BCPL to OCODE, have the code generator output INTCODE, and then write a simple but slow INTCODE interpreter on the target machine. I think the idea there was to be "just good enough" to bring up a compiler on the target machine for which you could write a native code generator. INTCODE was a bootstrap aid, not an execution environment.

              The next iteration was CINTCODE, also designed for interpretation but aimed at microprocessors where address space was at a premium (C = Compact). So in the Acorn implementation of BCPL for the BBC Micro, the code generator takes OCODE as input and CINTCODE as output, and the CINTCODE was interpreted by code in the *BCPL language ROM. CINTCODE implementations did permit escapes to native compiled code for those time-critical bits. The BBC Micro BCPL was created at Richards Computer Product, run by John Richards, lovely fellow. M. Richards continued to develop this stuff further in academia so there's probably a divergence between the two over time.

              That said, I think CINTCODE is roughly contemporary with UCSD p System.

              I'm an old man now, forgive my slips of memory.

          2. An_Old_Dog Silver badge

            Re: USCD

            Let's also remember the Western Digital Pascal MicroEngine (yes, that Western Digital. WD makes only hard drives now, but used to make video cards and other things).

            The MicroEngine was a CPU whose machine language was UCSD p-Code.

      2. ldo Silver badge

        Re: USCD

        Also I should add, UCSD Pascal was very influential in the up-and-coming PC world. Borland’s Turbo Pascal, for example, could be seen as a UCSD dialect.

    2. Liam Proven (Written by Reg staff) Silver badge

      Re: USCD Berkeley?

      [Author here]

      Whoops!

      My bad.

    3. Anonymous Coward
      Anonymous Coward

      Re: USCD Berkeley?

      A typo is a typo ... (I expect no conspiratorial ill-intent on the part of the author there ...) ... USCD, UCSD, tomato, ...

      1. IvyKing Bronze badge
        Flame

        Re: USCD Berkeley?

        Except that UCSD (UC San Diego) is emphatically not UCB (UC Berkeley or UC Bezerkeley), which is often referred to as "Cal". Don't get me started on the "Cali" BS.

        N.B. I am a Cal grad, though I know UCD, UCI, UCLA, UCSB, UCSC, and UCSD grads, not sure about UCSF or UCM.

        1. Liam Proven (Written by Reg staff) Silver badge

          Re: USCD Berkeley?

          [Author here]

          > I am a Cal grad

          This is part of the thing. I have never been to California in my now more than half a century of life, nor any part of the West Coast of North America.

          To me, these are all just abbreviations, or mispronounced names, or worse still, abbreviations of mispronounced names. (To me, it is BAR-kley, but as I understand it, to Americans, it is BER-kley.) It is sometimes hard to keep them all straight, whereas it is obvious to me how to pronounce Worcester and Gloucester and Leicester, and if you are Californian, I suspect you might get those wrong.

          I apologise for my error. I fear it was some kind of finger-based autocomplete based on somewhere on another continent, many thousands of kilometres away, that I've never seen, never been to, and could not point to on a map. They're all just acronyms to me: UCSD, BSD, UCLA. I do not have the foggiest where Berkeley is in relation to San Diego or San Francisco or Los Angeles or Su Madre. It would be much easier to me to find Zürich on a map than Berkeley: I've been to Zürich several times, and rather like the place.

    4. C R Mudgeon

      Re: USCD Berkeley?

      As an aside:

      "University of California, Berkeley" or, colloquially, "Cal"

      Is that true only in a local enough area to make the abbreviation unambiguous? Are any of the other UC campuses known locally as "Cal"?

      Or does Berkeley have that nickname state-wide?

      And is Berkeley ever "UCB"? I don't think I've seen that, even though many (all?) of the other campuses are known as "UCx".

      1. Dave559

        Re: USCD Berkeley?

        UCB certainly exists (or existed), at least inasmuch as Sun put Berkeley-originated programs into /usr/ucb (page stupidly requires JavaScript to even load, but that's sadly Oracle for you [1]).

        I'm on the wrong side of the pond to be able to say whether Berkeley is ever referred to as UCB in the physical world, however…

        [1] On the other hand, the page author includes a nice little easter egg with a link to this "The Joy of UNIX" button badge, with a SUN logo on it. :)

  9. Jumbotron64

    Sad news indeed. I never became much of a programmer. After my first experience with programming in 1982 (punch-card coding Fortran and taking my stacks to the university VAX mainframe for running and output), and then giving BASIC a go on my Commodore 64, I pretty much gave up on it. But I always wanted to give it another go. I decided in my elderly years to try again and went over to Rosetta Code (rosettacode.org) to look at code examples of various languages.

    It became clear to me after about 15 minutes of looking at various code examples that, to my eyes and my mind, there were certain languages that just naturally "looked" better and read better. That is to say that although I have a way to go to finally get my head wrapped around code logic, the way these certain languages presented said logic and the syntax used just made things "click" in my head. I could "see" more intuitively what the programmer had in mind and where he/she was going with it.

    And those languages were...Ada, Delphi, Lua and Pascal. So I decided to give each a go with an IDE if available, or failing that a RAD framework. Delphi was out quickly because of cost. Lua is nice but not really any kind of RAD framework outside of gaming and it's an interpreted language best for scripting....no problem with that. That left Ada and Pascal. Once you learn one, the other is a piece of cake.

    Finally settled on Pascal through Free Pascal using the Lazarus IDE. Wish I had this as my introduction to programming back in 1982. But BASIC, Fortran and COBOL were the menu back at my small university.

    Thank you Niklaus Wirth. As an American I know both your name AND your worth. And you are worthy indeed.

    1. PRR Silver badge

      > Delphi was out quickly because of cost.

      Boogle "delphi free"

      Embarcadero lets "community" use their Delphi CE for free, up to some income point that you may not be aspiring to.

      Borland Delphi 7.0 (2002) is abandonware at WinWorldPC.com; 'License' (no-nag) codes are down the page.

      There seem to be free-beer imitations but my Googl-Foo is sore.

      1. katrinab Silver badge

        Lazarus is the free Delphi clone, still in active development.

      2. deltics2

        In addition to the terms of use restrictions, Delphi CE is a different SKU than any other edition; it is not like Visual Studio Community Edition, which is identical to the paid-for edition (or was; it has been a long time since I had cause to use it so I don't know for sure these days).

        If you can stomach wading through the "feature matrix" (28 pages, listing things that really shouldn't count as "features" in 2024) you will find some perplexing omissions in the "Community Edition".

        Bizarrely, code formatting tools are missing in the CE SKU but most ironic of all, given that the Terms of Use are especially compatible with developers working on free (and most likely, therefore, open source) projects, most (if not all) of the VCL source code is also not present. Aside from the irony, the VCL source is a primary learning resource for anyone new to the framework, not to mention incredibly useful when tracking down elusive bugs to identify whether the problem is in the VCL rather than your own code (which happened to me more than once in my many years as a career Delphi dev).

        And AFAIK, even with the CE you still have to contend with the Embarcadero license infrastructure which appears to be a constant thorn in the side of the (dwindling) Delphi community.

        1. david 12 Silver badge

          The Delphi library code was never open source -- and it inherited that from Turbo Pascal. Even for DOS, the library source code came with the more expensive Professional or Enterprise version.

  10. Vincent Manis

    Wirth, ALGOL68, and the Meta key

    A couple of observations. First, ALGOL 68 did not succeed, but its complexity has been overstated. Partly due to the horrendously incomprehensible two-level grammars of the Report, and partly due to language features that were not yet understood (parallelism and semaphores, among others), it got a reputation as unimplementable, even though there were almost-complete subsets built in the 1970s, and ALGOL68 Genie flourishes to this day. Wirth and Hoare had very good reasons for rejecting it, but I would argue that modern C++ is at least as complex as ALGOL68, if not more so.

    Second, during his California stay, if not afterwards, Wirth had the nickname `Bucky'. At one point he suggested an Edit key to set the 8th bit of an ASCII character on a keyboard. This was the basis for the Meta, Super, and Hyper keys of various Stanford and MIT keyboards, and for the Alt, system logo, Command, and Option keys of modern keyboards. The bits these keys set are known as `bucky bits' to this day.

    1. Anonymous Coward
      Anonymous Coward

      Re: Wirth, ALGOL68, and the Meta key

      > Wirth and Hoare had very good reasons for rejecting it, but I would argue that modern C++ is at least as complex as ALGOL68, if not more so.

      Q.E.D.

      1. Michael Wojcik Silver badge

        Re: Wirth, ALGOL68, and the Meta key

        But where would we be without template metaprogramming and Koenig lookup and half a dozen kinds of smart pointers? You don't want to make programming easy, do you?

        (Actually, I don't mind C++, when the source code I'm dealing with is well-written. The problem is that IME it almost never is. Generally it's a ghastly, unreadable mess that often misses several much better and idiomatic C++ ways to accomplish basic tasks — partially because the language is Too Damn Big and most practitioners never seem to have studied how to write good C++ code.)

  11. ldo Silver badge

    ALGOL 68 Blew My Mind

    Picture this: first-year computer science student, encountering the resources of a University library for the first time. Picked up this “Revised Report on ALGOL 68”. Already knew something about BNF, and about its limitations (like being unable to say that “an identifier being used in an expression must already have been declared in some way compatible with its use”). And discovered that somebody had worked out a solution to these very limitations!

    And then later discovered that most people had no interest in using van Wijngaarden grammars (or attribute grammars — I used to annoy one CS lecturer by continually getting the two mixed up) as part of any later language definition. I was also doing physics courses at the same time, and imagine if some physicist had said “quantum theory is too hard, let’s forget about it and go back to doing things the good old-fashioned Newtonian way”.

    1. PauloTos

      Re: ALGOL 68 Blew My Mind

      Funny you should say...

      At Manchester Poly in 1983ish a fellow student wanted to write a poem in Algol68C but he'd only got to one stanza:

      WHILE not logical world end
      DO
        blow my mind
      OD

      Still appeals to me.

      1. ldo Silver badge

        Re: ALGOL 68 Blew My Mind

        If I wanted some light relief among my Comp Sci reading, I would go look at issues of “SIGPLAN Notices”. This was from the ACM Special Interest Group on Programming Languages. They tended to be a little less formal than ... some of the other groups.

        Some writers did not like the Algol 68 convention of writing words backwards for closing statement brackets, and expressed their opinions (entirely subjective, of course) in creative ways. One article I remember had the title “do Considered od Considered Odder Than do Considered ob”. The body was a single sentence along the lines of “this convention of reversing words should be carried out to the letter, so to speak.”

        Another reaction took the form

        comment Blecchh! tnemmoc

        1. MacroRodent

          Re: ALGOL 68 Blew My Mind

          > Some writers did not like the Algol 68 convention of writing words backwards for closing statement brackets,

          The original unix sh (shell) and its clones like Bash still do that. "if" is terminated by "fi", and "case" by "esac" (but curiously, "do" ends with "done", not "od"). Wonder if the style came from Algol 68? It was probably a hot topic around the time the sh was designed.

          1. ldo Silver badge

            Re: ALGOL 68 Blew My Mind

            Yup, those keywords would have been copied from Algol 68. Also I think the “long” and “short” prefixes for different sizes of numeric type were copied by C. As was the “void” type.

            Quite a lot of terminology, near as I can tell, seemed to originate with Algol 68. Some of it became popular, others did not. E.g.

            * “coercion” for implicit type conversion from the actual type of an expression to the expected type in its context

            * “deproceduring” for the act of calling a procedure/function and replacing the call in the expression with the returned value

            * “elaboration” for the process of converting the program source to some target executable form and executing it

            * “heap” for a memory region from which allocation is done dynamically

            * “mode” for what everybody else called a “type”

            * “name” for what everybody else called a “pointer” or “reference” or “address”

            * “overloading” — having different meanings for the same operator, chosen according to context

            * “transput” being their name for both “input” and “output”, or “I/O” as we would say

            * “voiding” for discarding the result of an expression

            How many do you recognize?

            1. Michael Wojcik Silver badge

              Re: ALGOL 68 Blew My Mind

              > “name” for what everybody else called a “pointer” or “reference” or “address”

              Except that an ALGOL-68 "name" is not the same thing as a "reference" in other languages. In particular, ALGOL's call-by-name has rather different semantics than call-by-reference, and is now justifiably obscure. And ALGOL-68 had call-by-name and call-by-reference (and call-by-value), so the two have to be kept distinct even in the context of ALGOL programs.

              At least that's my understanding. I played around a very little with ALGOL; I've never used it in any significant way.

              1. ldo Silver badge

                Re: ALGOL 68 Blew My Mind

                No, Algol 68’s “name” had nothing to do with what Algol 60 described as “call by name”. This is just adding to the confusion. The meaning was what I said it was.

                Algol 68 did, I think, have a kind of equivalent of “call by name”. They used a different term for it: “proceduring”.

              2. ldo Silver badge

                Re: ALGOL 68 Blew My Mind

                Correction to above: the “proceduring” idea may have been present in earlier versions of the Algol 68 spec, but it is not in the final “Revised Report”.

              3. gnasher729 Silver badge

                Re: ALGOL 68 Blew My Mind

                “Call by name” is back in Swift as “autoclosure” arguments. Instead of passing a value, the caller passes a closure that the callee can evaluate.

                How it is used was probably not what Algol68 expected. It allows || and && to be part of the standard library instead of the language, and the same goes for asserts or logging statements.

          2. Ciaran McHale

            Re: ALGOL 68 Blew My Mind

            Donald Knuth wrote an interesting-but-dated (and hence of historical significance) book called "Literate Programming". The book was a collection of essays, some of which had been written a few years before I did my undergraduate degree in university. One of those "just before my time" essays discussed competing proposals for the syntax of looping constructs, such as while loops, do-while loops, repeat-until loops, and a few more. Some of these variations I remembered being taught in university, while other ones seemed strange because I had never encountered them before, but reading about them made me realise that much of what we take for granted in programming might seem obvious in hindsight but had been hotly contested when first proposed.

          3. coconuthead

            Re: ALGOL 68 Blew My Mind

            "do" didn’t end with "od" because "od" was the octal dump utility. Fun fact – it’s still there on macOS Sonoma! (Although I’ve always used "hexdump".)

            1. Michael Wojcik Silver badge

              Re: ALGOL 68 Blew My Mind

              od is indeed required by the Single UNIX Specification (the successor to POSIX and XPG). The SUS Rationale says "The od utility has been available on all historical implementations", though it's not clear what they mean by "all historical implementations". Certainly versions (with certain differences) were present in both BSD 4.x and SysV.

      2. Dr Paul Taylor

        od ... do

        In Slovene (and probably some other Slavic languages) "od" means "from" and "do" means "to".

        1. Martin an gof Silver badge

          Re: od ... do

          And in Welsh, for (English word) "now", northern Welsh dialects use "rwan" while southern ones use "nawr". Never heard a satisfactory explanation of that other than it being bloody-mindedness on the part of the Gogs (northern types) or the Hwntws (them over there), depending on your own point of view.

          M.

        2. HumPet

          Re: od ... do

          Which caused us some trouble when migrating a DB from Oracle to PostgreSQL. Oracle had no problem with "od" and "do" columns, however, in PostgreSQL, "do" was a keyword. So we had to perform some column renaming (we didn't want to use quotes, which were one way around the keyword).

  12. ldo Silver badge

    Semicolon Wars

    Here’s an old controversy from the time that nobody seems to have mentioned (yet): should semicolons be statement separators, or statement terminators? The Algol tradition was that they were separators that went between statements, which meant you should omit the semicolon if there was no statement following (remember that a closing bracket symbol like end is not a statement in itself).

    However, this rule was loosened a bit in many Algol-like languages (including Pascal) by admitting that a “null statement” (consisting of no symbols at all) could also be a statement. This allowed you to put in a semicolon before that end, by simply pretending there was a null statement in-between.

    Ada went further, by explicitly making semicolons into statement terminators, and making them compulsory after every statement. Algol 68 went the other way, by prohibiting null statements: you had to put in an explicit symbol like skip to denote a no-op statement.
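
    To make the separator-versus-terminator point concrete, a small runnable sketch in Pascal (my own illustration; a and b are just placeholder integers):

      program Separators;
      var
        a, b: integer;
      begin
        begin
          a := 1;
          b := 2        { separator style: nothing follows, so no semicolon }
        end;
        begin
          a := 1;
          b := 2;       { also legal: an empty "null statement" is deemed   }
        end;            { to sit between this semicolon and the end         }
        writeln(a + b)
      end.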

    1. Dagg Silver badge

      Re: Semicolon Wars

      The one issue with the use of semicolons in Algol was the classic dangling else problem

      https://en.wikipedia.org/wiki/Dangling_else

      1. ldo Silver badge

        Re: dangling else problem

        That was easy to solve. It was just a bug in the BNF grammar for Algol 60, which was fixed in later languages, including Pascal.

        1. MarkMLl

          Re: dangling else problem

          "fixed in later languages, including Pascal"

          No it wasn't. Wirth was in a rush: he based his earliest Pascal compiler on ALGOL W, which used recursive ascent (rather than descent), and the changes required would have taken more time than he thought was available.

          In practice, because ALGOL-68 was delayed, he could have slowed down a bit and done the job properly.

          Apropos the semicolon, being a /separator/ rather than /terminator/ it couldn't appear before an ELSE (and some Pascal implementations were picky if it appeared before e.g. UNTIL).

          But after various people (UCSD, Borland) had "improved" the language, we've ended up with a mixture of structures which handle single statements (e.g. if-then-else) and structures which require END (try-finally-end).

          The comparative lack of success of Modula-2, which tidied this stuff up, is unfortunate. But Wirth no longer had Kathleen Jensen to help him make it intelligible.

          1. ldo Silver badge

            Re: dangling else problem

            Maybe you don’t understand what “dangling else” meant. It meant that, according to the original Algol-60 grammar, a construct like

                if A then if B then C else D

            is ambiguous, because the else could be attached to either the first or the second if...then.

            Once this ambiguity was discovered, it was realized that the most reasonable interpretation was “an else shall attach to the last unpaired if”, and in later languages, everybody’s formal grammars were amended appropriately. This was the interpretation in Pascal, and I’m pretty sure the fix was present as early as Algol-W as well.
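
            To see both pairings side by side, here is a small runnable Pascal sketch (my own illustration, with the placeholder statements turned into writelns so it compiles):

              program Dangling;
              var
                A, B: boolean;
              begin
                A := true;
                B := false;
                { else pairs with the *inner* if, per the "last unpaired if" rule: }
                if A then if B then writeln('C') else writeln('D');
                { a begin/end block forces it to pair with the outer if instead: }
                if A then
                  begin
                    if B then writeln('C')
                  end
                else
                  writeln('D')
              end.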

            1. MarkMLl

              Re: dangling else problem

              I am fully aware of what it means. The problem persisted in Pascal implementations: it is more than just a specification issue.

              It was fixed in Modula-2, which Wirth designed at his relative leisure after (I believe) switching to recursive descent, which is easier to maintain.

              It was fixed in Ada, where Wirth (and other authors of the ALGOL-68 Minority Report) served on the HOLWG.

              But most importantly it was fixed in ALGOL-68 which required an explicit FI etc., hence Wirth was aware that it was a significant issue before embarking on Pascal.

              1. ldo Silver badge

                Re: dangling else problem

                Compare:

                Wirth was aware that it was a significant issue before embarking on Pascal.

                with

                The problem persisted in Pascal implementations

                Do you see a slight ... incongruity there?

                1. MarkMLl

                  Re: dangling else problem

                  Yes, which is my point. Have you actually read the ALGOL W source with an eye to working out how to make a significant change in the syntax?

    2. MarcoV

      Re: Semicolon Wars

      Actually Modula-2 did that already. But it also changed the block structure in other (and IMHO better) ways to fix the dangling else.

  13. Michael Hoffmann Silver badge
    Pint

    Mentally going back through the labyrinthine course of my career (and life), I can safely say there were 2 things without which I probably wouldn't be where I am:

    1) finding a mentor who taught an eager but clueless kid assembly language (6502!)

    2) learning and using Pascal (Apple's UCSD implementation on the 16KB extension card), followed soon after by Modula-2

    Thank you and vale Dr Wirth!

    1. Jumbotron64

      Well said !

  14. ldo Silver badge

    Then And Now

    I have to quote this (long) bit from his Algol Bulletin article, because implicit in it is so much about the computing environment (both technical and economic) back in those days:

    ❝I am discouraged! I thought I had recognized the short-comings of the commercially available languages; they provide no guidance to programming discipline, and lack a logically coherent structure. Algol 60 had provided an answer and a solution. Unfortunately, it lacked some features which many practitioners badly need. It was relatively easy to remedy some defects and some defaults of Algol 60. We had implemented such an extension of Algol; it contains the data types long real, complex, and character strings, along with one other major extension. But guess how many requests we received for this system. None! Nobody is interested in a new language, particularly if it is not supported by the big manufacturers or has received the blessing of some standards committee. But I see no chances that either a big manufacturer or a committee will ever produce a language acceptable to our standards of clarity, simplicity, and rigor.❞

    Notice the assumption that the only sort of language implementation that might be worth using is a “commercially-available” one, with the backing of a “big” company behind it. There was no thought that an implementation with source code freely available could ever be considered “commercial”, either in usage or quality. And on top of that, every OS was different; there was nothing like the near-universality of POSIX-type OSes we enjoy today.

    1. anonymous boring coward Silver badge

      Re: Then And Now

      There were no home or privately owned computers in those days, of course.

      1. Michael Wojcik Silver badge

        Re: Then And Now

        <woody_allen>My aunt has one.</woody_allen>

      2. ldo Silver badge

        Re: Then And Now

        In the days before micros, there was some group trying to set up what they called the “Community Memory Project”. The idea, as I recall, was to get hold of some (relatively) cheap-and-cheerful little machine (e.g. a PDP-11) and let people come in and use it for free.

        1. Vincent Manis

          Re: Then And Now

          I visited the Community Memory storefront in Berkeley back in 1974. They had a Teletype connected to a remote Xerox Data Systems machine (possibly a Sigma 2 or 5), and it was running what we would now call BBS software. The CM folks wanted to use it as a tool for connecting community groups and individuals. I don't know what happened to the project.

      3. Roland6 Silver badge

        Re: Then And Now

        Additionally, there was no guarantee software written for a machine from one vendor would work on another machine from the same vendor; Unix (and C) really was revolutionary, as was the entire concept of “open systems” (with either a lower case or capital “O” and “S”).

  15. Bebu
    Windows

    Long shadows

    Coincidentally I had, only recently, cause to mention Niklaus Wirth and Oberon.

    The text "The School of Niklaus Wirth: The Art of Simplicity" 2000 covers a lot of his work (except the last 23 years :)

    I recall, while reading this book, being surprised that Oberon had just-in-time compilation quite some time before Java.

    Wirth's influence on computing is probably now so pervasive that we don't recognise it.

    1. kventin

      Re: Long shadows

      re "Oberon had just-in-time compilation quite some time before java": they called it slim binaries. beuatiful concept. iirc they realised cpu is so much faster than disk they can compile from pseudocode while loading program from disk?

      https://dl.acm.org/doi/pdf/10.1145/265563.265576

  16. anonymous boring coward Silver badge

    Unfortunately "Reiser's Law" is very live and well. Very.

    I'll have to check out Oberon.

    1. Dr Paul Taylor

      Reiser's Law

      Wikipedia calls it Wirth's Law but also gives several alternative expressions, such as "What Intel giveth, Microsoft taketh away".

      (Incidentally, Reiser was Martin Reiser, not Hans Reiser of the filesystem and murdered mail-order bride.)

      We had an article and discussion recently, maybe this one, about how dramatically computers improved between 1983 and 1993, but how little they had done since then. That is, in terms of utility, not hardware.

      I remember thinking, sometime during the 1990s, that we were probably then at the peak of usability of computers.

      1. Liam Proven (Written by Reg staff) Silver badge

        Re: Reiser's Law

        [Author here]

        > (Incidentally, Reiser was Martin Reiser, not Hans Reiser of the filesystem and murdered mail-order bride.)

        I really should have disambiguated and clarified that point.

  17. eric.verhulst(Altreonic)

    Great man, thinking things through

    Wirth and Hoare have been very inspirational in developing our Virtuoso RTOS. It's a pity that the software world took the opposite path of complexity. It's even amazing that our software-driven world hasn't collapsed yet. I remember being with Niklaus Wirth at one of the first conferences on embedded software. After a pretty clear presentation on garbage collection in Java, he stands up and says "Now I know why I shouldn't be using Java". It was the real conclusion that could be drawn from the presentation, but nobody was expecting it to be voiced so clearly.

  18. Puketapu

    Algorithms and Data Structures was the first computer book I read at University. Pascal was the second language I ever learned (BASIC was first, to my shame).

    Smart man

  19. roosterben

    Big loss to the IT community!

    I didn't know much at all about the creator, really enjoyed the article.

    Fond memories of learning Pascal as a teenager; per another Reg commenter above, it was my second language after Basic. Later I did a bit of Pascal at Uni, then used Delphi in the workplace for some COM programming. I think if the Microsoft juggernaut hadn't pushed Visual Basic so hard, Delphi might have been very popular, as it was a far superior language in terms of ease of understanding, performance, OO support and libraries.

  20. bregister

    fond memories 2

    Very sad news.

    I remember programming Modula-2 on an Atari ST.

    Bring back the ST!

  21. Locomotion69 Bronze badge

    Niklaus Wirth was the man who (indirectly) taught me how to program in a proper way.

    Thank you, professor Wirth - and RIP.

  22. Anonymous Coward
    Anonymous Coward

    "I pulled out my copy of the draft report on ALGOL-68 and showed it to her. She fainted."

    What was that prize for terrible erotic fiction called again?

  23. Scene it all

    That name/value joke is hilarious. Somewhere I have a copy of the Algol-68 report and it is a monster. They invented their own meta-language just to describe it, which made comprehending it even harder. (The computer does not "execute" a program, rather it "elaborates the definition".) The best part was the humorous quotations dropped into the highly technical descriptions. In the section on comments it has this quote from "The Mikado": "Merely corroborative detail, intended to lend artistic verisimilitude to an otherwise bald and unconvincing narrative."

    1. cpage

      Italic vs Roman full stops

      I saw somewhere (can't now find the reference) that the Algol-68 report made considerable use of roman vs italic text, and that to build a compiler from the specifications in the report you would have to be able to work out which full-stops (periods) were roman and which italic. That was, I'm sure, only one reason why it never took off.

      1. Roland6 Silver badge

        Re: Italic vs Roman full stops

        > only one reason why it never took off.

        I think another was that it was a language that really needed to be taught; I remember many of my fellow undergrads struggling to come to terms with the language.

        Pascal was a walk in the park and, as the success of Turbo Pascal demonstrated, was easily assimilated by hobbyists and those who had only previously encountered Basic.

        I suspect MS Visual Basic owes much to Pascal.

        1. Liam Proven (Written by Reg staff) Silver badge

          Re: Italic vs Roman full stops

          [Author here]

          > I suspect MS Visual Basic owes much to Pascal.

          Especially VB6.

          The author of TP, and lead of the Delphi project, was Anders Hejlsberg.

          https://en.wikipedia.org/wiki/Anders_Hejlsberg

          MS headhunted him and he led the VB6 project, the first version of VB with a full native-code compiler.

          And the last version of VB. Given a big budget, he went on to create J++ and then C#.

  24. skataf

    Turbo Pascal

    At the end of the day, Borland's Turbo Pascal is what made Pascal successful; it's still being used today as Delphi. When I first used it back in the day, it knocked my socks off. A built-in IDE and instant run. No separate compile and link steps.

    1. Frogfather

      Re: Turbo Pascal

      Yup, and all on a single 5.25" floppy disk!

  25. MarkMLl

    Wirth and ALGOL-68

    The thing that has long surprised me is the complete lack of support that Wirth appeared to get from his former boss at Stanford, John McCarthy. The quip about "knuthput" in the "Final Word" cited suggests that he felt that the entire Stanford department had become hostile to him.

    Not having McCarthy's support, and probably correctly feeling that he was too junior an academic to stand up to somebody of McCarthy's stature, he resigned from the committee in May 1968 ** and spent the next few months modifying his ALGOL W compiler to implement a new language for (I speculate) the Autumn postgrad intake to work with. As it was, the delay before an intelligible ALGOL-68 specification was available probably meant that he needn't have rushed.

    At my most charitable, I think that Wirth was too inexperienced to realise that he had to cultivate his former colleagues and explain what he was doing and why.

    Somewhat less charitably, accounts elsewhere suggest that McCarthy had lost enthusiasm for ALGOL by about 1965, when he had Stanford's Burroughs (i.e. ALGOL-based) mainframe replaced by an IBM.

    Perhaps cruelly, there's even a possibility that McCarthy intentionally sabotaged ALGOL-68 to remove a threat to LISP.

    ** https://www.tomandmaria.com/Tom/Writing/DijkstrasCrisis_LeidenDRAFT.pdf

    1. ldo Silver badge

      Re: Wirth and John McCarthy

      I have read elsewhere that McCarthy was an IBM man through and through. He had a key part in the deal that saw IBM donate a machine, free of charge, to Stanford. Then he basically sabotaged access to the Burroughs machine, which was very popular, making it essentially unusable so people would switch to the IBM machine.

      Though I wouldn’t have expected him to see ALGOL-type languages as a threat to LISP. In fact, wasn’t there a “LISP 2” project (abandoned before completion, admittedly) which tried to give an ALGOL-like syntax to LISP?

      1. MarkMLl

        Re: Wirth and John McCarthy

        According to Waychoff ** IBM had made a significant donation to Stanford in the early 1960s (i.e. pre-McCarthy) which they'd used to build a computer centre... and then put a Burroughs mainframe in it. Also according to the same narrative McCarthy was unhappy that Burroughs wouldn't give him a way round their memory protection so that he could take over memory management with LISP. It's very easy to interpret that as McCarthy wanting to get rid of Burroughs, with IBM prepared to bend over backwards to keep him sweet. (I'd note here that SLAC kept their Burroughs, and it contributed to the design of the Starwars-era S1 supercomputer which had an OS implemented in an extended Pascal.)

        There was, without any doubt whatsoever, an enormous amount going on behind the scenes which might never be disentangled. We know that Wirth and Dijkstra were friends, we know that Dijkstra was very public in his disdain for IBM, we know that Wirth spent at least one sabbatical on the US West Coast but there's little indication that he visited Stanford.

        So while it's very common to see accounts of there being a Europe vs USA battle centred around ALGOL/Pascal vs FORTRAN/COBOL, it might actually have been Dijkstra/Wirth vs IBM and, at a deeper level, IBM doing everything it could to disparage "The BUNCH": i.e. Burroughs et al.

        ** http://archive.computerhistory.org/resources/text/Knuth_Don_X4100/PDF_index/k-8-pdf/k-8-u2779-B5000-People.pdf which is Donald Knuth's copy, hence the curious annotation on the front page (which has never been elaborated, including in Knuth's oral history where he has nothing but praise for Burroughs). I believe that copy came from Bill Parker who was a Burroughs FE in the UK, and had trained on the next generation Burroughs mainframes in November '79; I suspect it got to Knuth via Warwick University which was another Burroughs site despite its proximity to an IBM campus.

  26. Ciaran McHale

    Wirth's insightful comments on Pascal's importance and shortcomings

    In the 1990s when I was a graduate student at university, the departmental library had a collection of technical journals and conference proceedings. My vague recollection is that the most interesting conference was a once-every-five-years one called "History of Programming Languages" and it contained papers written by the designers of programming languages. I found it fascinating to read such papers about various languages I knew, including Forth and Pascal. If my memory serves me correctly, the Pascal paper contained a few insightful comments, such as...

    Pascal did not invent the concept of structured programming, but it was the first widely popular language to support structured programming, and this popularity was largely due to the Pascal compiler (which compiled into bytecode) and the bytecode interpreter being small enough to fit into the limited memory available in computers of the time. The original intention was that the use of bytecode would serve as a stepping stone to help somebody port Pascal onto a new computer architecture, and it was expected that whoever was doing such a port would then implement a bytecode-to-machine-code translator to improve execution speed, but the bytecode interpreter ran fast enough that the bytecode-to-machine-code translator was rarely implemented.

    Wirth acknowledged that although Pascal had introduced many programmers to the concept of structured programming, the language contained some limitations/deficiencies that were addressed by newer programming languages, and hence he said (from memory): "Pascal set new standards that it could not live up to". That comment gave me an important insight into the "C is better than Pascal (or vice versa)" arguments of my classmates during my undergraduate days. More importantly, it gave me a useful mindset for doing a critical analysis of any new technology that comes along: acknowledge the ways in which it is better (perhaps 10x better) than predecessors, while being aware of the possibility that the new technology might contain significant shortcomings.

    I fell into the camp of not liking the programming languages that Niklaus Wirth invented, but that never stopped me from appreciating his significant contributions.

  27. Anonymous Coward 99

    Pascal wasn't just for purists, despite the implementations

    In the 80s I had many long & heated conversations about how pure Pascal was, and therefore couldn't be used for real-world stuff like bit twiddling where you knew word ordering - despite there being statements that these were intended so that Pascal was usable as a real-world control language, and not just abstract computing. This presaged the wars over Java/C# and the underlying metamachines/machine interface libraries by at least 20 years. When they got into dogma like "else statements are unnecessary in case switches, because all inputs should be known", us engineers left them to it.

    As a result, I mostly skipped from assembler to C, and then C++, while waiting for Ada to sweep the world...

    1. Crypto Monad Silver badge

      Re: Pascal wasn't just for purists, despite the implementations

      The entire original Macintosh OS was written in Pascal - that's a major real-world use.

      When writing applications in C, you had to convert C strings into Pascal strings before passing them to system calls.
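
      For anyone who never met them: a "Pascal string" in the UCSD/Turbo/Apple dialects carries its length in byte 0 rather than ending in a NUL, which is why the C code had to convert first. A rough sketch (illustrative only, Turbo-style string type):

        program StringDemo;
        var
          s: string[255];                 { byte 0 holds the length, bytes 1..255 the text }
        begin
          s := 'QuickDraw';
          writeln('length byte = ', ord(s[0]));   { prints 9; there is no terminating NUL }
          writeln('first char  = ', s[1])
        end.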

      1. ldo Silver badge

        Re: The entire original Macintosh OS was written in Pascal

        No it wasn’t. Large parts of the original Lisa OS may have been done in Pascal, but I don’t think anything that went in the Macintosh ROM came from Pascal code.

        It is instructive to refer back to the original BYTE Magazine special on the introduction of the Mac. There you will read things like how the QuickDraw graphics engine, originally written for the Lisa, had to be condensed from 160K of compiled Pascal code down to just 20K of tightly-optimized assembly-language code.

    2. Missing Semicolon Silver badge

      Re: Pascal wasn't just for purists, despite the implementations

      At one time, I was writing embedded Pascal for a Z80 system (A bulk cable reel tester). The runtime included a page switcher, so the "upper" EPROM could be switched to one of several installed. This was not automatic, so you had to *concentrate*.

      Having been too young to do Pascal (or perhaps, Pascal was for CompSci students, not EEng) I had to hastily read the manual (no Internet!) and convert from C & PL/M86 to Pascal.

    3. Bill 21

      Re: Pascal wasn't just for purists, despite the implementations

      In the 80's I was writing real realtime software with Oregon Pascal and pSOS. There was lots of bit-twiddling and endian stuff to talk to other systems; surprisingly little had to drop down to 68000 m/c due to the wonders of variant records. Worst bit was the linker only being able to handle the first eight characters of names, leading to ugly prefix warts, e.g. 4x15x1_S(omeName)
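
      For those who never used them, a variant record overlays two views of the same storage, which is the trick behind a lot of that byte twiddling. A rough sketch with made-up names (assumes Turbo/Oregon-style word and byte types and a 16-bit word):

        program Twiddle;
        type
          WordBytes = record              { two views of the same 16 bits }
            case boolean of
              true:  (asWord: word);
              false: (b0, b1: byte)
          end;
        var
          w: WordBytes;
        begin
          w.asWord := $1234;
          writeln(w.b0, ' ', w.b1)        { which byte holds $34 depends on the machine's endianness }
        end.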

      1. ButlerInstitute

        Re: Pascal wasn't just for purists, despite the implementations

        A very large amount of the work I did in the job I had from 85 to 2001 was written in Pascal. This was Oregon Pascal 2 which had a number of useful extensions to standard Pascal to allow the bit twiddling and turning off bits of checking, and getting addresses of arbitrary variables, and separate compilation, and default (I think called "otherwise") in case statements.

        The provided linker was a bit rubbish so we'd written our own (that was before my time so I never used the old one).

    4. fg_swe Silver badge

      HP MPE, Apple Lisa, Apple Macintosh

      All of them either successful or at least very influential systems realized in Pascal-like languages.

      Hewlett-Packard's MPE started out on mid-range computers, but rose to almost mainframe-class symmetric multiprocessor computing in the 80s and 90s. It was cancelled by HP management despite customers loving it for robustness and security. MPE ran on very powerful PA-RISC CPUs in the 90s, which were then faster than Intel products.

      Pascal was much more than just an "educational" language. It still has features which are superior to C, like number domains and arithmetic exceptions.
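
      A quick illustration of the sort of thing meant by "number domains", assuming Turbo/Free Pascal with range checking enabled; the names are mine:

        program Ranges;
        {$R+}                        { turn on run-time range checking }
        type
          Percent = 0..100;          { a subrange type: its own little number domain }
        var
          p: Percent;
        begin
          p := 100;
          p := p + 1                 { out of range: raises a run-time error instead }
        end.                         { of silently wrapping, as plain C ints would   }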

    5. Anonymous Coward
      Anonymous Coward

      Re: Pascal wasn't just for purists, despite the implementations

      As a placement year student at the end of the 1980s I wrote Pascal code that ran on VMS systems for the control room operations system for Eastern Electricity's power grid (North London, Suffolk, Essex); the system had a projected 20+ year service lifetime.

      A large scale solution developed by an IT company that EDS shortly afterwards bought to establish their UK presence.

      1. munnoch Silver badge

        Re: Pascal wasn't just for purists, despite the implementations

        My industrial placement in 1987 was at FMC doing VMS Pascal for a process control system in their engine plant. Write the code, put it on mag tape, drive to plant, install... The other programmers were all engineering graduates so as a Comp Sci student I was a God amongst them... That was a great summer.

    6. itzumee

      Re: Pascal wasn't just for purists, despite the implementations

      The entire back-end of London Underground's ticketing system that was rolled out in the late 1980s was written in VAX Pascal running on VAX/VMS minicomputers that talked to the PDP-11 minicomputers at each station, which in turn talked to the ticket machines at said station. VAX Pascal's language extensions meant that anything you could do in C (on VMS) you could do in VAX Pascal.

  28. Bitsminer Silver badge

    Algol-68

    I was at a seminar at the Uni sometime in the mid-70s or so. A visiting prof, don't remember who, wrote on the blackboard:

    A := B * C;

    And stated: "This statement requires nine run-time checks."

    That was the end of any interest in Algol 68.

    1. Pete Sdev Bronze badge

      Re: Algol-68

      Hmm, slightly sceptical, though not familiar with Algol68, only Pascal and derivatives.

      The *compile-time* checks would be:

      - A, B, and C have been declared

      - A, B, and C are numeric types

      - if B or C is floating point (real), A must be floating point too

      - the type of A should not be smaller than that of B or C.

      The only run-time check would be for the overflow flag (sketch below).
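
      (Not Algol 68, of course, but by way of comparison: a minimal Pascal sketch of the sort of check that only the run time can perform once the compile-time checks have passed. Turbo/Free Pascal-style {$R+}/{$Q+} directives are assumed, and names and values are invented.)

      program RangeDemo;
      {$R+}                    { enable run-time range checking }
      {$Q+}                    { enable run-time overflow checking }
      type
        Small = 0..100;        { a subrange: the compiler knows the bounds }
      var
        A: Small;
        B, C: Integer;
      begin
        B := 20;
        C := 30;
        A := B * C;            { type-correct at compile time, but 600 > 100: range-check error at run time }
      end.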

      1. Bitsminer Silver badge

        Re: Algol-68

        IIRC, and it was definitely a long time ago, but possibly the variables could have been dynamically allocated. Possibly even dynamically typed, who knows?

        In those days, efficiency and speed were considered the prime goal, and so a dynamically typed language like, oh, Python, would have been ridiculed.

        Things have come a long way since then.

    2. nijam Silver badge

      Re: Algol-68

      > "This statement requires nine run-time checks."

      True in other languages besides Algol-68, so your reaction was a little unfair.

    3. ldo Silver badge

      Re: Algol-68!?

      That claim doesn’t make any sense to me. The whole point of the complex compile-time checks would have been precisely to minimize the amount of work needed at run-time.

      Maybe they were thinking of PL/I? That one could end up with the most innocuous source constructs generating surprisingly complex code, due to inadvertently triggering automatic conversions between things like FIXED and FLOAT types.

      Also, Algol-68 used semicolons as statement separators, not terminators. So that ending semicolon is another clue pointing to PL/I.

    4. Roland6 Silver badge

      Re: Algol-68

      >” And stated: "This statement requires nine run-time checks."

      That was the end of any interest in Algol 68.”

      So no fan of interpreted languages?

  29. Will Godfrey Silver badge
    Thumb Up

    He had a pretty good run

    He'll be remembered long after the current crop of 'experts' are forgotten.

    P.S. Thumb up because there's nothing to be sorry about.

  30. chololennon
    Pint

    My respects

    I am a hardcore C/C++ developer, but I have fond memories of Pascal; I used several versions of Turbo Pascal and Delphi. Also, in the early '90s I took a university course on data structures. The course was based on Wirth's classic book "Algorithms + Data Structures = Programs", which I still have. My respects to Dr. Wirth; I learned a lot from him.

  31. Missing Semicolon Silver badge

    Patents

    As with all these luminaries of software past, some louse will "patent" something that is obvious to anyone familiar with their work. When they are not around to shout "I did that in 1973!", stupid stuff will get passed through.

    1. fg_swe Silver badge

      So ?

      You take decades-old papers and shoot down the patent as Prior Art. OK ?

      1. ldo Silver badge

        Re: You take decades-old papers and shoot down the patent as Prior Art. OK ?

        Surprisingly, this is easier said than done.

  32. Sir Lancelot

    RIP professor Wirth

    Thanks for making me realise I did not want to be an electrical engineer after all!

    First Pascal, Fortran and Commodore PET Basic at technical university, then on to CHILL (CCITT High Level Language) and 808x assembler on ITT's public S12 telephone exchanges.

    Later on to C on Apollo workstations, and that was the end of my professional programming track - I wonder why ;-)

    And yes, I can still read and understand the Pascal programs I wrote a long time ago without major mental contortionism!

    1. fg_swe Silver badge

      Let Me Guess

      You did not have valgrind on Apollo and you accidentally destroyed your heap structures by means of buffer overflows, use-after-free and the like. Local bugs destroyed the global program integrity. You gave up. I can understand.

      C is a language suitable for automatic program generators, not for men.

      1. anonymous boring coward Silver badge

        Re: Let Me Guess

        "C is a language suitable for automatic program generators, not for men."

        It's suitable for real men. Or real programmers.

  33. Sceptic Tank Silver badge
    Pint

    Hamba kahle prof. Wirth, as they say around these parts.

  34. fg_swe Silver badge

    A Giant Has Died

    In the best spirit of European science, Mr Wirth created elegant, minimalist, yet robust languages and systems. His "thin" book on compiler construction is refreshingly light compared with the dungeon of the "dragon book".

    Computer Science will stand forever on his shoulders, or at least for as long as classic imperative computers remain in use.

    His work on Algol is under-appreciated, as this language was used to create early memory-safe operating systems.

    If he had only been more vocal about the virtues of Algol, Pascal and his other languages, he might have had even more impact on applied computer science.

    Groundbreaking systems such as the Apple Lisa and Apple Macintosh were implemented in Pascal. The large-scale HP MPE operating system was realized in a Pascal variant.

    The full relevance of his work will most probably be discovered in future years.

  35. sedregj Bronze badge
    Go

    Goodbye and thank you

    end;

  36. Mikel

    Nooooo

    A great loss.

  37. Steve Davies 3 Silver badge
    Pint

    He will be missed

    I first came upon Pascal when working for DEC. As I'd had some experience with CORAL-66 (an Algol-type language), I was assigned to work on a project that used the VAX/ELN OS as a base.

    Since then, and despite many years of having to work in C/C++, I have preferred to work in Pascal. I jumped on board with Delphi for a while, but the silly cost of the version that gave you access to database components put me off it for life. Then along came Lazarus. At first it was very flaky, but it has steadily improved.

    I use an application written in Lazarus Pascal every day to analyze the logs on my WordPress server.

    Thank you Mr Wirth for giving me access to a sane programming language for all these years.

    I'll have [see icon] one of these tonight in your name.

  38. OJay

    Real Programmers

    Such a great loss. RIP

    I can't believe no one has posted this yet. I only recognised his name based on that article.

    https://www.ecb.torontomu.ca/~elf/hack/realmen.html

  39. MarthaFarqhar

    My first formal teaching in CS used Modula-2. Naively I thought all languages would be this sensible and readable.

    RIP Prof Wirth.

  40. Jumbotron64

    I would like to add to my earlier post above, since I haven't yet seen this mentioned: hardware description languages such as VHDL and the Altera Hardware Description Language, to name a couple, are derived from Pascal and Ada, and Ada itself is of course derived from Pascal.

    1. kventin

      fwiw oracle's pl/sql is based on ada

      (if you ever see a mention of diana in your error, flee. you're in the black magic realm)

  41. Bitbeisser

    Started to learn Pascal in 1976, and to this day it is still my favorite everyday programming language. Only the implementation has changed...

  42. RedneckMother

    late to this news / game, but...

    What a mind! What a positive influence on software (among other topics)!

    RIP. I had an "early" introduction to him and his influence on software, languages, etc. He had a profound effect on our present software and programming-language situation.

    I was "blessed" to attend a symposium with him as the speaker. It was a JOY to listen, and attempt to follow his thoughts. He had so very many profound and insightful perspectives, which he freely and gladly shared.

    Y'all 'scuse me. I am in mourning.

  43. HumPet

    R.I.P. Mr. Wirth!

    And thank you, The Register!

    Your obituary inspired me to contemplate how N. Wirth influenced our company and our real-time application server, Ipesoft D2000, used for SCADA/MES/EMS systems. Here is the result:

    https://d2000.ipesoft.com/blog/r-i-p-niklaus-wirth

  44. jreagan

    I first learned Pascal on the CDC 6000s at Purdue in the early 1980s, and later went on to serve on the Pascal Standards Committee and to be the project leader for the DEC VAX Pascal compiler. I'm still in charge of it (and others) for OpenVMS on the x86 platform. So that early Digital Pascal V2 compiler from 1982 is doing quite well today, 42 years later.

    While the committee didn't have any interactions with Prof Wirth, we did go on to do a significant upgrade and created Extended Pascal in 1989 (the politics of the ISO and IEEE organizations pushed us towards a separate language, not a Pascal revision).

    I had a listing of the ETH compiler in my archives but I donated it to a collector several years ago. I didn't scan/photograph it before I got rid of it.

  45. Herby

    When it comes to Pascal...

    I am reminded of Brian Kernighan's quote.

    That being said, Wirth did advance the state of the art quite a bit.

  46. Graham Perrin

    A transcription of Niklaus Wirth's “Closing word”, made by Douglas Creager on 2024-01-05:

    Closing word at Zürich Colloquium
