The Stonehenge of PC design, Xerox Alto, appeared 50 years ago this month

Although it only gets credited with one of them – because Steve Jobs slipped up* – all modern end-user computers owe three defining aspects of their design to the Alto. Modern computers get many influences from many sources, but one of them far outshines all the others. Its significance, though, is "more honoured in the breach …

  1. Alumoi Silver badge
    Joke

    Round corners!?!

    Do I see round corners on this thing? Apple should sue! Oh, wait...

    1. gnasher729 Silver badge

      Re: Round corners!?!

Do you see overlapping windows on that thing? Bill Atkinson saw them and implemented them, which was hard work. (The same technology that he invented for overlapping windows also made rounded corners feasible.) The only problem: his memory was wrong. Xerox didn’t have overlapping windows.

      1. Doctor Syntax Silver badge

        Re: Round corners!?!

        At about 8:30 on the video is what I take to be the Parc demo and it shows overlapping windows.

        1. Anonymous Coward
          Anonymous Coward

          Re: Round corners!?!

Sort of overlapping windows. If I remember correctly, only the foreground window updated, so if you moved it around, you couldn't see the data on the parts of the screen you uncovered. Also, you'll notice that only a couple of programs used overlapping windows. Most of the programs were full-screen only, and they were all "exit and load a different program" affairs.

          So it was all bits and pieces. It was impressive, but as they say: lots of people saw the machine, Steve Jobs was the one who thought to "copy it"...

      2. Liam Proven (Written by Reg staff) Silver badge

        Re: Round corners!?!

        [Author here]

        > Xerox didn’t have overlapping windows.

        Your memory deceives you. As Dr Ben Goldacre liked to say often, he named a book after it: I think you'll find it's a little bit more complicated than that.

        The Alto did have overlapping windows, a concept which Alan Kay invented.

        http://www.sis.pitt.edu/mbsclass/hall_of_fame/kay.html

        The problem was that they didn't work very well. A program could not write into a background window that was partially obscured by another window, and if a window was uncovered, the program responsible for it had to redraw the entire contents, which was very inefficient. In essence, they faked it, as Kay himself describes on Quora:

        https://www.quora.com/Why-did-Alan-Kay-choose-rectangular-shaped-Windowing-as-the-initial-style-of-GUI-What-were-the-considerations-and-impracticality-of-differently-shaped-windowing-approach/answer/Alan-Kay-11

Apple's Bill Atkinson couldn't believe it could be as clunky as that. He convinced himself that there had to be a way of drawing into partially-obscured windows _which didn't overwrite the window on top_, and that when a partly-covered window was _uncovered_ the program would only have to redraw the stuff that had been "hidden".

So that's what he implemented in QuickDraw:

        https://www.folklore.org/StoryView.py?project=Macintosh&story=I_Still_Remember_Regions.txt

        His explanation was: "I didn't know it couldn't be done, so I did it."

        https://johnkary.net/blog/i-didnt-know-it-couldnt-be-done-so-i-did-it/
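For the curious, the heart of the trick can be sketched with plain rectangle lists. This is only a toy C illustration of the idea (QuickDraw's real regions are a compressed scan-line structure, and these names are mine, not Apple's):

/* Clip a background window against one window on top of it. */
#include <stdio.h>

typedef struct { int left, top, right, bottom; } Rect;

/* Subtract rect b from rect a; writes up to 4 non-overlapping
   rects covering (a minus b) into out, returns how many. */
static int subtract_rect(Rect a, Rect b, Rect out[4]) {
    int n = 0;
    if (b.right <= a.left || b.left >= a.right ||
        b.bottom <= a.top || b.top >= a.bottom) {
        out[n++] = a;   /* no overlap: all of a stays visible */
        return n;
    }
    if (b.top > a.top)       /* strip above b */
        out[n++] = (Rect){ a.left, a.top, a.right, b.top };
    if (b.bottom < a.bottom) /* strip below b */
        out[n++] = (Rect){ a.left, b.bottom, a.right, a.bottom };
    {
        int top = b.top > a.top ? b.top : a.top;
        int bot = b.bottom < a.bottom ? b.bottom : a.bottom;
        if (b.left > a.left)   /* strip to the left of b */
            out[n++] = (Rect){ a.left, top, b.left, bot };
        if (b.right < a.right) /* strip to the right of b */
            out[n++] = (Rect){ b.right, top, a.right, bot };
    }
    return n;
}

int main(void) {
    Rect back  = { 0, 0, 100, 100 };   /* background window */
    Rect front = { 50, 50, 150, 150 }; /* window on top */
    Rect visible[4];
    int n = subtract_rect(back, front, visible);
    /* Drawing clipped to these rects cannot scribble on the front
       window; when front moves, only newly exposed rects repaint. */
    for (int i = 0; i < n; i++)
        printf("visible: %d,%d to %d,%d\n",
               visible[i].left, visible[i].top,
               visible[i].right, visible[i].bottom);
    return 0;
}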

        1. Liam Proven (Written by Reg staff) Silver badge

          Re: Round corners!?!

          * liked to say _so_ often

          Oops!

          Also, to the best of my knowledge, Ben is alive and well, and I suspect he probably does still say it. He certainly used to when I talked with him at Skeptics in the Pub. I didn't mean to imply he was no longer with us.

          1. Woza
            Joke

            Re: Round corners!?!

            You're one fake Guardian obit away from imitating ChatGPT

        2. DS999 Silver badge

          A lot of the best advances coming from this

          "I didn't know it couldn't be done, so I did it."

        3. Mage Silver badge
          Windows

          Re: Round corners!?!

          I agree, the article is inaccurate.

My memory of what I saw (portrait CRT, full GUI) in the Xerox shop opposite Belfast City Hall in approximately 1974-1976 (I was working near there then) doesn't mesh with the Star (the link shows a landscape CRT, and the Star was 1981) or this article. It had to be an Alto.

          Apple Lisa development started in 1978 and it was released in 1983.

          So the claim that Lisa was started before Apple guys saw a working Xerox GUI seems dubious.

          There were two versions of the Alto, the Parc model and a production version. The Register article omits the production Alto?

          However what I saw in Belfast wasn't for sale. They only sold big copiers. But it was a public demo. It can't have been a Star and must have been an Alto.

          I didn't see a portrait CRT on a computer again till late 1990s.

          1. Liam Proven (Written by Reg staff) Silver badge

            Re: Round corners!?!

            [Author here]

            > My memory of what I saw (portrait CRT, full GUI) [...] had to be an Alto

            Yes, it does. The Alto had a portrait full-page display; the Star had a 2-page display.

            So what does this demonstrate, and in what way does it make my article inaccurate?

            > Apple Lisa development started in 1978 and it was released in 1983.

            Again, so? This in no way falsifies my article.

            The Lisa optionally ran Microsoft Xenix, a resolutely text-only Unix-like OS. The original plan for the Lisa was to be a powerful office computer that could display graphics.

            Displaying graphics does not mean a GUI. A ZX Spectrum can display graphics but it does not have a graphical user interface.

            The Lisa was intended to be a CLI-only machine. This is well documented, e.g.

            https://computerhistory.org/blog/the-lisa-apples-most-influential-failure/

            « Apple had already been working on a computer in its own R&D labs to leapfrog the company’s best-selling, but command-line-based, Apple II personal computer. »

            But *then*...

            « After the PARC visit, Jobs and many of Lisa’s engineers, including Bill Atkinson, worked to incorporate the ideas of the GUI from PARC into the Lisa. »

            Also as per:

            https://www.mac-history.net/2007/10/12/apple-lisa/

            « The Lisa project was started at Apple in 1978 and *evolved into* a project to design a powerful personal computer with a graphical user interface (GUI) »

            Emphasis mine.

            Xerox *showed* the Alto. It didn't sell it.

            The Lisa predated the Star, but was not originally intended to be a GUI machine because the people building it had mostly never *seen* a GUI when they started the project.

            > There were two versions of the Alto, the Parc model and a production version. The Register article omits the production Alto?

            [[Citation needed]]

            I am not aware of any production Alto. Nor is any reference site I've read. So, please can you educate us with some references to this?

            Every history of the Alto I've seen says that circa 2000 units were hand-made, and none sold at retail.

            I think you are wrong.

        4. timrowledge

          Re: Round corners!?!

          Thing is, the Alto was running Smalltalk and so making the windows draw in any manner one wanted is pretty easy. It’s a matter of how much performance one has available and in 1972 there wasn’t so much. Not even on an Alto, which was perhaps 5 years ahead of the general state of things.

Later, on slightly faster machines, we indeed implemented something you might recognise as rather like QuickDraw. Current Smalltalks do a wide variety of things: host windows working via host APIs, BitBLT in a single window, web-based displays, driving OpenGL or Cairo, etc. Sometimes all of them.

  2. Evil Auditor Silver badge

    ...adding object orientation to Pascal...

Why mention that in an otherwise pleasant read? Object Pascal was one of the biggest piles of programming manure I've ever seen. I was too thick to understand Object Pascal. Still traumatised.

    1. big_D Silver badge

I used Lightspeed Pascal and Macintosh Programmer's Workshop with Object Pascal in the mid-to-late 80s, programming the original Macs. It was interesting.

MPW was also a gem: it had a command line, which was missing from the Mac System/Finder. It made large copy and rename jobs much simpler, for example.

    2. Anonymous Coward
      Anonymous Coward

I had that problem with Borland's PAL (the application language for Paradox, their DB). When that went object-oriented I decided that maybe it was time to amuse myself with something else; I think my brain just likes linear paths :).

    3. Doctor Syntax Silver badge

      Having ignored OO for years I found myself slumming on Windows in 2000, using Delphi. I'd encountered UCSD Pascal in the distant past so the nuts & bolts of the language were familiar. Suddenly all the OO stuff made sense - it's just old-fashioned Entity/Relationship with methods bolted on. The obfuscation was zealots (a) being zealots and (b) adopting the industry standard of giving old stuff new names.

      1. John H Woods Silver badge

        Re: "it's just old-fashioned Entity/Relationship with methods bolted on"

It can certainly feel like that in procedural languages with OO bolted on, like Delphi. But "pure" OO environments like Smalltalk tend to have a completely different feel. Languages and programming environments hint at paradigms, but they don't usually enforce them; I've seen a lot of non-OO code written in OO languages and some surprisingly elegant object-based approaches in languages that barely support it.

        1. Michael Wojcik Silver badge

          Re: "it's just old-fashioned Entity/Relationship with methods bolted on"

Yeah. On the same note, I don't think I agree that Smalltalk was "the most influential" OO language, though that's a subjective evaluation anyway. Most subsequent OO languages deviate pretty significantly from the Smalltalk philosophy. C++ went its own way with multiple inheritance[1] and operator overloading (which came from ALGOL), and then steadily accumulated every other feature the committee could think of, like template-based generics. Java made a stab at a more pure OO language but retained much of C++'s syntax, and more importantly its imperative orientation, rather than passing messages. JavaScript (LiveScript) was most strongly influenced by Self, with its prototype-based inheritance.

And of course the original Smalltalk was heavily tied to the GUI IDE[2] and interactive development, which has consequences for the language, such as late dynamic binding. There are purely textual, compiled Smalltalks now, I believe, but that wasn't the original idea.

          Smalltalk was influential mostly in inspiring people to create OO languages that didn't work like Smalltalk.

[1] I don't know of another major multiple-inheritance OOPL before C++. CLOS has multiple inheritance but emerged a year or two after Stroustrup's C++ implementation.

[2] With critical color-coding that was inaccessible to people with red-green colorblindness, fortunately a condition unknown in programmers.

          1. timrowledge

            Re: "it's just old-fashioned Entity/Relationship with methods bolted on"

Given that the original Smalltalk was monochrome (outside a small number of experiments), your colour-blindness issue is implausible.

The UI is one of the great triumphs of Smalltalk. A decent integrated development system, with a self-reflective language that can debug itself and analyse itself, is far ahead of the tedious edit text, save file, run compiler, try to run the result if there was any, run ugly annoying debugger, try to work out what to edit in what file, repeat until exhausted. Making a text-only Smalltalk really doesn't have much value if it loses the live nature of the system.

          2. Anonymous Coward
            Anonymous Coward

            Re: nope, Smalltalk80 was The Bomb..

Let's see. No one paid the slightest attention to OO until Smalltalk broke to a wider audience in 1981. It was a very big deal in 1981, and by 1983 OO and GUIs were everywhere as the Next Big Thing. After 1981 there was a mad rush to bolt OO onto existing languages. You should have seen how many languages the cross-language implementation of MacApp (the first viable OO app framework) was supposed to support. Almost a dozen. In 1985.

C++ won the race because of CFront and super-setting C, and no other reason. It was and is a truly ugly "language". A total dog's dinner. That "grammar" in the ANSI spec? A fantasy. Try implementing a verifiable compiler from it. And don't get me started on templates: Lisp macros reincarnated. With all the same flaws.

CFront kind of worked until broken by templates. The first really dependable compilers for C++ only arrived in the early 1990s. I know that for our projects C++ only made the cut for the first time in 1993, because of compiler stability. But templates and exceptions could still break all compilers out there for another decade or more. And still do, if you know where the language spec kludges are.

Java is C++ with all the crap thrown out. A very clean language spec. With an actual complete language grammar. Any and all problems are with the implementation. Which is another subject.

I remember having a long team discussion about the implementation of Multiple Inheritance in CLOS; it would have been late 1985. Porting code that had been shipping for several years. Several years before CFront was stable enough to compile MI halfway correctly. The C++ "spec" had MI in 1985 but it did not work. MI is one of those great idiot filters. Anyone who uses MI really does not know what they are doing. Or is writing toy code. Because if you think for even 20 secs about how you would actually implement and compile MI, you start seeing just how many gotchas there are. If you want random untraceable bugs in shipped code, MI is the way to go. The few times I've seen people try to do MI in product code, the codebase was always a throwaway. A mess.

I'd guess you don't know how any of these languages really work. At the system level. Apart from Algol, I've either written compilers / VMs for all of them or had to deep-dive the compiler / interpreter codebases to fix serious bugs. Although I'll give the Apple Smalltalk-80 guys a break, as it was still just a late Alpha and an 8MHz 68000 was just not enough horsepower to run it cleanly. In 1985, on a prototype MacPlus.

            1. ThomH

              Re: nope, Smalltalk80 was The Bomb..

              > Anyone who uses MI really does not know what they are doing. Or writing toy code.

              Or using pure abstract base classes as a surrogate for protocols.

            2. timrowledge

              Re: nope, Smalltalk80 was The Bomb..

              “Java is C++ with all the crap thrown out. “

              No. Just ... no. Primitive types? Seriously?

              1. Anonymous Coward
                Anonymous Coward

                Re: nope, Smalltalk80 was The Bomb..Java..all the C++ crap gone

Read the pure Java language spec (well, pre-lambda stupidity).

Now read any of the ANSI C++ specs (pre-C++11 lambda stupidity).

Now think about how you would write and validate and verify a compiler for that language. All the edge conditions. And, more importantly, where the spec forces arbitrary implementation decisions, none of them "correct".

Apart from the implementation of multi-dimensional arrays at the classfile interpret / compile level, the Java language specification is very clear and complete. Whereas with C++ you could easily fill a few hundred pages of docs about edge conditions and situations where you just have to punt. Here is just one example at random: the spec for exceptions. In Java it's watertight and complete. In C++, if you have the slightest idea about how they are spec'd and compiled, you will never ever use or depend on them. I haven't in the 35 years I've been writing C++ code. Now try to implement that in a way that is halfway stable.

                And so on. Yes, the Java language is C++ with all the crap thrown out.

People who have to implement compilers/interpreters/VMs and runtime environments have a very different view of languages from people who just use some subset of the language to write code. So what's the first thing you look for when looking at a new language? People like me look for the formal spec and the grammar. And look to see just how clued-in the language architects are. So recent languages like Swift and Dart looked pretty OK on first pass of the language spec docs. Confirmed by actually writing apps in them. Whereas the first look at a language like Rust: where's the spec? It's not really finished yet. How about the grammar? What's a grammar? Pure clown-world. Confirmed by the death spiral of Firefox and a look at the code written in Rust for other projects.

Even writing a non-trivial domain-specific language in something like Antlr will give you a better understanding of just how this works. Antlr is really nicely done, for those domain-specific languages which are too complex to fit into TCLInterp(). Which handles most situations. TCL is still one of the most useful libraries out there when you need to add non-end-user application scripting without reinventing the wheel.

              2. ThomH

                Re: nope, Smalltalk80 was The Bomb..

                Agreed. Java is Objective-C cut to look like C++, as per one of its creators. So, yeah, primitive types, interfaces, reflection, objects [almost] always on the heap, etc.

    4. Will Godfrey Silver badge

      I had a look at Pascal...

      It wasn't a long look!

      1. Lil Endian Silver badge
        Pint

        Fair enough.

But I'm glad that people like Wirth and Kemeny, who created these now-so-out-of-date languages, did what they did when they did.

      2. Michael Wojcik Silver badge

        Pascal was the first HLL after (traditional) BASIC that I did any extensive development in. (I had done some assembler, for a couple of different ISAs, and a COBOL course at school.) Pascal was what taught me structured programming and data structures more ambitious than parallel arrays. I wouldn't want to write production software in traditional Pascal (though I have done, a bit, over the years); but as a learning tool I found it invaluable.

        I admit that's partly because I had Turbo Pascal 2.0 and Borland's Turbo Pascal Tutor book, which was really done well.

    5. Anonymous Coward
      Anonymous Coward

      Still traumatised.

I never got on with Pascal - either with Pascal on VMS (I stuck to FORTRAN) or when I was asked to help someone struggling with UCSD Pascal on the p-System*.

      Then progressed (?) to C, dabbled a bit with C++ till a change of job/destiny with Smalltalk - took to it like a duck to water.

*In the days of Borland and IDEs, the Open University in the UK were subjecting their foundation-year students to the line-editor-based system. When we enquired why, the answer was that they would stick with it until they ran out of the stockpile of course material, at which point they would revise and update.

  3. mmonroe

    Proper paper orientation

    I've always thought the portrait screen is the correct way to go. At work, they have finally supplied a monitor I can rotate.

    1. Arthur the cat Silver badge

      Re: Proper paper orientation

      I've always thought the portrait screen is the correct way to go

      For small(ish) screens yes. For large ones like my current monitor (curved 37" 3840x1600), portrait mode would be a bit weird. It's a lot easier going wide than tall, both in space and neck movement.

      1. John H Woods Silver badge

        Re: Proper paper orientation

My current favourite configuration is the almost-square 16:18, such as the LG DualUp.

        1. Woza

          Re: Proper paper orientation

          I do both - main monitor is landscape, with a second monitor next to it in portrait orientation. The latter is good for reading documents or long chunks of code.

          1. Michael Wojcik Silver badge

            Re: Proper paper orientation

            I exclusively use a laptop with no external screens, having tired of a multi-monitor setup circa 1991 when I left IBM. But if I had external monitors I don't think I'd want one in a portrait orientation. The laptop's screen effectively serves as side-by-side portrait screens – I very rarely have any window occupying more than three-fifths of the width – and I can easily get 50+ lines of code or text visible in such a window, which is plenty for a page, in my opinion. Having more visible text would just make it easier to lose my place.

            1. Anonymous Coward
              Anonymous Coward

              Re: Proper paper orientation

You obviously have a much much MUCH better laptop than my employer provides. It might be "full HD", but the screen is ... rubbish. We get tiny 13" screen laptops that are so small that the default is to scale stuff to 50% (not 75%, but 50%), which, while making it actually big enough to see, means that it's more like "here's your new laptop, please set your expectations back to the 1980s" - since the effective screen size is 960x540, which is only 8% more pixels than the 800x600 SVGA standard that we once thought was massive!

              Fortunately, they also provide usable screens at the office, and I have a better one at home - if they didn't, I'd have refused to use the laptop on the grounds of it being a health hazard. If only MS's steaming pile of software wasn't complete and utter [insert string of expletives here] at handling this - so I have to spend time moving things back to where they should be every time I switch between display setups (OS X mostly handles this well).

              In effect, the laptops we have take me back to the early 90s when Apple introduced the Mac LC - and at least some people bought one computer and multiple sets of screen, keyboard, and mouse since it was (just) small enough to fit into a decent briefcase and treat it as a portable.

    2. Anonymous Coward
      Anonymous Coward

      Re: Proper paper orientation

      the portrait screen is the correct way to go

      Yes, all screen doors are easier to go through in portrait.

      No, wait ..

    3. J.G.Harston Silver badge

      Re: Proper paper orientation

      But my eyes are horizontally mounted, not vertically....

      1. Anonymous Coward
        Anonymous Coward

        Re: Proper paper orientation

Ah, but that's going to be the Next Big Thing in Silly Valley: Roman desks. After the market for standing desks started to peter out they obviously had to come up with something new to relieve the few remaining IT workers of their money, so the new trend will be computing like the Romans ate: lying down.

Anyone complaining that that is stupidly impractical will be silenced by alleging they're clearly not part of the modern 'in' crowd.

        Sorry, my sarcasm genes escaped for a moment. Where was I?

        :)

        1. that one in the corner Silver badge

          Re: Proper paper orientation

          Lying down, lounging and computing in a louche fashion also fits with the ongoing "AI" interactions:

          "Bing, peel me a group" (of files)

          "Alexa, have this dataset washed and taken to my analytics"

      2. Liam Proven (Written by Reg staff) Silver badge

        Re: Proper paper orientation

        [Author here]

        > But my eyes are horizontally mounted, not vertically....

        Unless you have some problems, they work as a pair.

        And unless you mainly read Mongolian or very old Japanese or Chinese language texts, you read top-to-bottom...

Ever notice how almost all books, magazines, letters, leaflets, and newspapers have portrait-oriented pages?

    4. Roland6 Silver badge

      Re: Proper paper orientation

And it is probably the fourth defining feature, one that the majority missed.

Xerox was a document company, and XNS was intended to facilitate document production, so having the screen oriented so that a user could see an entire page in a single view makes sense; naturally, scanning and reading a portrait page is much easier than a landscape one.

Okay, with 24-inch HD displays we don’t need to turn the screen to see a full A4 portrait page at 100%, but the principle of having a screen that facilitates the WYSIWYG viewing and creation of portrait pages applies in spades to smaller screens. It is perhaps notable that neither the Apple iPad nor the Microsoft Surface supports use in portrait mode with a detachable keyboard.

    5. Doctor Syntax Silver badge

      Re: Proper paper orientation

      "I've always thought the portrait screen is the correct way to go."

      It depends what you're doing. Writing on one document with another containing reference material would probably be harder that way. Your rotating monitor is probably a better option than one fixed vertically.

      1. Anonymous Coward
        Anonymous Coward

        Re: Proper paper orientation

        I find it hard to read text on a rotating monitor, though. I just can't follow text that is spinning around.

        :)

        1. that one in the corner Silver badge

          Re: Proper paper orientation

          But it is good after a Friday liquid lunch, just so long as you can match the rotation speed and direction to counteract the way the rest of the room is moving.

        2. Doctor Syntax Silver badge

          Re: Proper paper orientation

          It's ideal for displaying rolling news.

    6. Lomax
      Go

      Re: Proper paper orientation

      I agree with Victor Hasselblad: why confuse things by having an orientation at all - just go square and stop worrying about it! Square monitors are in fact readily available, if not cheaply, and are often used in air traffic control rooms. See for example the 1920x1920 Eizo FlexScan EV2730Q. Makes sense if you think about it.

    7. lamp

      Re: Proper paper orientation

I had a portrait monitor connected to my Mac in 1991. It was excellent. Had it on the network, using the VAX as a server.

      1. jake Silver badge

        Re: Proper paper orientation

In '91, or thereabouts, the Radius Pivot display could be turned between portrait and landscape on the fly, at the whim of the user. I installed several dozen at a design house in Silly Con Valley. Normally, I didn't (and don't) do Apple work, but the hardware-shy Mac-using "designers" offered me a lot of money to plug them in for them. Who would say no?

    8. Anonymous Coward
      Anonymous Coward

      Re: Proper paper orientation..had those in 1987 on the Mac

Remember the first time I got a prototype Radius Full Page Display for my MacPlus. Early 1987. Used them for years until the 21-inch mono displays came down in price in the early 1990s, then moved to them.

My Mac II in 1987 even had dual displays with a contiguous desktop. You could stick 6 video cards in the Mac II for one large desktop (we tried it out), but those color displays were not cheap. It was a big deal decades later when Windows could finally handle two monitors without serious problems. Something we had on MacOS Classic for many years.

    9. Roland6 Silver badge

      Re: Proper paper orientation

      > At work, they have finally supplied a monitor I can rotate.

Whilst this functionality isn’t really anything to do with Xerox, it really irritates me how rotation on the iPad/iPhone just works, yet on any other platform it’s a total dog's dinner. I suspect even W12 will have no functionality that supports auto-rotation; it’s been 16 years since the iPhone was launched and MS and Apple have an IP agreement, and if you take into account the Radius monitor from the late 1980s it’s actually been 30-plus years since the need for rotation on the PC platform was identified.

  4. boblongii
    Facepalm

    Xerox and foolishness

Those 100,000 shares alone are worth $3.4B today, allowing for stock splits. Xerox's market cap is $2.34B.

    1. James O'Shea

      Re: Xerox and foolishness

      if you look up 'idiot' in the dictionary, you'll see a reference to 'Xerox Management'.

      1. vcragain

        Re: Xerox and foolishness

I'm 83 now. I was a mainframe programmer - Assembler, Cobol - and discovered my first personal computer around 1980/81; I remember all it had was Lotus 1-2-3 or straight DOS. However, I bought a Tandy for myself and played around with it, then finally a PC with Windows, and then branched into coding on the PC. Retired some years now, but this story is still very fascinating to me, having watched it all unfold; I remember competing with one of my sons, who was a Mac fan, me a PC fan, over which was the better machine. I still prefer the PC - he still uses a Mac. Poor old Xerox - how those guys must have wanted to kick themselves when they finally realized what they had done - but this article is the first time I had ever heard their part of the story!

        1. Dickie Mosfet

          Re: Xerox and foolishness

          "Dealers Of Lightning: Xerox PARC and the Dawn of the Computer Age" by Michael Hiltzik is well worth a read. You can usually pick up a cheap, used copy on eBay or Amazon.

          1. John Brown (no body) Silver badge
            Coat

            Re: Xerox and foolishness

            Or borrow a copy and Xerox it :-)

            1. Fruit and Nutcase Silver badge

              Re: Xerox and foolishness

              Or borrow one from your Local Public Library, if it hasn't been closed due to cutbacks

          2. Michael Wojcik Silver badge

            Re: Xerox and foolishness

            I also like Smith & Alexander's Fumbling the Future.

    2. DS999 Silver badge

      Re: Xerox and foolishness

One could hardly have expected that this scrappy little company, competing with the likes of Atari, Commodore, Radio Shack and so forth, would someday be the most valuable company in the world, such that its stock should be held forever.

      If they'd made a similar deal with the other companies I listed and decided to hold the stock for 44 years they would have ended up with nothing.

    3. Ken Hagan Gold badge

      Re: Xerox and foolishness

      That suggests that Xerox management gave away the future twice, first by not making it themselves and second by giving away the proceeds after they sold it.

      1. DS999 Silver badge

        Re: Xerox and foolishness

        They did make it themselves, when they eventually released the Star. It didn't fail in the market because of anything Apple did; the result would have been the same if Steve Jobs had never been allowed to see the Alto.

  5. big_D Silver badge

    Smalltalk

I loved Smalltalk when I was a teenager; it was such a wonderful concept. I actually did a project on it at college, comparing it to normal 3GLs. This was around 1985.

    1. John H Woods Silver badge

      Re: Smalltalk

      Smalltalk is still thriving. Check out pharo.org, for starters.

      1. John 110
        Headmaster

        Re: Smalltalk

Anyone who got their degree from the Open University in the '90s will have used Smalltalk (M206 for the win! -- Frogs, anyone?)

      2. that one in the corner Silver badge

        Re: Smalltalk

        Also squeak.org and squeakland.org

IIRC Pharo forked from Squeak a while back. Pharo has a "serious" feel to it, whilst Squeak still keeps up the fun side and has material that clearly harkens back to Alan Kay's work. And Squeak still uses a version of the painting from the August 1981 Byte front cover (I still have my copy, hopefully stashed carefully): that issue is available on archive.org and IMO is still a good read.

        Pharo seems to have more solid funding in place at the moment.

        1. David 132 Silver badge
          Thumb Up

          Re: Smalltalk

          Aaaaargh. Thanks to your comment, I've been sucked down a fresh rabbit-hole and have spent the last hour reading that edition of Byte. Thanks, I think?

          (https://archive.org/details/byte-magazine-1981-08/mode/2up?view=theater, for those similarly tempted)

    2. Scene it all

      Re: Smalltalk

I visited PARC back in the day and saw a demonstration. It was the Smalltalk language that most impressed me, not the GUI. The idea of SENDING A MESSAGE instead of CALLING A FUNCTION makes networking a natural extension. Erlang went on to take the message metaphor as central, and it is much easier to write distributed programs that way. Most of the other OO languages treat it as just a function call.

      1. J.G.Harston Silver badge

        Re: Smalltalk

        The idea of SENDING A MESSAGE instead of CALLING A FUNCTION

        This is one of my blind spots whenever people attempt to explain OO programming. Every time I try and work out what makes it different from non-OO programming it just comes across as buzzword bingo.

        What's the difference between

        hey_jim_do_this(6); // call function, passing "6" to it

        and

        hey_jim_do_this(6); // send a message, passing the message "6" to it

        ?

        1. Roland6 Silver badge

          Re: Smalltalk

          > What's the difference between…

          A state-of-mind!

Messaging is effectively asynchronous communication, whereas function calling is synchronous. Yes, I know I implement message passing by having the application call the messaging API; however, the API will respond with "message accepted", not the result of the function, which will be delivered at some later time.
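A minimal sketch of the distinction, in C for concreteness (every name here is invented; real systems deliver the result later via a reply message or a callback):

#include <stdio.h>

typedef struct { const char *selector; int arg; } Message;

static Message queue[16];
static int head = 0, tail = 0;

/* "Sending a message": enqueue the request and return at once. */
static void send_msg(const char *selector, int arg) {
    queue[tail++ % 16] = (Message){ selector, arg };
}

/* "Calling a function": the caller blocks until the work is done. */
static int do_this(int x) {
    return x * 2;
}

/* The scheduler drains the queue whenever it gets the chance. */
static void run_scheduler(void) {
    while (head != tail) {
        Message m = queue[head++ % 16];
        printf("now handling %s(%d)\n", m.selector, m.arg);
    }
}

int main(void) {
    int r = do_this(6);              /* result available right now */
    send_msg("hey_jim_do_this", 6);  /* only "message accepted" here */
    printf("the call returned %d before the message was touched\n", r);
    run_scheduler();                 /* the message is processed later */
    return 0;
}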

          1. that one in the corner Silver badge

            Re: Smalltalk

            You can also play interesting tricks with message passing - as they can[1] literally be messages, a set of name/value pairs, one of which is the "name of the function" and the others are named parameters. There can be missing parameters, which the receiver fills in with defaults. More interestingly[2], there can be extra parameters that the receiver simply ignores, which can be useful for messages that the receiver is going to pass on, e.g. to a member variable of the receiver: that member var may know how to make use of the extra parameter[3].

As Roland6 said, using literal messages also allows for playing with asynchronous actions. As they go through a scheduler, you can give some messages priority, such as a timer tick. Or simply use them as a natural queue to buffer data sent to hardware, such as audio, network or just serial traffic. Lazy evaluation becomes a message saying "when the time is ready, send this message to this receiver" - but look, that is precisely how your scheduler is implemented, as a list of these "eval" messages, so all you're doing extra is delaying when a particular "eval" goes on the ready queue; add in a flag "don't schedule me directly, instead clone me and schedule the clone" and you can get closures to pass on as callbacks[4].

You also get RPC for free (from the p.o.v. of the application coder) - the scheduler just sends the message over the wire; the caller and receiver need no changes whatsoever (aside from some way to identify which receiver is located where[4, again]).

            Of course, when writing to such a messaging model, 99% of the time you can just code as though it is all plain old synchronous function calls and ignore what is going on under the hood, but when you want to exploit funky goodness it is just there, ready[5].

            [1] can, not necessarily are, in any particular implementation

            [2] as default arguments are old hat these days

            [3] ok, this raises more questions, such as "how does the sender know so much about the internal structure of the receiver to be able to add these extra parameters, what happened to encapsulation?". Well, this is just a quick'n'dirty comment, so gonna ignore those tough questions!

            [4] such crude descriptions

            [5] sorry, no idea if any extant language/library actually implements any of these things, this all comes from stuff done years ago, before The Day Job got all C/C++/other-non-message-passing-language
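To make the name/value idea above concrete, a toy sketch in C (every name invented; per [5], not modelled on any particular implementation):

#include <stdio.h>
#include <string.h>

typedef struct { const char *name; int value; } Param;

/* Look up a named parameter, falling back to a default. */
static int get_param(const Param *p, int n, const char *name, int dflt) {
    for (int i = 0; i < n; i++)
        if (strcmp(p[i].name, name) == 0)
            return p[i].value;
    return dflt;
}

/* A receiver that asks only for the parameters it understands. */
static void draw_box(const Param *msg, int n) {
    int w = get_param(msg, n, "width", 10);  /* missing: default used */
    int h = get_param(msg, n, "height", 5);
    /* "colour" is never asked for: silently ignored, or passed on
       to some member object that does understand it */
    printf("box %d x %d\n", w, h);
}

int main(void) {
    Param msg[] = {
        { "width", 42 },
        { "colour", 3 },  /* extra parameter the receiver ignores */
    };
    draw_box(msg, 2);     /* height falls back to its default of 5 */
    return 0;
}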

          2. timrowledge

            Re: Smalltalk

More than that - a function call is a jump to an address, an instruction known in advance by a typical compiler of dead-text languages. It says ‘go there and follow those instructions’. A message send is ‘please Mme. receiver, would you be so kind as to handle this request to find the letter sent to billg’. Depending on what the receiver is, anything might happen as a result - you might have a database dig out the letter, or a prank BOFH object set fire to your chair. The lookup of the action performed is dynamic - at least in a proper language - and that leads to being able to make dynamic environments like Smalltalk.

            If your experience is limited to static languages with no live debugging and development then you are missing out on a substantial amount of joy.

            1. Michael Wojcik Silver badge

              Re: Smalltalk

              This is not a hard-and-fast distinction, since many non-message-passing languages support dynamic binding for subroutine calls.

              The essence of message-passing is that it's a further abstraction for the flow of control. Semantically, it's a request (or offer) of a piece of work to a recipient, rather than an imperative change in the instruction pointer. Deferring responses, message forwarding (including to distributed systems), flexibility of interpretation – those all follow from the additional abstraction; they're possible consequences of it, not necessary aspects.

              Of course you can do all of these in a language and still call it a "function call" (or, more likely, "method invocation") rather than using the message-passing metaphor. The name doesn't govern the abstraction. But that's the idea.

        2. John H Woods Silver badge

          Re: sending a message instead of calling a function

          You can't really tell with that example because you are using 6 as an argument in both cases and in the latter you have omitted to specify the recipient.

          The key mindset difference is the delegation of responsibility to the recipient, the expectation that the recipient may have state, and that the class (or type) of the recipient may have a bearing on both.

          1. J.G.Harston Silver badge

            Re: sending a message instead of calling a function

            The recipient is specified in both of them. "jim".

            Ok:

            hey_do_thing(jim, 6); // call function to tell jim to do something with parameter '6'

hey_do_thing(jim, 6); // send message to jim to do something sending '6'

The mention of delegating responsibility to the other end - well, that's *exactly* what passing the thing on to jim is doing *anyway*. Otherwise it would never make the function call in the first place. "How does the caller know the internals of the callee?" Well, again, that's exactly why it is just passing it over to jim; the caller neither knows nor cares what jim's internal structure is. "Buffering data..." Again, that's exactly what passing it out of the caller to somebody else that knows what it is doing achieves.

As I said originally, every time people attempt to explain OO programming they just end up describing what I'd say is, well, just... /normal/... programming - doesn't everybody do it that way *anyway*? You want to send to a printer? *You* don't send to the printer, you just pass the data to a printer buffer, and the printer "thing" that knows about printers does the actual talk-to-printer stuff. You want to send to a serial port? Again, you just putbuf(serout, byte); and the thing that knows about serial interfaces does the serial interface stuff. You want to send a message to another computer on a network? net_tx(dest, data, length). You want to write to a file? write(file, data, length).

It all appears to be strawman arguments. You're starting from assuming the *caller* is fiddling with printer I/O hardware, or networking hardware, etc., and advocating that the caller should just pass that work on to somebody else, BUT WITHOUT FIRST TELLING ME THAT'S THE STRAWMAN YOU'RE STARTING FROM, so of course I have no idea what on earth you're waffling on about. The strawman does not exist, and what you're proselytising is already normal standard practice: of course you don't try to do the thing-specific stuff, you call (or message) a thing-understander that understands the thing and does the thing-frobbing itself.

            This may sound confused and rambling, but that's because everything about "OO - the great new thing" seems to make no sense in the first place. It's not the "great new thing", it's the normal.

            1. that one in the corner Silver badge

              Re: sending a message instead of calling a function

              You have ignored entirely the use of class hierarchies, which is core to OOP.

              Instead, you are concentrating on message passing - which is NOT even core to OOP! Simula isn't message-passing but it definitely does objects. FWIW message-passing is discussed in some of the other comments here.

              Oh, and by the way - your examples of just calling write() or putbuf() or ... Well, if you were trying to do OOP then *you* would just code up a call to write() and the identity of the thing you pass to it would determine if the underlying call was a putbuf() or a net_tx() or... You may be familiar with this from using a (FILE *) as the argument to write(): yup, you can consider Unix devices and "everything is a file" as OOP.

And, yes, you *can* just write Unix devices in C with a struct containing a few function pointers and other fields - because you do not *need* an OOPL to write OOP; the languages are, like all programming languages, just trying to make it easier to apply the concepts (how well they do that, of course, is up for debate, and the reason for the next 10,000 PLs to be created).
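A quick sketch of that struct-of-function-pointers point, purely illustrative (these are made-up names, not the real Unix driver tables):

#include <stdio.h>
#include <string.h>

typedef struct Device Device;
struct Device {
    const char *name;
    int (*write)(Device *self, const void *buf, int len);
};

static int serial_write(Device *self, const void *buf, int len) {
    printf("[%s] shifting %d bytes out of the UART\n", self->name, len);
    return len;
}

static int net_write(Device *self, const void *buf, int len) {
    printf("[%s] wrapping %d bytes in a packet\n", self->name, len);
    return len;
}

/* The "polymorphic" entry point: same call, different behaviour,
   picked by whichever table the handle carries. */
static int dev_write(Device *d, const void *buf, int len) {
    return d->write(d, buf, len);
}

int main(void) {
    Device serial = { "ttyS0", serial_write };
    Device net    = { "eth0",  net_write };
    const char *msg = "hello";
    dev_write(&serial, msg, (int)strlen(msg));
    dev_write(&net,    msg, (int)strlen(msg));
    return 0;
}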

Considering how much material there is that describes OOP, in its various forms and implementations, INCLUDING its cons (it has things it isn't great at, especially in some implementation forms - that is why, for example, templates were added to C++ so early on), and your shouty argument, I'm going to call your claims about THE STRAWMAN as being themselves a strawman argument.

              PS: OOP hasn't been "the great new thing" for decades - we were using Simula-67, from, um, 1967. Smalltalk was released as Smalltalk-71 before Smalltalk-80 which is - hang on, run out of fingers

              1. Mage Silver badge
                Headmaster

                Re: … ignored… class hierarchies, which is core to OOP.

                They aren't.

                I did learn about OOP before learning C++ in 1987. I find it frightening that that was 37 years ago and I learned about OOP over 40 years ago.

You can do OOP in Modula-2, but C++ has a nice syntax for classes. Also, Modula-2 has co-routines and mutexes, which allow well-designed messaging and parallel processing; opaque modules with import & export; typed procedure variables; both stronger types than Pascal (no assignment of same anonymous types) and "magic" types (with functions to return the size of a type or an array, to make safe device drivers). Buffer overruns are still a big security issue. The biggest flaw in C++ was/is C compatibility.

Also, macros are very evil disasters waiting to happen, and templates are dangerous and evil.

I've given up programming now, but we seem to be going backwards on GUI design / styles / flexibility and not progressing in programming or applications. See Chrome, Firefox, Thunderbird, LO 7.x vs 6.x (undocked toolbars more broken), Android (mess with external storage, printing, copy/paste, larger screens, poorer GUI).

            2. Michael Wojcik Silver badge

              Re: sending a message instead of calling a function

              You can implement object-oriented programming in a non-OO procedural (or functional, or data-flow, or whatever) programming language. OO is an orientation to programming. The fact that you can do OO with a non-OO language does not mean OO is meaningless; it means OO languages are just providing syntactic sugar to make an OO approach easier.

              The precise defining characteristics of OO languages are up for debate (Kay insisted that message passing was one, for example), but generally include encapsulation, polymorphism, and inheritance. You can do those in C. You can do them in (non-OO) COBOL. You can do them in LISP. You can do them in assembler. OO languages just make it more convenient.

              When you write code that makes extensive use of encapsulation, polymorphism, and inheritance, then you're writing OO code, regardless of the language. When you use an OO language and eschew those things, you're not.

        3. Anonymous Coward
          Anonymous Coward

          Re: Smalltalk..the messaging killed it..

The main reason why, despite a huge push by people like ParcPlace / Digitalk and IBM, Smalltalk-80 pretty much went nowhere in the market was the huge performance overhead of this messaging. Now, the bytecode interpreter did not help, but that could be sped up by a native incremental compiler. But the message dispatching - that was a performance brick wall. Not a lot you could do about that.

Now, OO Lisps also did message dispatching to objects, but this was very straightforward to integrate into how functional languages worked at the lowest level. Everything was a dynamically dispatched call, one way or another. No one expected Lisp to be a speed demon anyway. For non-trivial applications, GC made sure of that.

Message dispatching to objects is one of those ideas that looks great on the lecture-theatre whiteboard, but in a real-world language that needs to get the most performance out of constrained hardware (every real-world application), anything other than static or vtable dispatching is a performance killer. And GC, which Smalltalk-80 also had. But that was way down the list of performance gotchas.
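To put the dispatch cost in concrete terms, a toy C contrast (names invented; real Smalltalk VMs soften the lookup with inline caches, as noted elsewhere in this thread):

#include <stdio.h>
#include <string.h>

/* vtable style: a fixed slot, so a send is one indexed load
   and an indirect call. */
typedef struct { void (*draw)(void); void (*move)(void); } VTable;
static void circle_draw(void) { puts("circle drawn"); }
static void circle_move(void) { puts("circle moved"); }
static const VTable circle_vt = { circle_draw, circle_move };

/* Smalltalk-ish style: every send searches a method table for
   the selector at runtime. */
typedef struct { const char *selector; void (*method)(void); } Entry;
static const Entry circle_methods[] = {
    { "draw", circle_draw },
    { "move", circle_move },
};

static void send(const Entry *methods, int n, const char *selector) {
    for (int i = 0; i < n; i++) {        /* runtime lookup, per send */
        if (strcmp(methods[i].selector, selector) == 0) {
            methods[i].method();
            return;
        }
    }
    puts("doesNotUnderstand:");          /* no matching method */
}

int main(void) {
    circle_vt.draw();                    /* slot fixed at compile time */
    send(circle_methods, 2, "draw");     /* looked up every time */
    send(circle_methods, 2, "explode");  /* falls through gracefully */
    return 0;
}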

Saying that, the original Adele Goldberg Smalltalk-80 books are still a worthwhile read. Bought my copy in 1983 and it still gets taken out every now and then. To see how a bunch of very bright people with amazing creativity and originality approached some particular system-level language problem. Like approaches to implementing meta-classes for execution, to pick one example. Forty years on and still the best book on the subject. An honorable legacy.

          1. swm

            Re: Smalltalk..the messaging killed it..

In my opinion, Smalltalk-76 was the best Smalltalk. Peter Deutsch (sp?) did Smalltalk-78 and had a JIT compiler in it. Both had a special character set. Smalltalk-80 used ASCII and lost something. I wrote microcoded floating-point for Smalltalk that sped floating point up by at least a factor of three. I also microcoded some commonly used methods.

Smalltalk-76 also had the best debugger I have ever seen. An error (generally "message not understood") resulted in a stack window. When expanded there were six panes. The top two were the stack trace window and a code pane with the piece of the code causing the error highlighted. You could click on any other stack frame to see the code and the highlighted piece.

            The next two windows showed the "class" variables with a window that could execute code in the class environment.

            The last two windows showed the local variables with a window that could execute code in the local environment.

            There were many options for restarting, continuing etc.

            The class browser was a pleasure to use for examining code or writing code. You could modify anything, including "system" code.

            The system was persistent so all of your windows etc. were restored when restarted.

            1. timrowledge

              Re: Smalltalk..the messaging killed it..

You do know that modern Smalltalks have all that and more in the debuggers, right? And the dynamic code generation is really pretty damn good at making things fast. It's all moved on quite a bit since '76. PIC message sends are about the same speed as 'normal' procedure calls.

          2. Michael Wojcik Silver badge

            Re: Smalltalk..the messaging killed it..

            Message dispatching to objects is one of those ideas that looks great on the lecture theater whiteboard but in a real world language that needs to get the most performance on constrained hardware (every real world application) anything other than static or vtable dispatching is a performance killer.

            Erlang seems to have solved the message-passing overhead problem nicely, since it was written for embedded (soft) real-time systems in the mid-1980s, when hardware was rather more constrained than it is now, and has been used successfully ever since.

            1. Anonymous Coward
              Anonymous Coward

              Re: Smalltalk..the messaging killed it..nope

Erlang is a functional RTOS language with primitives to match. A nice solution to a very specific problem. There is no class system support, like in Smalltalk. Or Java, etc. And all messaging as such is mainly process-based. So it's more like the language equivalent of QNX than any OO language or language package. But if you tried to write large Lisp-like applications in Erlang you would get Lisp-like performance numbers. Due to the nature of dynamic function dispatching, symtab overhead, etc. And GC.

    3. Fruit and Nutcase Silver badge
      Thumb Up

      Re: Smalltalk

      In addition to squeak and pharo already mentioned, two commercial offerings of note are Cincom and VAST (further development of IBM VisualAge Smalltalk)

      Both have significant use in key systems across a diverse variety of industries...

      https://www.cincomsmalltalk.com/main/industries/

      https://www.instantiations.com/visualage-to-vast/

      A potted history here...

      https://en.m.wikipedia.org/wiki/Smalltalk

  6. Eccella
    Big Brother

I worked for Rank Xerox, and at the Business Efficiency Exhibition at the NEC in 1980 or '81 we had a demo suite of the networked Star systems. To illustrate how a network would pass data between workstations, we had fluorescent tubes which flashed as something was done on one workstation and then sent to another workstation for editing. These were sit-down hourly shows. At nearly every session the comment would be passed: "Oh, I couldn't work in an office with all those flashing lights!"

  7. Stephen Wilkinson

Thanks for the reminder about Borland Delphi; we used it in the first year of the Software Engineering degree (along with Java), which seems a lifetime ago now.

  8. _andrew

    First? The Xerox and Symbolics and TI Lisp machines were contemporaneous

    They also had very similar (to the Alto, not necessarily the Apple follow-ons) windowing GUIs, mice, networking. Single user machines. Common Lisp and Interlisp had object systems. All put in a corner after the (first) AI winter.

    1. Arthur the cat Silver badge

      Re: First? The Xerox and Symbolics and TI Lisp machines were contemporaneous

Having had a Symbolics machine on/under my desk(*) and worked in Smalltalk as well, I'd say the Symbolics interface was more text-based than ST-80. Genera (the Symbolics OS) looked a bit like emacs on steroids.

      (*) Bloody noisy, like having a fan jet nearby.

      1. jake Silver badge

        Re: First? The Xerox and Symbolics and TI Lisp machines were contemporaneous

        "Bloody noisy, like having a fan jet nearby."

The early 1980s to mid 1990s was peak fan noise, when processor and bus speeds nearly outran the technology of air cooling. Lots of money went into developing heat-transfer technology in this time period. Such tech also helped with the physical downsizing of lasers etc.

    2. Liam Proven (Written by Reg staff) Silver badge

      Re: First? The Xerox and Symbolics and TI Lisp machines were contemporaneous

      [Author here]

      > The Xerox and Symbolics and TI Lisp machines were contemporaneous

      The first Symbolics hardware, the LM-2, shipped in 1981.

      https://en.wikipedia.org/wiki/Symbolics

      The Genera OS first shipped in 1983:

      https://wiki.c2.com/?GeneraOs

      That's a decade later than the Alto and contemporaneous with the Apple Lisa instead.

      Xerox' Lisp machine was the Dandelion, which is the underlying architecture of the Star -- the successor machine, not the Alto itself.

      TI's first Lisp machine was the Explorer, which also shipped in 1983. See the flyer for it here:

      http://classic.technology/texas-instruments-explorer-computer-system/

      You have your dates wrong. This stuff was a decade later than the early Alto demos. That's why the Alto is so significant, and that's why I wrote the article.

      1. Some Random Kiwi

        Re: First? The Xerox and Symbolics and TI Lisp machines were contemporaneous

        After the Alto was the D0 aka Dolphin, then the Dorado, the Dandelion (on which Star was first released), a Dandelion variant with a floating-point coprocessor and larger control store known as a Dandetiger, and the Daybreak.

        You can run Interlisp in your browser (via VNC) at http://online.interlisp.org/ or get the source for everything from https://github.com/Interlisp/ and compile the VM implementation (C, mostly POSIX syscalls, X11 for graphics). The resurrection is still a work in progress.

    3. Anonymous Coward
      Anonymous Coward

      Re: First? The Xerox /Symbolics / TI Lisp machines were contemporaneous..nope..not even close

Well, the Alto was in 1973 and the Star *shipped* in 1981, but the first usable Symbolics, the 3600, did not ship till 1983. As for Interlisp: not really in the running. And TI were very late to the game. When it was basically over.

      Plus you could get a Xerox Star for less than $35K in 1983 but the 3600 started at $100K plus. And those peripherals... You could buy a small car for the price of one of those small boxes. A nice car.

The 3600 did not have a GUI. More like a cut-down version of the Burgundy Book X Windows running emacs. As for object systems: Common Lisp had CLOS and Symbolics had Flavors. CLOS was much nicer. Especially if you have to implement a platform API support stack for it. In asm. Flavors was just weird and funky. I learned to hate emacs from using the 3600. OK, which Ctrl, Meta, Super, Hyper key combo with which of the three mouse buttons do you use to copy text from this buffer? It was just one of those machines I never ever warmed to. Unlike pretty much every other kit I have got my hands on over the years. By 1986 it was basically obsolete.

The custom Lisp machines were not killed by the AI Winter. By the time that winter arrived, those very expensive dinosaurs had already been killed off by the little mammals: PCs running Lisp language implementations on the host CPU or on ISA / NuBus accelerator cards. I worked for and shipped one of those mammals. Great experience. If only to know exactly why functional languages never took over the world. Except one area, by stealth. But only by pretending not to be a functional language. Hint: ECMA...

  9. disgruntled yank

    Stonehenge?

    Perhaps El Reg could treat us to an anniversary piece about the VAX 11/78, the Cahokia Mound of minicomputers.

    1. Liam Proven (Written by Reg staff) Silver badge

      Re: Stonehenge?

      [Author here]

      > the VAX 11/78, the Cahokia Mound of minicomputers

      I guess you meant the 11/780? That's the machine I learned VMS and Fortran on. :-)

      Even so, it's not my area of expertise, but I will think about it.

      1. disgruntled yank

        Re: Stonehenge?

        What a difference a digit makes... My mistake.

    2. Lil Endian Silver badge
      Pint

      Re: Stonehenge?

      Silbury Hill: PDP 11/73 running RSTS/E

      1. rnturn

        Re: Stonehenge?

        Heh. 11/70 running RSX-11M-PLUS and F77.

      2. Doctor Syntax Silver badge

        Re: Stonehenge?

        So what would be Long Kennet? IBM360?

        1. Lil Endian Silver badge
          Happy

          Re: Stonehenge?

          Maybe the 360 would suit Avebury's stone circle? I reckon West Kennet's Long Barrow would befit EDSAC.

          Who knew the neolithic had so much compute power?!

  10. Arthur the cat Silver badge
    Headmaster

    the language that begat C

    If one is being pedantic (see icon), BCPL begat B begat C. There used to be a joke about whether the next language would be D or P.

    1. Gene Cash Silver badge

      Re: the language that begat C

      I was sort of disappointed when I found out that BCPL didn't stand for Before C Programming Language

      1. Vincent Manis

        Re: the language that begat C

        It actually stands for “Basic CPL”, where CPL was a language that Christopher Strachey and his colleagues worked on during the 1960s and early 1970s. It would have been an excellent language, had its developers ever converged on an actual final specification for it. Martin Richards noticed that a small subset would be ideal as a systems programming language; he developed a highly portable compiler that could be ported by writing about 100 lines of Fortran/assembler/whatever for the target machine. As well, backends could be written to produce good-quality code for a range of machines. (I benchmarked it in the early 1970s as producing code for a couple of problems that ran about 1/3 faster than the same code on IBM's Fortran G (non-optimizing) compiler.)

        BCPL's downfall came because it worked best on a word-oriented machine, such as the IBM 7090/7094 on which it was first implemented. There was only one type, the word; you needed to add a minimal notion of data types to make byte or double-precision data work well. (This was exactly the reason that B was replaced by C.)

        1. prandeamus

          Re: the language that begat C

          Ah yes, I love the nostalgic references to BCPL on a Friday morning.

          The word-oriented thing caused me a lot of problems when I encountered the language as a student in 1981, trying to move code running under a proprietary operating system into something embedded, on a 6809 with memory-mapped I/O. I had programmed something like

          LET addr = 1234

          !addr := 99

          to poke the byte value 99 into the byte-addressed hardware location 1234. No, no, no, no, no, no, no. That poked the WORD 99 into the WORD address 1234, which was translated (often at runtime) into a byte address by shifting it one bit left. So although the compiler was able to use the nice 6809 instruction set, the generated code was full of left-shift operations. The correct solution in this case is of course

          LET addr = 1234

          0%addr := 99

          Using the super high tech byte indirection operator that was added after the Richards/Whitby-Strevens reference manual.
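          For anyone who never had the pleasure, the arithmetic is easy to sketch in C. This is just my illustration of the semantics described above - not anything the compiler actually emitted - assuming 2-byte big-endian words, as on the 6809:

          #include <stdint.h>
          #include <stdio.h>

          /* Byte-addressed store, 128K bytes: the most that 16-bit WORD
             addresses can reach once shifted left one bit. */
          static uint8_t store[131072];

          /* BCPL's '!' (word indirection): the hidden shift turns word
             address 1234 into byte address 2468. */
          static void word_store(uint16_t word_addr, uint16_t value)
          {
              uint32_t byte_addr = (uint32_t)word_addr << 1;   /* the culprit */
              store[byte_addr]     = (uint8_t)(value >> 8);
              store[byte_addr + 1] = (uint8_t)value;
          }

          /* BCPL's '%' (byte indirection): w%o is byte o of the byte vector
             at word pointer w, so 0%1234 really is the byte at address 1234. */
          static void byte_store(uint16_t w, uint16_t o, uint8_t value)
          {
              store[(((uint32_t)w << 1) + o) & 0x1ffff] = value;
          }

          int main(void)
          {
              word_store(1234, 99);    /* !addr := 99  -- a word at byte 2468, oops */
              byte_store(0, 1234, 99); /* 0%addr := 99 -- the byte at 1234, as intended */
              printf("byte 1234 = %u, bytes 2468-2469 = %u %u\n",
                     store[1234], store[2468], store[2469]);
              return 0;
          }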

          One of the little miracles of the mid-80s was getting BCPL to run on the 6502 for the BBC Micro (Richards Computer Products/Acorn). The standard compiler, ahead of its time in many ways, already had a number of machine-independent representations of compiled code: OCODE to power the native code generators and INTCODE for bootstrapping an interpreted version. These concepts were developed into a very compact intermediate code representation called CINTCODE, which was then interpreted. As a bonus, the interpreter had an address space of 64K words = 128K bytes. Custom versions were developed to map to shadow RAM and sideways ROMs of the later BBC models, allowing code to reach unheard-of sizes - graphical display systems of 80K of CINTCODE spread over five 16K ROM images, transparently switching in and out of the address space like magic.
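          The address-space trick is easier to show than to tell: if the VM's program counter and pointers are 16-bit *word* addresses, the interpreter reaches twice as many bytes as the 6502 could address natively. A toy fetch-execute loop in C - the two opcodes are invented, nothing like the real CINTCODE encoding - just to show the arithmetic:

          #include <stdint.h>
          #include <stdio.h>

          /* 64K words of VM memory = 128K bytes of host storage, even
             though every VM address still fits in 16 bits. */
          static uint16_t words[65536];

          enum { OP_HALT = 0, OP_ADD = 1 };   /* invented for this sketch */

          static void run(uint16_t pc)        /* pc is a WORD address */
          {
              uint16_t acc = 0;
              for (;;) {
                  switch (words[pc++]) {
                  case OP_ADD:  acc += words[pc++]; break; /* operand follows */
                  case OP_HALT: printf("acc = %u\n", acc); return;
                  }
              }
          }

          int main(void)
          {
              words[0] = OP_ADD;  words[1] = 40;
              words[2] = OP_ADD;  words[3] = 2;
              words[4] = OP_HALT;
              run(0);                         /* prints "acc = 42" */
              return 0;
          }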

          Thank you for reading this old man's TED talk.

          1. that one in the corner Silver badge

            Re: the language that begat C

            I was going to mention BBC Micro BCPL but you beat me to it :-)

            > Custom versions were developed to map to shadow RAM and sideways ROMS

            And now I want to hunt down those versions - as if I didn't have enough things on the list to find out about. Hmm, I know where the BBC Master *ought* to be, but I bet its capacitors are a bit dry...

            Almost forgot: it was BCPL that donated the pling operator to BBC Basic, as Cambridge Uni had a lot of BCPL knowledge. That knowledge was also shared with the Amiga, via the port of the Tripos (IIRC) OS when Commodore's own OS was not ready in time.

            1. prandeamus

              Re: the language that begat C

              Oh yes, we all called it pling. In BBC Basic of course, the address was a byte address but the unit of transfer was a 4-byte integer, right?

              As for tracking down the sideways ROM release, I doubt very much that it was a commercial product. Acornsoft did publish the basic compiler and full runtime, and two additional modules for floating-point math(s) and for generating standalone and ROMmable code.

              The brains behind the product was John Richards, who founded Richards Computer Products in Blewbury, Oxfordshire, near Didcot. Chris Jobson was the first employee and they ported the compiler and runtime first to Z80 CP/M systems and then to the 6502 and the Beeb. The Beeb version was a modest success: the ROM had the CINTCODE runtime (the VM), a standard runtime library to encapsulate BBC MOS calls, a RAM-based file system, and a simple version of what we would now call dynamic linking. The single disk contained the multi-pass compiler, assembler, a screen editor and a debugger, which was quite an amazing setup for 1984. You could even use the compiler with a cassette tape filing system if you were a masochist with time on your hands, but it was more sane to have two 80-track drives or whatever, since there wasn't much room to spare for source code on a single-sided 100K disk.

              I applied for a job there in '85, but declined John's offer at the time. After I notified him about a bug in the assembler he wrote back graciously, sending a bug-fixed version and a new job offer. I worked for several years on Reuters-based products. John was a very savvy businessman as well as being techie, a combination I now know to be rare.

              The sideways RAM and ROM thing was developed for a rack-mount version of the Beeb that Acorn built for Reuters around the time of the stock market "Big Bang". RCP (Chris Jobson, as I recall) wrote a modified runtime with "ISE" or in-situ execution. Word addresses < 32K mapped onto the native 6502 memory map. Addresses with the top bit set, in certain cases like global vector calls, were mapped to special-case code.

              GLOBAL $( foo:42 $)

              LET someproc () BE $(

              foo(99)

              $)

              This would grab foo from the global vector and map it to an entry point in a sideways ROM. Like overlays, but without having to manually load the code into RAM. There was a "build ROM" tool and we spent many happy hours rearranging each hunk of CINTCODE to fit into 5 ROM images.
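              From memory, and with invented names and an invented tag encoding rather than the real layout, a banked call through the global vector amounted to something like this trampoline (sketched in C rather than 6502 assembler):

              #include <stdint.h>
              #include <stdio.h>

              /* Sketch of "ISE" dispatch: a global-vector entry with the top
                 bit set is not a plain word address - it names a ROM bank to
                 page in before the jump. Not RCP's code; illustration only. */

              typedef void (*entry_fn)(uint16_t arg);

              static int current_bank = -1;

              static void select_bank(int bank)   /* stand-in for writing the */
              {                                   /* Beeb's sideways-ROM latch */
                  if (bank != current_bank) {
                      current_bank = bank;
                      printf("  [ROM image %d paged in]\n", bank);
                  }
              }

              static void rom_foo(uint16_t arg) { printf("  foo(%u) runs in ROM\n", arg); }

              /* Global vector: top bit set => banked entry, low bits => bank. */
              static struct { uint16_t tag; entry_fn fn; } globals[64];

              static void call_global(int g, uint16_t arg)
              {
                  if (globals[g].tag & 0x8000)        /* banked entry point? */
                      select_bank(globals[g].tag & 0x7fff);
                  globals[g].fn(arg);                 /* like foo(99) above */
              }

              int main(void)
              {
                  globals[42].tag = 0x8000 | 3;       /* foo lives in image 3 */
                  globals[42].fn  = rom_foo;
                  call_global(42, 99);
                  return 0;
              }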

              Bottom line: fantastically neat implementation, but internal only. If the dot-matrix printouts of the user guides for the Reuters stuff survive after nearly 40 years I'd be amazed. You could seek out RCP or its successor companies. They were in Didcot about 15 years ago; I have not been in touch since then. I doubt anyone there will remember me, not least because I changed my surname on marriage. I hope that helps.

          2. Vincent Manis

            Re: the language that begat C

            *Grumpy* note. I invented the % operator for byte subscripting, and made the changes to the compiler and System/360 code generator to make it work. This work was communicated to Martin Richards without my name on it, and he in good faith incorporated it into his distribution.

            Sorry, I just had to vent. :)

            1. prandeamus

              Re: the language that begat C

              Pleased to make your acquaintance, oh byte operator person!

              ! was both a unary and a binary operator

              % was only binary, so as I recall it would only work as 0%byteaddress rather than %byteaddress

              Assuming my memory hasn't gone to pot, that's my forty-year-old chip on my shoulder :)

              1. Vincent Manis

                Re: the language that begat C

                Dyadic (binary) only was a design decision. I did this on an IBM System/360, and addressing arbitrary bytes wasn't something you needed. So in w%o, you start with a word pointer w, and add a byte offset o. I also had in mind word-addressed machines (which still existed in those days), where accessing an arbitrary byte in memory was nonsensical. (As much as I can capture my thoughts from 50 years ago.)

        2. Arthur the cat Silver badge

          Re: the language that begat C

          BCPL's downfall came because it worked best on a word-oriented machine

          There was a derivative of BCPL (whose name escapes me right now) that was targeted at (and could run on) 8/16 bit micros. The syntax was even more minimal than BCPL, addresses were byte addresses rather than words, and it had byte and word load/store operators. Our 2nd years used to use it when learning about micros. This is ~1980.

  11. Martin Gregorie

    I remember seeing a Star in 1984

    Back then Logica had a few networked units in its HQ offices, where they were being used for secretarial purposes. I only saw them once.

    However, I don't remember being particularly surprised because I'd already seen, and been able to have a brief play with, a very similar machine at an internal BBC Future Equipment exhibition: in this case it was an ICL Perq, which was equipped with a mouse, keyboard and vertically oriented monochrome white screen, big enough to display an A4 page image at close to actual size and with considerably better resolution than any other display I'd seen at the time. There's more on the Perq here: https://en.wikipedia.org/wiki/PERQ

    There was a lot of good stuff in that exhibition, but for me the top items were the Perq and the NRDC Surround Sound system, which beat the pants off any other surround sound system I've heard since. Its 'presence' control was amazing: it let you pick your position in the concert hall, from being near the musicians (bright sound, little audience noise) to a softer sound surrounded by coughers, foot tappers and shufflers as you moved your point in the sound field toward the rear of the venue. Sadly, the NRDC system got killed off soon after that: a great pity, because it was well suited to FM broadcasting.

    1. James Anderson

      Re: I remember seeing a Star in 1984

      Had the pleasure of working with the successor system called Viewpoint.

      It had what I still consider the best word processor. Miles ahead of anything you could run on MS-DOS or Unix.

      As far as I know there were only two customers in the UK: Xerox themselves and the Inland Revenue.

  12. Doctor Syntax Silver badge

    Somewhere on YouTube there's a series about a computer museum refurbishing an Alto (spoiler alert - they started by replacing the PSU electrolytics). IIRC the hardware looped through a number of phases, mostly handling the peripherals; only one of them actually ran code, by emulating a DG Nova in microcode. It's well worth finding & watching.

    1. Liam Proven (Written by Reg staff) Silver badge

      [Author here]

      I think you might mean CuriousMarc's video series?

      https://www.youtube.com/playlist?list=PL-_93BVApb58I3ZV67LW3S_JEMFnDrQDj

      It's Alan Kay's _own_ Alto. Xerox let him keep one.

      Coming soon should be a piece on one of the more impressive offspring of Kay's work, from an interview I conducted last night. The chap suggested I could go meet Dr Kay, who now lives in London... (!) Sadly I am quite far from London, but this may yet happen.

      I am also working on an interview with Don Hopkins, the man who coined the word "copyleft", of NeWS fame among many other things.

      1. Gene Cash Silver badge

        I met Dr. Kay once, back in 1982, at the Atari Computer Camp in Asheville, North Carolina. They rented a Univ of N. Carolina dorm for the summer. It was in the Great Smoky Mountains and absolutely eye-wateringly beautiful country. It was quite an experience for a kid from an illiterate redneck-ville of only 8K population. I was also randomly picked to be interviewed by Jane Pauley for the TODAY! show. I happened to be standing nearest the camp counselor when he said "we need a kid for the TV thing". They didn't sync the camera with the computer monitor, so I'm pointing to a completely blank screen and describing things that aren't there.

        Anyway, he was Atari's Chief Scientist at the time, and I showed him how I hacked up a program to command the 810 floppy disk drive by talking directly to its CPU. He was impressed and offered me a job at Atari, but unfortunately there was the Great Video Game Company Crash of '83, Atari was dismembered, and Kay left to join Apple.

        I would REALLY like to hear an El Reg article on NeWS if someone could swing it. I always thought a UI on top of Display PostScript was a huge idea, strangled to death by Sun keeping it tightly proprietary, and so now we're STILL stuck with X11.

        1. Michael Wojcik Silver badge

          Having used both DPS (NeWS and otherwise) and X11, I'm happy with the outcome, personally. Why do you prefer DPS?

      2. Doctor Syntax Silver badge

        "I think you might mean CuriousMarc's video series?"

        Yup, that's the one.

        "The chap suggested I could go meet Dr Kay ... this may yet happen."

        I'll look forward to reading that.

        1. 9Rune5

          CuriousMarc

          It sounds like your encounter with CuriousMarc's content was a brief one.

          There is an immense amount of divine content on his channel. Everything from restoring the Apollo Guidance Computer to various mainframe bits and pieces.

  13. Bitsminer Silver badge

    Star vs VT100

    In the office I worked at in the early 1980s, there was a (lonely) Xerox Star used by the secretaries.

    Meanwhile we working folk were using VT100s and PDP-11s just down the hall.

    Such a contrast in technology between the visioneers (Xerox) and the engineers (at DEC).

  14. karlkarl Silver badge

    This was an interesting read. So Steve Jobs was apparently blinded by the GUI (and thus overlooked some of the more interesting tech at the time).

    This is all too true of some of the new beginners coming towards Linux. They are so hung up on the GUI and all the (admittedly crummy) desktop environments, WMs, compositors, etc. that they are missing out on some of the really useful learning opportunities.

    1. yetanotheraoc Silver badge

      figure of speech

      "Blinded by GUI" means he saw the desirability of a sexy GUI for the end users. "This is all too true with some of the new beginners coming towards Linux. They are all very hung up about GUI..." See? Jobs was, and still is, correct!

      Commenting on the Remarkable 2, @Arthur the cat writes, "when doing anything that requires typing I usually need a browser open as well". Most of my own work involves a terminal side-by-side with some random other GUI application. Such a low bar is why I get on equally well with Windows, Mac, and Linux. The iPad also can do a split screen, although the command line is a third-party sandboxed app meant for ssh, so it's of limited usefulness in a local-machine scenario. For my purpose even the "dated" Windows 3.1 was good enough. https://forums.theregister.com/forum/all/2023/03/16/remarkable_launches_type_folio_keyboard/#c_4637060

      But all that means nothing to the GUI-has-to-be-sexy folks, and Jobs got that. Blinded by the light.

    2. Anonymous Coward
      Anonymous Coward

      Blinded by GUIs? Really?

      So I'd guess that you never actually wrote any commercial software for the 99.99% of ordinary users who are locked out by CLIs. Or wrote a CLI application before GUIs took over.

      "Learning opportunities"? The vast majority of users want to get stuff done, get their work done; they don't want to be forced to learn arcane and pointless trivia because some programmer was too arrogant or lazy to think of others - the people who might want (or are forced) to use the software.

      Plus GUIs in Linux hide all the crap that was mostly a big waste of time when we were forced to do this stuff back in the 1970s and 1980s. You know, 50 years ago. And usually it was done far better 40/50 years ago. Every time I've been forced to descend to the Linux command line (while trying to get real work done), when I've finally waded through the swamp and worked out what magic incantation solves the problem, it's always been - well, that's bloody stupid - it was far more straightforward to do that in VMS, RT-11, RSTS/E, MVS etc. Or even System V / BSD or Solaris for that matter.

      So no missed "Learning opportunities". Quite the opposite.

  15. Sparkus

    There is a Star emulator available

    on GitHub. It's functional and only slightly rough around the edges.

    https://github.com/livingcomputermuseum/Darkstar

  16. ian 22

    What about MESA?

    I was part of the team developing the 8010, using the Alto. What a revelation! Although the cartridge disk drive only had a capacity of 2MB, it was adequate for the time.

    While I never used BCPL, I did use MESA.

    The network speed was (I think) 10Mb/second, and it was quite fast until it was 'productized', with multiple added envelopes in each packet.

    The Computer Museum in Seattle has a working Alto, and an 8010.

    1. Arthur the cat Silver badge

      Re: What about MESA?

      The network speed was (I think) 10Mb/second

      ISTR the earliest form of Ethernet was 3 Mb/s, and 10 Mb/s was the massive improvement that came later. All on fat coax with vampire taps.

      1. Roland6 Silver badge

        Re: What about MESA?

        The ‘laugh’ with network speeds back then was that they were comparable to system bus speeds and thus there were examples of using the network as a bus extension.

  17. Anonymous Coward
    Anonymous Coward

    OO, Function Calls, Message Passing.....

    Smalltalk -- OO + Message Passing

    Python -- OO (or not, depends on programmer) + Function Calls

    C++ -- OO (or not, depends on programmer) + Function Calls

    .....and then there's Compiled vs Interpreted.......

    .....and finally.....Who Cares? What matters is whether the final application is useful, supported, maintainable....as judged by the end user!

    Just saying!

    1. that one in the corner Silver badge

      Re: OO, Function Calls, Message Passing.....

      Well said.

      Might add "performant" but guess you could roll that into "useful", or at least "useable".

      But definitely it is "If it works for the End User" that matters (even if he sometimes ... or ... and occasionally ... mutter, mutter).

  18. Nitromoors

    Missed Links

    Well that admission just goes to show that Jobs really was a man of Form over Content!

  19. Wayland

    Imperial College London

    Back in 1979 I went on a school outing to Imperial College London. They were demonstrating two of these PARC machines. They were doing card tricks on screen with a pack of overlapping cards in monochrome. I was very impressed.

  20. brunob

    ICL PERQ and the fourth thing

    I remember having the ICL PERQ (https://en.m.wikipedia.org/wiki/PERQ) demonstrated to our company some time in the early 1980s.

    Clearly a clone of the Alto, but I contend that the fourth thing the Alto 'invented' was turning the screen so that it was the same shape as a piece of paper, so you could see exactly what a whole page looked like before printing. I was blown away by the simple idea. I still regularly have one screen rotated for the exact same reason.

  21. ske1fr
    Windows

    Refreshes my memory

    "Alan Kay: Face Values" in PCW, 1987. Blew my mind, the discussion on Jerome Bruner's enactive, iconic and symbolic representation model was somewhat influential.

  22. vcragain

    As a programmer going from mainframe to PC in this startup timeframe, it became increasingly frustrating to me that so many languages were developed for use on these platforms - it became obvious after a while that there was no way to keep up with all of it. I tried for a while, then got really annoyed with all the brains spouting competing code frameworks - I wanted to be competent, and all of you were busy inventing things I did not have the time to learn - so disheartening, yet wonderful to see. The world is now way ahead of my retired brain, and I run into trouble now with those stupid, stupid little screens you are all running your lives with! I have given up! Oh well, it was a fascinating time to be alive!

  23. Chris Evans

    Three mouse buttons!

    I see the mouse had three mouse buttons! I wonder how they were designated?

    On RISC OS it is SMA (Select, Menu & Adjust).

    On Windows: select; a program-specific feature; and display a shortcut menu (or another program-specific feature).

    I don't know how other OSes designate them.

    1. swm

      Re: Three mouse buttons!

      Every program had its own interpretation of the buttons. Sometimes in conjunction with the shift and control keys.

      There was also a 5-key "chord" device.

    2. timrowledge

      Re: Three mouse buttons!

      Smalltalk was always select-menu-windowmenu until Windows got common enough to start warping people’s minds. So these days it tends to be select on main button, menu on secondary, window on tertiary or alt+secondary etc.

      Fun bonus fact - I did the original port of Smalltalk to ARM (it was a launch-day language for the CPU) and had many discussions with the RISC OS team about mouse use & behaviour, menus, etc. I *still* make a living using Smalltalk on ARM.

  24. Blackjack Silver badge

    So is there any article about the whole OS/2 / Windows NT thing in the Reg? Even better, it would be a good theme for a podcast.

    1. Liam Proven (Written by Reg staff) Silver badge

      [Author here]

      > the whole OS/2 / Windows NT thing

      What have you got in mind?

  25. Pete Sdev Bronze badge
    Thumb Up

    Another thing

    One thing that strikes me with the Alto that they also got right was the portrait orientation of the screen. Which if you're mostly working with documents makes perfect sense. Something that was ignored for decades until tablets came along (and out of ergonomic necessity smartphones).

    I'm not sure when it became possible to rotate the display output on X Windows (xrandr?), though it wouldn't have been practical until flatscreen monitors became widespread.

    1. Mage Silver badge

      Re: Another thing

      There was at least one make of rotatable CRT in the 1990s.

      Manual changing of resolution was possible before either auto-rotation by signalling* from the screen, or the 0/90/180/270 (or similar) settings, were added. Mac, Windows & Linux had this with CRTs in the late 1990s.

      (* I've seen USB and VGA cable signalling).

      1. ThomH

        Re: Another thing

        There was also at least one first-party Macintosh portrait display in the 1980s, in addition to the various Radius models others have mentioned. I guess they're artefacts of monitor sizes at the time and the 1980s Macintosh's commercial niche of desktop publishing.

  26. This post has been deleted by its author

  27. A_Badger_Broke_My_Wrist=>https://youtu.be/iXdrFfpn3Is

    Xerox got Xeroxed? Thaaaaat's karma! ;)

    It still looks like a somewhat modern design - have straight lines always signalled "modern"? Pillars vs caves, Arial and Helvetica vs Times New Roman, today's multithings' totally flat and straight with smallish rounded corners vs the Alto's slightly curved but kinda straight lines? Are we heading for a world filled with devices all as utterly flat and sharp-cornered as Kubrick's monolith from 2001? Will it feel like the future when, someday, everything is made that way? And when it finally looks and feels like the future, will that signal the end of human life on earth, as we sense within our collective subconscious that we've reached the zenith of all technological progress possible here, having doomed life on this planet? I expect most everyday objects which feature in our colonies elsewhere will be round and become progressively more monolith-like, as the cycle begins again, and continues until all objects in the entire universe are made oblong. And that's all the meaning there is to your life. (Space telescopes should be looking for polygons made by competing alien civilisations.)

  28. Paul Hovnanian Silver badge

    Steve Jobs memory

    "Now, remember it was very flawed. What we saw was incomplete. They had done a bunch of things wrong, but we didn't know that at the time."

    And then Microsoft grabbed the market for doing things wrong and ran with it. And Apple never caught up.

  29. martinusher Silver badge

    It's all surprisingly logical

    I've spent my entire working life (more or less) developing computers or things to do with computers, and if there's one thing that could describe what it was like, it would be running up a steep hill covered with loose ground. You were always battling not-quite-there technology with never quite enough resources (money).

    Cue old Monty Python sketch that includes "uphill in the snow -- both ways", of course. But it really was like that, you could never get budget for anything. My earliest contact with micros was in the early 70s when my immediate boss wanted to use an 8008 in an instrument. Great idea, but the sort of thing that caused a 'sucking noise between the teeth of his boss'....too expensive, obviously. It carried through another job where pencil and paper were cheap (or rather engineers were grossly underpaid) but design software wasn't. By the late 70s/early 80s I had at least an Intel MDS to work on but it cost about five times the cost of my (5 bedroom) house - weighed a ton, too. The attraction of living in the US became very great because they could actually afford parts (and the garages here are about the size of a typical British industrial unit). But even there the same 'design out of the price list' mindset was rampant with skills and vision being definitely in short supply compared to the UK (but the weather's a lot better and, hell, they're sponsoring my visa so who am I to complain?).

    Now we're positively awash in cheap processing power, more memory than we should ever be able to use, and communications to every desktop (and pocket), so things should be all rosy. They're not, of course -- quite apart from corporate wrangling about the spoils (rounded corners, anyone?), we seem unable to use the resources at our disposal adequately. Software is inefficient, buggy and often a major pain to use, and it, along with communications bandwidth, seems to be mostly used for meaningless fluff and unwanted, intrusive ads. (Cue sketch again... but seriously, was it really worth it?)

  30. Bump in the night
    Happy

    Idealism

    Am I misremembering my youth or is it really in such stark contrast?

    A shiny future of optimism and idealism, at least in the tech world, versus what we have now: cyber wars, greed and needless feature-itis.

  31. Anonymous Coward
    Anonymous Coward

    I spent four weeks in the mid-'80s at Xerox PARC learning about Smalltalk, then returned to the UK to write a complete network management front-end suite.

    It worked, drew graphic images of network topology, highlighted nodes with faults and trending minor errors, and allowed you to re-configure nodes on the fly.

    Most fun I ever had programming. Still playing with Squeak.

    1. timrowledge

      I hope you’re on the Squeak mailing list!

  32. avu2

    Interpress

    For good balance we may remember a fourth pillar of the cathedral built at Xerox PARC: the Interpress printed page description language, whose authors later founded Adobe, with the ubiquitous PostScript of our PDFs.

    At my company location all the existing IBM Selectric typewriters were replaced with those high-end Xerox Star document processing workstations, able to display a WYSIWYG (another first) rich-text full page with typographic-grade typefaces and print it at high speed / high quality on any Ethernet-connected Xerox photocopier. It was supposed to help the main Xerox business but that did not happen: too early, too expensive.

    1. Liam Proven (Written by Reg staff) Silver badge

      Re: Interpress

      [Author here]

      > the Interpress printed page description language

      This is true, but Interpress didn't come out of the Alto. It is one of the most influential things to come out of the later Star line of computers.

      Citation: https://www.daube.ch/docu/pdl02.html
