What is this computing industry anyway? The dawning era of 32-bit micros

This is the third part of The Register FOSS desk's roundup of some of the more memorable missteps and could-have-beens from the beginnings of the microcomputer industry until today. In part 1 we looked at some of the bad decisions of the eight-bit era, and in part 2 the more expensive mistakes of the 16-bit era. The 32-bit …

  1. Bebu
    Windows

    All about software

    As I think was noted in part 1, the actual hardware became less important than the (application) software users wished to run.

    Although some of the software that users want to use makes me think they deserve all the Windows shenanigans they have to endure. :)

    Unfortunately the ultimate non-starter in the 32-bit race was Unix for the desktop - both the 386 and 68020 had ports of Unix, although the advent of the 386 was instrumental to Linus's creation of Linux (arguably a Claytons' Unix ;)

    Looking forward to the Itanic and amd64's ambush in part 4. :)

    1. Mage Silver badge

      Re: 386 and 68020 had ports of Unix

      There was even a sort of UNIX (MS's Xenix?) for the 286. One big Wang 286 box ran it and needed an extra PCB for PC compatibility to run DOS.

      The Archimedes came with RISC OS, but I think it also had a UNIX in 1987.

      There was even Cromix from Cromemco for their hybrid Z80/68000, maybe 1985 or 1986.

      1. Blue Pumpkin

        Re: 386 and 68020 had ports of Unix

        Not forgetting the original SCO - we used to run it on Compaq 386 laptops in preference to Windows

        1. Doctor Syntax Silver badge

          Re: 386 and 68020 had ports of Unix

          SCO was Xenix - they were the chief resellers & then took over the project. The problem was that it was too expensive. If they'd aimed at selling in quantity at a competitive price I doubt we'd be seeing Windows and certainly not Linux today.

          1. fromxyzzy

            Re: 386 and 68020 had ports of Unix

            SCO never really innovated or improved Xenix/SCO Unix/OpenServer after they took it over. They even wound up purchasing Unixware from Novell and doing very little with that other than maintenance. The driver situation was particularly appalling.

            The reason is that they were only interested in massive corporate distribution and the hefty maintenance contracts that came with it. By the 90s, Xenix and its descendants provided the point-of-sale backbones for a lot of retailers and lots of commodity small-to-medium business servers. However, they put no thought into potential competition and just assumed that customer lock-in would ensure at least a couple more decades of easy money, just as it had up until that point. They did not think that companies would ever be willing to consider using Linux, even as companies like VA Linux and Red Hat started selling and supporting hardware for the growing market and IBM began spooling up programs to investigate the potential.

            So within a few years of their full takeover of the license for Xenix they were on the back foot, then absolutely on the skids, then reduced to desperate lawsuits accusing Linux of intellectual property theft.

            1. Michael Strorm Silver badge

              Re: 386 and 68020 had ports of Unix

              > So within a few years of their full takeover of the license for Xenix they were on the back foot, then absolutely on the skids, then reduced to desperate lawsuits accusing Linux of intellectual property theft.

              Bear in mind that the "SCO" that gained infamy for the anti-Linux lawsuits *wasn't* the original company. The original "Santa Cruz Operation" sold off its declining Unix division in early 2000- and with it the rights to the SCO name- to Caldera, which later renamed itself SCO Group.

              The lawsuits started a couple of years later under Darl McBride, who had been with Caldera for several years before the buyout and had nothing to do with the original SCO.

              1. anothercynic Silver badge

                Re: 386 and 68020 had ports of Unix

                And if I remember correctly, Caldera started off as a Linux shop (selling OpenLinux) around the same time Red Hat did, and then spotted an opportunity to get the 'UNIX' name when the original SCO offered its Unix division for sale. By getting Xenix and UnixWare, they thought they would have it all, and then McBride started his lawsuit crusade.

                Novell eventually piped up and said "no, us selling you Unixware didn't mean you owned 'UNIX' as a trademark, we still own that" and that rolled on until the SCO-formerly-known-as-Caldera went pop.

            2. Michael Wojcik Silver badge

              Re: 386 and 68020 had ports of Unix

              > SCO never really innovated or improved Xenix/SCO Unix/OpenServer after they took it over.

              I don't think that's fair. They essentially re-implemented Xenix on top of SVR3 and the iBCS ABI. With ODT they added X11, NFS, and other major features. They added in Merge 386 for running MS-DOS applications (though that was developed by Locus and had already been ported to some other UNIXes; it wasn't a SCO project); and then later incorporated Platinum's Merge 4 with Win95 support.

              In subsequent years they merged in a number of features from SVR4.

              Even before the Calderafication, there was talk at Real SCO of merging OpenServer and Unixware. That didn't actually happen until the SCO Group body-snatcher era, though.

            3. Alan Brown Silver badge

              Re: 386 and 68020 had ports of Unix

              "The reason is that they were only interested in massive corporate distribution and the hefty maintenance contracts that came with."

              And as the wheel of time has turned, today that's an accurate description of Red Hat.

      2. ChrisC Silver badge

        Re: 386 and 68020 had ports of Unix

        Acorn definitely had a Unix running by the start of the 90s at least, because when I was touring universities in 1990 deciding where to apply to, I remember one of the computer labs in one of them being filled with what *looked* like the A310 Archie I was familiar with, but on closer inspection turned out to be running some version of Unix.

        1. richardcox13

          Re: 386 and 68020 had ports of Unix

          The uni temporarily had a test unit. The problem was that it was supplied with insufficient memory to get reasonable performance, even for a single user.

          This would have been 1990 (just before I graduated).

        2. druck Silver badge

          Re: 386 and 68020 had ports of Unix

          That would have been the R140.

          1. timrowledge

            Re: 386 and 68020 had ports of Unix

            I had an R260(?) with 8Mb for a while, part of working on the Active Book stuff. I think. It was an awful long time ago. And the Active Book had a sorta-Unix at bottom.

        3. J.G.Harston Silver badge

          Re: 386 and 68020 had ports of Unix

          My first "proper" job after fleeing the UK was working with networked Acorn/PC/Unix systems which included RiscIX systems.

          ....wish I'd taken photos....

      3. toejam++

        Re: 386 and 68020 had ports of Unix

        There was also Coherent from Mark Williams, a Unix V7 clone that ran on the PDP-11, Z8000, MC68000, and 8086 (though Coherent V3 required a 286 and V4 required a 386).

    2. Liam Proven (Written by Reg staff) Silver badge

      Re: All about software

      [Author here]

      > (arguably a Claytons' unix. ;)

      Could you explain what that means?

      1. Paul Kinsler

        Re: Claytons' unix

        ... i.e. the unix you have when you're not having a unix:

        https://en.wikipedia.org/wiki/Claytons

        1. Liam Proven (Written by Reg staff) Silver badge

          Re: Claytons' unix

          Aha! Thank you!

          (I *did* Google before asking...)

      2. Dave 126 Silver badge

        Re Clayton's Unix

        It looks to be this:

        Never bet against the cheap plastic solution. Or, equivalently, the low-end/high-volume hardware technology almost always ends up climbing the power curve and winning. The economist Clayton Christensen calls this disruptive technology and showed in The Innovator's Dilemma [Christensen] how this happened with disk drives, steam shovels, and motorcycles.

        - http://www.catb.org/~esr/writings/taoup/html/ch02s04.html

        The OP might have been aware of the convention of referencing economists and others by their surname rather than by their first name, and then chose to ignore that convention.

      3. HarleyBird

        Re: All about software

        Clearly Bebu is Australasian ...

        Clayton's was "the drink you have when you're not having a drink" way back in the day.

        https://en.wikipedia.org/wiki/Claytons

    3. Anonymous Coward
      Anonymous Coward

      Re: All about software

      "Looking forward to [...] part 4. :)"

      Me too! Cool series of articles ... and looking forward to Part 0 as well, about two-bit computing (same as it ever was, or wasn't), and 4-bit (into the blue again, for the first time in 4004).

      Mind you, with the way things are going in "AI", INT4, FP4, and two-bit might be making a Once in a Lifetime comeback, and so Part 5 could really loop back to Part 0 ... or, as David Byrne would say: My God, what have we done?

  2. Christopher Reeve's Horse

    Meanwhile, at home...

    All very business orientated, but you've also got a generation of people who grew up with 8-bit and then 16-bit machines, who loved the variety and mix of productivity applications, programming, and gaming - and had got used to GUI OSes courtesy of the Amiga 500 and the Atari ST.

    By 1993 or so these machines were long in the tooth, and once desktop PCs started emerging with 'multimedia' capabilities (SoundBlaster cards, CD-ROMs, etc.) and a basic GUI OS in the form of Windows 3.1, the transition into becoming aspiring PC owners was the only reasonable proposition. At the time the equivalent Mac was much more expensive and significantly less capable. I think the consumer space was a huge influence on what was also going on in parallel in small businesses and offices.

    1. MonkeyJuice Bronze badge

      Re: Meanwhile, at home...

      I remember being an Amiga 500 kid, confident in the knowledge that I had a seriously good piece of hardware, until one day I walked into a video game store and on the screen they were playing Alone In The Dark on probably a 386DX. It became immediately apparent that my blitter had finally become obsolete. I ditched 68k asm for x86 the following year. The shift to segmented memory and specialised registers for everything was... a learning curve.

      1. werdsmith Silver badge

        Re: Meanwhile, at home...

        Alone in the Dark was such a pain. To run it, it was necessary to have a separate config.sys with all unnecessary stuff stripped out and everything else loaded into high memory. And the same with autoexec.

        Switch these two files over and reboot before running Alone in the Dark, then switch it all back and reboot before going back to general use.

        1. Patrician

          Re: Meanwhile, at home...

          The joys of a multi-config config.sys and autoexec.bat (roughly the sort of thing sketched after this list); I remember running the below:-

          1. Using a QEMM config

          2. Using Himem

          3. Standard

          4. Stripped out bare minimum for Ultima 8

          5. Using unidriver.sys for Sim City 3000 and Transport Tycoon (my graphics card at the time had the capability to run VESA, but some games, for some reason, just would not detect that capability, hence the use of unidriver.sys)
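
          Something like this, from memory - the menu names, paths and drivers below are illustrative rather than my actual files - using the MS-DOS 6 [menu] blocks in config.sys:

              [menu]
              menuitem=QEMM, QEMM config
              menuitem=HIMEM, Himem/EMM386 config
              menuitem=BARE, Bare minimum for fussy games
              menudefault=HIMEM, 10

              [common]
              files=30
              buffers=20

              [QEMM]
              device=c:\qemm\qemm386.sys ram

              [HIMEM]
              device=c:\dos\himem.sys
              device=c:\dos\emm386.exe noems
              dos=high,umb

              [BARE]
              device=c:\dos\himem.sys
              dos=high

          ...and autoexec.bat branching on the CONFIG variable that DOS sets from the menu choice:

              @echo off
              prompt $p$g
              path c:\dos
              goto %config%
              :QEMM
              c:\qemm\loadhi c:\drivers\mouse.com
              goto end
              :HIMEM
              lh c:\drivers\mouse.com
              goto end
              :BARE
              rem nothing extra - every last byte of conventional memory for the game
              :end

          Before DOS 6 you didn't get the menu, so it was back to the swap-two-files-and-reboot routine described above.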

    2. Anonymous Coward
      Anonymous Coward

      Re: Meanwhile, at home...

      The Mac around 1993 was definitely more expensive, but it was more capable in many ways - they all had audio built in, for example. High res scanners, networking, some Plug and Play too. So yeah, Macs were pricey. And seldom seen outside of schools or Graphic Design outfits.

      And many domestic households only had one computer of any flavour, so it had to be suitable for the parents (who at work were more likely to use a DOS or other text-based computer system), and for Little Johnny who wanted to play games, of which more were available on DOS than on Mac - that's if Lil Johnny didn't already have an Amiga or Megadrive, machines that boasted game libraries to turn a PC gamer green with envy.

      But then as you noted, the capabilities of PCs and Macs continued to grow, especially in the area of 3D games.

      By this time though, Lil Johnny might have a say in what computer he uses at home, and any money spent on thoughtful Apple features was money not spent on commodity computer power, which translated directly to being able to play the latest games at frame rates above a stutter.

      1. MonkeyJuice Bronze badge

        Re: Meanwhile, at home...

        The issue I found was even in 1999, QuickDraw wasn't.

      2. ThomH

        Re: Meanwhile, at home...

        Say it quietly, but the Mac is the best way to play many of the small subset of titles that did get ported in the early-to-mid-'90s exactly because it didn't have a VGA-equivalent low-resolution mode. So a bunch of games run at 640x480 rather than 320x200; good examples are things like Chuck Yeager's Air Combat and Prince of Persia. Though bad examples also abound where they just pixel doubled the art — Flashback springs to mind.

  3. Henry Hallan

    ARMed and Ubiquitous

    The part of the story you've missed is that the project that Furber and Wilson worked on in a downstairs room in the old Fulbourn Road waterworks turned out to be the dominant CPU for ... well, pretty much anything with a battery.

    Which means phones. So. Many. Phones.

    Apple might not have stuck with ARM - although I do wonder what pressure Intel exerted - but ARM devices are now in pockets around the world

    But that seems to be a common theme: the right decision but a decade early

    1. Liam Proven (Written by Reg staff) Silver badge

      Re: ARMed and Ubiquitous

      [Author here]

      > turned out to be the dominant CPU for ... well, pretty much anything with a battery.

      Yup. How does that fit in a list of greatest mistakes, then?

      I'd call it the greatest single comeback in the history of the computer industry, myself.

      My comment was that Acorn perhaps killed off its workstations too soon. I could have gone into _why_ at greater length, perhaps, but I thought the Be tie-in would speak to non-Brits a bit more.

      For my money there were 2 possible angles:

      1. Laptops. They could have made thinner, lighter, more powerful, longer-lasting, cooler-running laptops than anyone else in the late 1990s. But they barely tried with the A4, and the NewsPad was never released.

      https://chrisacorns.computinghistory.org.uk/Computers/NC.html#NewsPAD

      2. Galileo and its implausible ambitions of building a radical next-gen OS in-house, rather than partnering with anyone else. I even tried out a RISC PC running Taos, which was astounding, but that never went anywhere. Taos was also in the running for a next-gen Amiga and if that Amiga had been Acorn-based that would have been a lovely idea... if probably not a lucrative one.

      https://arstechnica.com/gadgets/2018/03/a-history-of-the-amiga-part-12-red-vs-blue/

      > although I do wonder what pressure Intel exerted

      Back then? None at all, I'd think. Apple's dalliance with Arm on the desktop was about 20 years before desktop Macs went Intel.

      > the right decision but a decade early

      That I agree with!

      1. Henry Hallan

        Re: ARMed and Ubiquitous

        Mistake, no. "Could-have-been?" Absolutely so!

        Of all the quirky and oddball things I remember in my forty-mumble year career in IT and telecoms, ARM is the most incredibly successful - and the reason it succeeded against incredible odds is because it was so outrageously better than the competition

        ARM is the hardware equivalent of Linux in that regard. It came from a nowhere budget and it is driving the corporate competition slowly but surely to oblivion

        Of course my own memories are coloured by a short stint working in that old waterworks. Good times

        1. Alan Brown Silver badge

          Re: ARMed and Ubiquitous

          ARM's single biggest problem at the moment is that Uncle Sam has claimed control of who is allowed to have it

          That will be its downfall

      2. toejam++

        Re: ARMed and Ubiquitous

        > Laptops. They could have made thinner lighter more powerful longer-lasting cooler-running laptops than anyone else in the late 1990s.

        If only the DEC StrongARM had been on the market when Apple was looking to replace the Motorola 680x0 with a RISC processor.

        In the early 1990s when Apple was deciding between ARM, MIPS, PowerPC, and SPARC, the ARM processor was very under-powered in comparison. Laptops weren't nearly as popular then as they are now, so desktop usage was the primary consideration. Going with ARM would have hindered their desktop series' performance. And adopting ARM just for laptops later in the decade wouldn't have made sense from a software library standpoint.

        Just to give some insight into how Apple viewed ARM at the time, here is a clip from Allen Baum who worked at Apple:

        And then, while Newton was going on, some people from DEC came to visit, and they said, "Hey, we were looking at doing a low power Alpha and decided that just couldn’t be done, and then looked at the ARM. We think we can make an ARM which is really low power, really high performance, really tiny, and cheap, and we can do it in a year. Would you use that in your Newton?" Cause, you know, we were using ARMs in the Newton, and we all kind of went, "Phhht, yeah. You can’t do it, but, yeah, if you could we’d use it." That was the basis of StrongARM, which became a very successful business for DEC.

        1. _andrew

          Re: ARMed and Ubiquitous

          Nice as the StrongARM was, for its time, I don't think that it ever had a floating point unit. Even in integer the multiply did 12 bits at a time. Which is all fine for what Arms were used for at the time: hand-held 2D GUI devices. MIPS, PowerPC, and SPARC (and Alpha) all had good floating point, and "PC" class systems, even in laptop form factor, needed that for spreadsheets and 3D graphics and (eventually) JavaScript.

    2. ChrisC Silver badge

      Re: ARMed and Ubiquitous

      "ARM devices are now in pockets around the world"

      And kitchens, living rooms, cars, vending machines, and a myriad of other places where processors (either as is, or wrapped in the peripheral layers that turn them into microcontrollers) are used without most people ever being aware of their presence. And whilst ARM may not have taken off on the desktop until recently (don't forget that Apple's new family of desktop CPUs run on ARM cores), it's pretty much a certainty that the average desktop over the last decade-ish would have still had at least one ARM core in it somewhere controlling some part of the system.

      So yes, ARM's dominance of the mobile phone processor world is justifiably something they can be proud of, but the way they've also come to dominate the wider world of embedded processors is equally impressive, if not more so, given the myriad of established devices they had to compete against initially. And as someone who develops products in that field, and still has the battle scars from having had to work with some of the less user-friendly alternatives, I'm really rather happy about just how easy it now is to find an ARM-core part that will tick all the boxes and avoid any thoughts of having to consider something like a PIC or, shudder, an 8051 derivative...

    3. doublelayer Silver badge

      Re: ARMed and Ubiquitous

      I wonder how much of this is speculation based on what we would want rather than what is realistic. Apple could have used ARM chips in Macs at the time, and yes, that would have been decades before they actually did. At the beginning, those would have been fine. However, would that necessarily have kept ARM strong in comparison, or would ARM have become the new PowerPC: good enough, but there is something faster that eventually needs to be switched to?

      It is relevant that the current ARM chips came from the enhancement of lower performance, lower-power mobile chips. In fact, might a 1990s-era push for faster performance for mostly desktops have harmed ARM's power usage, something that was important to getting it in every mobile device in the 2000s and 2010s? It certainly did for X86 as Intel discovered when nobody wanted to put an Atom in a tablet.

      1. Liam Proven (Written by Reg staff) Silver badge

        Re: ARMed and Ubiquitous

        [Author here]

        > would ARM have become the new PowerPC: good enough, but there is something faster that eventually needs to be switched to?

        That *is* a legitimate point. A bit before the turn of the century, Arm stopped being competitive on the desktop in terms of raw CPU performance, although it wasn't shameful even then.

        (I wrote this back in 2010: https://www.theregister.com/Print/2010/06/25/riscos_beagleboard/ It's the first mention of the Iyonix I can find.)

        But I think the salient thing, or things, here are two:

        1. The DEC StrongARM (~1996) shows one bit of potential. Acorn built Arms to be small, cheap, cool-running, and sip power. DEC tried its hand and made an Arm that was at least an order of magnitude faster, and was still cheap and cool-running. The point being that if you attack the problem with different emphasis, such as "make it as quick as possible and here's a billion bucks to do it", as Apple has showed, there was for a long time a _lot_ of unexploited compute potential in the basic Arm design.

        2. There was potential to go SMP very early on, and thus be able to demo extremely impressive looking (if not truly representative) benchmark scores. I owned a dual Pentium 133 server (pre-MMX). It was passively cooled, just, and it still drank juice. By the PII and PIII era, SMP x86 boxes were _expensive_. Around 2002 I had a dual Athlon XP machine, which as a bare box -- I brought my own disks, optical drives and graphics -- was £1000. These things were quite loud as well as power-hungry, and the CPUs were a double-digit percentage of the price.

        What Acorn never did was offer multicore Arm desktops, because RISC OS couldn't handle it.

        But while adding a 2nd CPU to an x86 PC added 20-30% to its cost and 50-75% to its power and cooling needs... and a 2nd was all you got, unless you spent as much as a house on a quad-CPU server... adding a 2nd CPU to a Risc PC 2 would have cost about £50. For the cost delta of a dual-processor PC, they could have offered a quad-processor machine that drank *less* power than a uniprocessor PC, was passively-cooled and silent running.

        Apple could have done the same, had it committed to Arm early. Remember there were dual-CPU Macs in the MacOS 9 era:

        https://lowendmac.com/1997/power-mac-9600/

        And even quad-core Mac clones!

        https://lowendmac.com/1996/daystar-genesis-mp-plus/

        The only thing they could use >1 CPU for was certain Photoshop filters, but still, they sold.

        The point is: there _was_ a demand and it could have been feasible.

        1. FirstTangoInParis Bronze badge

          Re: ARMed and Ubiquitous

          So get Ubuntu LTS and run it bare metal on a Xeon desktop and as a native VM on an ARM MacBook Air. There is absolutely no comparison; Ubuntu ARM on the MacBook wins hands down. So how about Apple start selling Mac Minis to the workstation market running Linux, preferably with the graphics drivers actually working, and see where that leaves Intel?

        2. doublelayer Silver badge

          Re: ARMed and Ubiquitous

          I wonder what would have happened in practice if it had been Apple's ISA of choice at that time, because they didn't have billions to throw at their in-house chip designers who have a decade of experience on that ISA and their home-built versions of it. They would have had much less money and they'd have had to give it to someone else to do the chip design work or try to buy ARM. How long would it take before they said something along the lines of "we could extract a lot more performance per watt from this chip if we had the time and money, but we need raw performance now, so let's forget about the power problems and just put more fans in this Performa".

          If they had, it's possible that ARM might have focused on the desktop market and become a competitor to Intel and AMD in the late 1990s, but that would mean that it wouldn't have been such a big player in embedded. I wonder if that wouldn't have been worse in the long term for ARM if it found itself wedged between X86/X64 on the high end and something like a better SuperH for embedded. Almost certainly less software would have been written for it, fewer compilers would target it, and there would be less industry experience running it if it was mostly an Apple chip than if it's needed whenever you want to build something handheld.

        3. Roo
          Windows

          Re: ARMed and Ubiquitous

          "2. There was potential to go SMP very early on,"

          Discrete-microprocessor SMP was a dead end in practice. You never got anything close to linear scaling - maybe 30-50% more oomph for > 2x the component count (remember all that extra glue logic). The other problem was that process improvements meant single-processor clock speeds and cache capacities would pretty much eclipse any SMP solution within 6-12 months, regular as clockwork. That did change with NUMA, and then process economics combined with NUMA made multi-core chips worth the extra development & manufacturing cost.

          I suspect the biggest barrier to ARM's adoption was reliance on third-party fabs; in those days those third-party fabs were a step or two behind the bleeding-edge proprietary foundries, so from the PoV of a desktop/server vendor there was a risk your entire line of business would be dependent on foundries that were a generation or two behind the competition. The DEC Alpha ran into this problem later on with Global Foundries and Samsung.

          In retrospect I believe ARM actually did a pretty good job of navigating the swamp in the 90s (and of course Apple + VLSI Tech were major shareholders from day one of ARM).

      2. Anonymous Coward
        Anonymous Coward

        Re: ARMed and Ubiquitous

        Posting anon, ex Intel here. I clearly recall the prevailing mood in the company, circa 2008-10 as the ARM threat began to encroach on desktop/laptops, being summed up by one manager as "We can bring our power consumption down [to ARM levels] far more easily than ARM can bring their performance up [to IA levels]"

        Not quite up there with lines like "640K should be enough for anyone" or "They couldn't hit a barn door at this distan-*bang*" but still. *Shakes head*

        1. doublelayer Silver badge

          Re: ARMed and Ubiquitous

          I still have to give them credit for doing a pretty good job at decreasing their power consumption figures significantly. Not to phone levels, but there's a reason why the low-end X86 boxes nearly all use Intel parts rather than AMD. That market segment may not be the most alluring, but they did find a demand for their low-power chips and they can be surprisingly good.

        2. Alan Brown Silver badge

          Re: ARMed and Ubiquitous

          "We can bring our power consumption down [to ARM levels] far more easily than ARM can bring their performance up [to IA levels]"

          That was explicitly stated on El Reg and other places at the time

          Frankly it was "about time!", given that x86 had the worst performance/watt of any of the CPUs it had previously driven out of relevance, but Intel's sudden focus on power consumption was detrimental to performance for a while.

          By way of comparison, I had Supermicro X7 boards drawing 300W+ at idle. That dropped down to 150W in equivalent X8 boards and under 50W on X9 ones (just the boards, not the drives/controllers plugged into them).

  4. 45RPM Silver badge

    Wow! What happened with this article? Did you get bored and submit it incomplete? No mention of the ARM and RISC OS (arguably the most important 32-bit design, since it led to the world of today, with its phones and tablets and power-efficient servers, laptops and desktops), no mention of the 68020 and its offspring, powering the high-end Amiga, the high-end ST (well, as high-end as the ST ever got </flame off>) and the classic Mac OS (the offspring of which are still with us today).

    But still you squeezed in the Hobbit, interesting but about the most irrelevant CPU ever made, barely even a footnote in history, and ignored SPARC, PowerPC, MIPS and other evolutionary dead ends which, although dead (or dying), are all still more relevant than the Hobbit.

    C- See me after class.

    1. Dan 55 Silver badge
      Unhappy

      No mention of Nokia choosing ARM for the 8110 slide phone from The Matrix, and then Nokia, Ericsson, and Motorola choosing ARM and Psion to write the OS, based on EPOC. But maybe the article was focused mostly on PCs.

    2. Liam Proven (Written by Reg staff) Silver badge

      [Author here]

      Er. You did get that this was meant to be about _failures_, right? Mistakes, SNAFUs, missed chances, and other assorted fsck-ups?

      1. 45RPM Silver badge

        Absolutely I understood. And if Commodore and Atari hadn’t mucked up then they’d still be with us. If Apple hadn’t mucked up then perhaps it wouldn’t be with us in its current form, or perhaps at all, today (if the original update plan for the classic Mac OS, Copland, had been followed rather than the move to NeXTSTEP, would macOS today be Unix-based? Would the iPhone exist? Maybe not). So perhaps Apple’s mess in the 90s was fortuitous. And System 7.5.x was a truly woeful OS - and most people haven’t heard of its rather lovely sibling, A/UX.

        So many errors. None discussed here.

      2. ThomH

        Does launching and then almost immediately ditching the Falcon* in favour of the Jaguar not count as an error? I mean, I guess maybe not, because neither seems especially viable with hindsight.

        * check out the various demos of it running Quake infeasibly well for 1992 hardware, due to cunning use of the DSP.

        1. Michael Strorm Silver badge

          Atari Falcon already obviously doomed at launch by Tramiel and Atari's cheapness

          No hindsight needed for the Falcon.

          As someone who'd already ditched the Atari ST for an Amiga, I already knew as soon as the Falcon was announced (late 1992) that- however interesting and desirable it looked- it was virtually certain to flop.

          The ST (upon which the Falcon was based) had already been yesterday's news for some time, its market share and former desirability in rapid decline by then. Even the Amiga- which had knocked the ST off its perch as the machine du jour in Western Europe- was in turn rapidly losing ground to the inexorable rise of ever-cheaper PC clones.

          I already knew Atari- or rather, Jack Tramiel's Atari Corporation- too well by that point. Unlike the original, well-funded Atari Inc., (*) Atari Corp. always smacked of cheapness and even then I could have told you that they couldn't- or wouldn't- have marketed their way out of a paper bag.

          If the Falcon was to have had *any* chance of swimming against the tide, it would have required a level of effort and commitment that- by then- it didn't even occur to me to expect from Tramiel and an obviously-in-decline Atari Corp.

          (*) Whose computer and console division Tramiel had bought out in 1984.

          1. Michael Strorm Silver badge

            Re: Atari Falcon already obviously doomed at launch by Tramiel and Atari's cheapness

            To add to- and generalise- my above comment, it's clearer from what I've read since (including Tramiel's time at Commodore) that the ruthless cutting of costs and corners had *always* been his way.

            It's been observed that Tramiel was essentially a "hardware man", and that his early success and later decline were due to that (or rather, his inability to change when the market matured).

            This worked fine during the early 8-bit era (up to and including the C64), where everything was new and Commodore offered machines with something new or obviously far better- or cheaper!- than what came before, machines that essentially sold themselves. Third-party support quickly established itself and that momentum and success was self-reinforcing via the network effect.

            But by the time the computer market had matured and support for software and games was a big deal (*), Tramiel had either failed to understand the change or was unwilling to spend money on marketing and software he'd never needed to before. (**)

            Of course, it didn't help that Tramiel had acquired a reputation for screwing over business partners and those that tried to support his machines. I get the impression that this was already a problem in the US by the mid-1980s, where many there had been burned and were scared off from- or set against- supporting Tramiel's "new" Atari.

            The Falcon and the Jaguar were really just a last gasp of that failure

            (*) Look at how the ZX Spectrum continued to sell on the back of its huge back catalogue of cheap/copied software long after the hardware had been superseded technically.

            (**) I mean, look at the Atari Lynx- *fantastic* hardware for a handheld at that time and some of the games were good, but there were few licenses, and you got (e.g.) "Blue Lightning" (impressive for a handheld attempt at Afterburner, but the name smacks of "no-brand knockoff") or "Dirty Larry" (an even more blatant and embarrassing unofficial ripoff title, more reminiscent of all those early obvious-knockoff-but-not-important-enough-to-get-sued 8-bit titles like "Space Evaders", "Jawz" or "2003: A Space Oddity").

      3. FirstTangoInParis Bronze badge

        So how about Oracle taking over Sun Microsystems and killing it virtually overnight? Shame because Suns were as reliable as and tougher than old boots (try dropping one from a height; the hard disk was a goner but the rest of it still worked) and the mainstay of network management for about 20 years, likely the same actual hardware too.

        1. Liam Proven (Written by Reg staff) Silver badge

          > So how about Oracle taking over Sun Microsystems

          Linux made Unix a commodity. When something is a commodity it becomes fungible. The brand doesn't matter: it's all much of a muchness and the cheap stuff is good enough.

          https://www.theregister.com/2024/07/02/foss_ai_blockchain/

          Apple survived and thrived not because it runs Unix, although it does, but because of a very strong focus on look, feel, quality, ease of use.

          Sun made some of the best proprietary Unix hardware with a high-quality proprietary Unix, just as free commodity Unix and good-enough PC hardware made that irrelevant. It failed to commercialise OpenStep or any of the other stuff it acquired. It failed to make useful thin clients. It failed to make open source that most people cared about.

          It was going to die. Oracle got it just in time to milk a bit of residual value from it.

          Apple changed the world a few times over. With mass market GUI computers, with pocketable digital music players, with smartphones for non-techies, and then with the product that inspired the smartphone, tablets for non-techies. It's in the process of finally killing CISC chips now.

          The legacies of all the work at Sun are Linux (indirectly), Java (mostly indirectly), and ZFS (which may yet vanish in a few years).

          As Andrew Eldritch put it: it's all about that Vision Thing.

          1. JoeCool Silver badge

            Don't forget the Networking!

            I know it's outside the scope of the discussion, but Sun had really good enterprise clustering - HW, SW, Admin+Management, the whole package. I wonder if XaaS cloud shifting came along just a little too late for them to take advantage of it. Or maybe they were always going to lose out to Docker and that whole commoditized ecosystem.

          2. anonymous boring coward Silver badge

            I think the legacy of Sun is popularising large LAN networking early on, as well as popularising X-Windows workstations early on (which led to others speeding up implementing windowing environments).

            Many of the things you say Apple invented were actually (significant) refinements of existing products. Nothing wrong with that at all.

        2. Anonymous Coward
          Anonymous Coward

          The death of Sun

          What killed off Sun was AMD64. Sun relied on the Unix workstation marketplace, and it turned out that the Unix workstation existed because of its large flat address space. The 64-bit x86 CPU took away the reason why the workstations were superior to PCs. Everyone's workstation business crashed: HP pulled its IA64 workstations off the market over a weekend - a company that had left old models in its product line for years while bringing out new ones, because it made life easier for customers with long government procurement cycles, suddenly dropped a whole range - and the PA boxes were declining anyway. SGI similarly saw their business go and had to find a new niche. Sun's server business wasn't big enough on its own to carry the company.

          AMD64 + Linux meant that commodity systems could suddenly do what exotic boxes were needed for just a few days earlier. Thanks, it had been a good living up until then.

        3. Alan Brown Silver badge

          By the time Oracle purchased Sun it was a shuffling zombie that was about to keel over regardless

          Spending 10 times as much for Sun vs commodity x86 servers wasn't something most buyers were going to do and even the most avid Sun fans were migrating to cheaper hardware

          What Oracle wanted was Java. The rest was superfluous

  5. Anonymous Coward
    Anonymous Coward

    "NT can be, and usually is, administered by an idiot." --USENET

    While developing, SIGSEGV massively beats impromptu reboots. NT improved markedly little over DOS in that respect. And then there was that 49.7 day itch.

    1. Anonymous Coward
      Terminator

      Re: "NT can be, and usually is, administered by an idiot." --USENET

      > And then there was that 49.7 day itch

      49 days, 17 hours, 2 minutes, and 47.296 seconds - that's 2^32 milliseconds - which is when a 32-bit millisecond tick counter clocks back to zero.

    2. anonymous boring coward Silver badge

      Re: "NT can be, and usually is, administered by an idiot." --USENET

      To be fair, NT was a massive step up from DOS. I assume badly written device drivers would have been the culprit?

  6. Anonymous Coward
    Anonymous Coward

    OS/2: the expensive flop

    El Reg: ‘Had OS/2 1 targeted the 386 from the start, it could have matched its billing as "a better DOS than DOS" in 1987 ... This expensive flop led to the split between IBM and Microsoft.’

    MS was working on Win3 whilst also pretending to promote OS/2:

    'SteveB went on the road to see the top weeklies, industry analysts and business press this week to give our systems strategy. The meetings included demos of Windows 3.1 (pen and multimedia included), Windows NT, OS/2 2.0 including a performance comparison to Windows and a "bad app" that corrupted other applications and crashed the system.' REF

    'The demos of OS/2 were excellent. Crashing the system had the intended effect – to FUD OS/2 2.0. People paid attention to this demo and were often surprised to our favor. Steve positioned it as -- OS/2 is not "bad" but that from a performance and "robustness" standpoint, it is NOT better than Windows'. REF

    "I have written a PM app that hangs the system (sometimes quite graphically)." REF

    1. Liam Proven (Written by Reg staff) Silver badge

      Re: OS/2: the expensive flop

      [Author here]

      > MS was working on Win3 whilst also pretending to promote OS/2:

      Yes, it was.

      But the 2 companies announced OS/2 1.0 in April 1987 and it appeared in December.

      Windows 3.0 was May 1990, three years later. And this in a time when the industry moved dramatically faster than it does today. That was an aeon in the 1980s.

      I think MS still really believed in OS/2 1.0, and a lot of code that later appeared in Windows 3.0 first appeared in OS/2 1.1, such as Program Manager, the main UI of Windows 3.x -- since it didn't have a "desktop". The fake-3D-shaded window controls of Windows 3.0 first appeared in OS/2 1.2.

      Take a look at the timeline:

      https://www.os2museum.com/wp/os2-history/os2-timeline/

      A lot of stuff went into the first 3 releases of OS/2 *1.x* that reappeared 2-3 years later in Windows 3.0.

      It is OS/2 1.3 that was mainly IBM because Microsoft had moved on by then.

      And remember how small this stuff was. These are systems that shipped on a single-digit number of floppy diskettes. We're only talking in the region of 10-20MB of code here.

      My impression is that MS really believed in OS/2 and invested a lot of time and effort into it, and it saw 1.0 flop.

      Then, after a ton more effort, 1.1 delivered on the promises that 1.0 didn't -- it had a GUI, it had graphical apps, it had useful multitasking -- and did little better. It flopped too.

      A load _more_ effort and refinement, and OS/2 1.2 came out in autumn 1989. Now it dual-booted, now it had the fancy next-gen filesystem, with long filenames and so on. And still nobody cared. Nobody wanted it.

      I reckon that by the timeframe between 1.1 shipping and 1.2 getting to beta stage -- 1988-1989 -- MS realised it was too little, too late, and that a $350 OS (a guess, I don't know, but the SDK alone was $3K) for a $5000-10,000 4MB RAM PC was too much.

      Meanwhile the DOS world was just trucking along fine. 386SX PCs were starting to appear which weren't primarily high-end 32-bit power workstations; they were cheap commodity kit, intended to run DOS better. They ran things like DESQview great, allowing DOS users to multitask. If you weren't that fancy, if you didn't need multitasking, they ran QEMM386 or 386MAX or some other memory manager and your DOS memory worries just went away.

      A $1500-$2000 PC with a $50 OS was all most people had or needed.

      So they came up with a quick and dirty $150 OS for those boxes.

      I was there. I was working in this industry in 1988. I told my boss, and my boss' boss, that Windows 3 was going to be a huge hit, that they should get as many copies in stock as they could, and it would sell out anyway.

      They laughed at me. We had 17 copies in, and they sold out before 9:00 AM when we opened. We had a *queue.* Never happened before or since. Within a week I was the most valuable person in the company because I was the _only one_ who knew Windows at all. It was a joke product and nobody cared.

      We never got or sold a single copy of OS/2, but they sent me on a training course for it, which must have cost them £2000 or more.

      1. PhilipN Silver badge

        Re: OS/2: the expensive flop

        With the wisdom (OK cynicism) of age I now believe MS who were always (cough! cough!) fleet of foot intended to milk everything they could out of the IBM connection then go off on their own. Which duly happened.

        It was their own fault because IBM viewed OS/2 as only “one of [their] operating systems” and their PC division wanted to sell as many boxen as possible which by that time meant Windows.

        Even though (I think it was Eric Raymond who wrote) W95 was “shockingly inferior” to OS/2 Warp.

        1. Liam Proven (Written by Reg staff) Silver badge

          Re: OS/2: the expensive flop

          I disagree on all 3 points.

          > MS who were always (cough! cough!) fleet of foot intended to milk everything they could out of the IBM connection then go off on their own.

          Your cynicism is right. But MS really believed in OS/2 and invested a lot of the effort that was salvageable: e.g. the UI.

          IBM bollocksed it up. MS got out and did something cheap, quick and dirty which delivered the bit people wanted.

          Back then it was agile enough to pull this off. Now, I doubt it.

          > It was their own fault because IBM viewed OS/2 as only “one of [their] operating systems” and their PC division wanted to sell as many boxen as possible which by that time meant Windows.

          Only kinda sorta. IBM was too fixated on servicing the customers it had already sold to. MS looked to the future. That's usually the right move.

          More generally, it is what biologists call r versus K strategy.

          r: have millions of babies. The vast majority will die. Accept it, move on. The survivors will win out on numbers.

          K strategy: have only a very few babies but look after them, nurture them, so they have the best chance.

          Neither is "correct". Both work. Both work *in the same ecosystem*.

          IBM wanted smaller numbers of high value high margin customers.

          MS saw that victory lies in piling 'em high and selling 'em cheap.

          MS was right. This time, in the PC industry.

          > Even though (I think it was Eric Raymond who wrote) W95 was “shockingly inferior” to OS/2 Warp.

          Nah it wasn't.

          I bought OS/2 2.0 with my own money. And 2.1 but I didn't use it.

          I loved OS/2. In 1992 or '93 it was _much_ better than Win3.

          But the WPS was weird clunky junk. The multitasking blocked on user I/O. The drivers were few and poor. The installer was a nightmare pile of poo. The fancy filesystem didn't even shorten filenames for DOS. Fractint reliably crashed it.

          Nah. Win95 was better in the ways that mattered.

          The multitasking was good enough. It had long filenames that worked, even for DOS. Screw extended attributes and that BS. It ran on almost anything. The UI was a decade better than WPS.

          Today, every Linux apes Win95. There is no modern version of WPS except on OS/2 itself, on ArcaOS.

          Win95 ran on whatever you had, and it ran whatever software you had. It was slutty in the good way: it fitted in, it cooperated, it wasn't fussy and didn't discriminate. The basic cheap version did networking and _good_ networking.

          In every important way, Win95 was better, and it deserved to win.

          I deserted at OS/2 2.1, when most of my kit from OS/2 2.0 didn't work any more. My SVGA graphics didn't work. My parallel-port sound card with its paid-for 2.0 driver didn't work. My fancy mouse with a numeric keypad on it, with a driver ordered from the USA on floppy for more than the price of a cheapo mouse, no longer worked with a 0.1 version update.

          Sod that. Win95 beta 4 or so did everything, it was as fast, it looked better, worked better, ran more, and used all my hardware.

          1. Dan 55 Silver badge

            Re: OS/2: the expensive flop

            Perhaps the r strategy works in a growing market, but we're not in a growing market any more. Maybe MS need to work on switching to a K strategy and nurture the customers it has by giving them something they want, because as it stands there are plenty of reasons to bail on Windows 10/11.

          2. anonymous boring coward Silver badge

            Re: OS/2: the expensive flop

            I think the explanation is much simpler than this.

            Who owned OS/2, and who owned Windows?

  7. Gene Cash Silver badge

    BeOS

    I was a big fan of it. IIRC it was proprietary and sealed to the point of making Apple look like open source. They alienated anyone that wanted to develop for it by being complete controlling assholes.

    1. Ian Johnston Silver badge

      Re: BeOS

      I'm hoping that Haiku might one day let me dump Linux, but alas in its current form (R1/beta4, two years old) it crashes on my PC.

    2. Liam Proven (Written by Reg staff) Silver badge

      Re: BeOS

      [Author here]

      > IIRC it was proprietary

      It was, yes.

      > and sealed to the point of making Apple look like open source.

      Well, they did open up the Tracker. Haiku still uses it today.

      > They alienated anyone that wanted to develop for it by being complete controlling assholes.

      Do tell?

    3. cookieMonster Silver badge

      Re: BeOS

      I loved it. Actually bought a BeBox, and only sold it 2 years ago.

      The OS was great, and the one outstanding thing I really loved was the File System and the way you could add attributes to files and query it.

      Was pretty upset when it went away. I’ve tried Haiku but it’s not something I could use on a daily basis.

    4. Chris Gray 1
      Thumb Down

      Re: BeOS

      I remember seeing the ads for the BeBox in magazines. Seemed like a good thing. I found out on Usenet, however, that the system and libraries required you to use the C++ classes that Be defined in order to do much of anything. No real ABI. As a compiler-writer looking to port his stuff, that killed it for me. I stuck with Amigas, then Linux.

    5. Michael Strorm Silver badge

      Free as in "BeOS"

      > IIRC it was proprietary and sealed to the point of making Apple look like open source.

      I vaguely remembered that latterly they changed tack and had a version of BeOS that was sort of "free" (but couldn't remember whether that was "as in speech" or "as in beer"), and at least distributable on magazine cover discs. Pretty sure that's where I got the bootable copy I played around with circa 2000-01 (but not much more than that).

      The Wikipedia article confirms that there was a "free" personal edition and that they partially open-sourced it, but this clearly wasn't enough to give them much success.

      1. Liam Proven (Written by Reg staff) Silver badge

        Re: Free as in "BeOS"

        > couldn't remember whether that was "as in speech" or "as in beer")

        Desperate last-ditch effort for BeOS 5.

        De-crippled and turned into a rival for their own paid product:

        https://archive.org/details/Beos5.0screen

        It was not partially FOSS, no.

        You know I've written about this, right?

        https://www.theregister.com/2022/01/10/haiku_linux_wine/

        I even linked to my own review of v5 from 24 years ago.

    6. IvyKing Bronze badge

      Re: BeOS

      Be was trying to get some OEMs to install BeOS on PCs, with one option being to provide a dual boot of Windows/BeOS. The contemporary accounts were that BeOS could handle multiple video streams on the same hardware on which Windows had problems running one video stream. Needless to say, MS strongly discouraged any attempt by OEMs to offer BeOS as an installation option.

      1. Liam Proven (Written by Reg staff) Silver badge

        Re: BeOS

        > with one option is providing a dual boot of Windows/BeOS

        Not quite, no.

        That was the _plan_ but Microsoft sabotaged it with threatening behaviour.

        https://birdhouse.org/beos/byte/30-bootloader/

        https://www.osnews.com/story/136392/the-only-pc-ever-shipped-with-beos-preinstalled/

  8. Steve Graham

    Mention of Novell Netware reminded me that Novell were trying to sell us directory software, and that resulted in my taking 3 weeks of training with them on XSLT. I never really understood it. (I didn't get Lisp or Prolog either. My brain must be strictly algorithmic.)

    It was just at the time when Novell had decided to abandon Microsoft Windows and give all employees (except senior managers, naturally) Linux laptops. Which was nice.

    Fortunately, I've never, ever had to go back to Bracknell.

    1. Liam Proven (Written by Reg staff) Silver badge

      [Author here]

      > Mention of Novell Netware reminded me that Novell were trying to sell us directory software

      Yep, that was my main point.

      > and that resulted in my taking 3 weeks of training with them on XSLT.

      Huh? XSLT is an XML technology and first appeared 5 years after Novell NDS. What is the connection?

      1. Steve Graham

        I think it was a technology for getting data in or out of Directory Services. I think it was also around the time when Microsoft brought out Active Directory which doomed Novell's offerings. Late 1990s?

        1. Liam Proven (Written by Reg staff) Silver badge

          You may have your timeline muddled, perhaps?

          NDS: 1993.

          XSLT: 1998, an age later in 20th century tech terms.

          MS AD: first seen in Windows 2000 Server, February 2000. Best part of a decade after Netware 4.

          https://www.os2museum.com/wp/netware-history/netware-timeline/

        2. TangoDelta72

          Active Directory and Domino names.nsf

          "I think it was also around the time when Microsoft brought out Active Directory which doomed Novell's offerings. Late 1990s"

          Active Directory was a MS conceptual port of [LotusNotes] Domino's names.nsf file. It was a directory, sure, but it was also the configuration database (yes, a database) for your Domino domain. I'd been trained on N/D for a while, and when Win2k/Server came out and I needed to train up on AD, all the main concepts & elements were already in my head.

          1. Liam Proven (Written by Reg staff) Silver badge

            Re: Active Directory and Domino names.nsf

            > Active Directory was a MS conceptual port of [LotusNotes] Domino's names.nsf file.

            Interesting... I did not know that.

            (When I tried to do a head-to-head review of Lotus Notes vs. MS Exchange in the mid-1990s, MS sent me on an MS Exchange Server admin training course. IBM told me to RTFM and that was it. And Notes fans wonder why it failed...)

            I was under the impression that Notes, AD and Exchange all used variants of X.500 addressing. Perhaps that is the common element.

            https://en.wikipedia.org/wiki/X.500

  9. Csmy

    This is the era where the computer itself was no longer your hobby in most cases but facilitated others

  10. Anonymous Coward
    Anonymous Coward

    Arc

    In the late 1980's my desktop at work was a DEC Rainbow running DOS. My home PC was still a fully expanded Acorn Electron, but running out of the oomph I was beginning to need. I seriously considered the Archimedes as it seemed capable of all I'd need for the foreseeable future; a bonus was the availability of an 8086 card that would also give me a PC in the same box - attractive as I could get a lot of the PC software through my work. But, just as I was about to put in my order, Acorn dropped the card and offered 8086 emulation. The Archimedes emulation would have been capable of running DOS software as fast as a normal PC but the change caused me to pause. Whilst the Archimedes had a full suite of "office" software (and a GUI), the pause got me thinking about what I'd be wanting to run a year or so later - and it was going to be PC/DOS software. So, I asked myself why not get a PC - and I couldn't find a good reason not to and ended up buying myself an Amstrad 1640.

    I got to like the GEM GUI, though switched to Windows 3 a few years later when the 386 processor was mainstream. I was on Windows for the next 20 years, with a variety of desktops and laptops until, in 2012, I switched to a Mac (albeit with a Windows VM onboard to ease my transition). That first Mac served me well for 10 years; I'm now on an ARM powered MacBook (and still have a Windows VM for the occasional masochistic foray).

    1. f4ff5e1881
      Thumb Up

      Re: Arc

      I also had an Acorn Electron back in the day, and when I started work and wanted to get a new computer, I briefly flirted with the idea of getting a DOS laptop (there was work stuff I wanted to do on the machine).

      But I was an Acorn nut at heart, and I’d seen the Archimedes running at the 1988 BBC Micro & Electron User Show, and it absolutely stole my heart. So I saved my pennies and got an A310 (later upgraded with an ARM3 and hard disk by the good people at AtomWide).

      My requirement to run DOS software was met by Acorn’s own PC Emulator software. It was disappointingly slow, with monochrome text, but it was just about good enough to run the Lotus 123 and Word for DOS that I borrowed (*cough*) from work.

      The poor old Arc eventually ended up in the garage (and later got thrown out) when I migrated to Windows laptops in the 2000s.

      But over the last few years I’ve fallen in love again with the Acorn scene, and resurrected pretty much all of my Archimedes stuff via Arculator, which is an Archimedes emulator par excellence. Fortunately, before I threw the Arc out, I had the foresight to make disk images of everything worth keeping, for posterity, you know. After all, as they say in The Walking Dead, “Everything gets a return…”

  11. heyrick Silver badge

    was that it supported multiple CPUs

    The RiscPC2 would have had its hands tied behind its back if it was still running RISC OS.

    Even now, with the diminutive Pi and its four cores (and similar), RISC OS uses exactly one of them.

    That being said, the earlier RiscPC (1) could take a second processor, and it was quite common to have a 486 of some description in there to run Windows 3.1, or some incarnation of W95 for the more adventurous. It was an amusing way to blow the minds of some PC wonks: you're using what looks like a weird, slightly outdated PC... and then you switch back to multitasking mode, push the "PC" to the side, and what they're left looking at is something that is sort of like Windows but different in every possible way.

    1. druck Silver badge

      Re: was that it supported multiple CPUs

      The Risc PC 2 was still quite a few years before multi-core chips were common, particularly in the ARM world, and we would have hoped Acorn would have done some work on RISC OS before it became a reality. Of course, that could have happened a few years beforehand if the Risc PC 1 had had two ARM processor cards fitted (instead of an ARM and an x86), or when the Hydra board allowed four ARM710 chips to be used, but around that time the StrongARM chip came out, and it was far easier to carry on with a single CPU that was 7x faster than to try to make full use of four slower CPUs.

    2. Torben Mogensen

      Re: was that it supported multiple CPUs

      RISC OS 2, back in 1988 or '89, already had many features that didn't make it into Windows until Windows 95 or later. Its main problem was that it was written in ARM assembler, so updating it to use multiple CPU cores was a major effort. But recall that the move to multicore CPUs didn't happen until about 2005. Until then, the trend was ever faster single cores. But upping the clock frequency increases power usage quadratically, so you get more bang per Watt by having four cores running at 2 GHz than one core running at 4 GHz (which would use the same power). But that was almost two decades after RISC OS was designed, so you can't really blame Acorn for not preparing for it. After all, it was only about half a decade between the BBC B and the first Archimedes computers.
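
      (Rough numbers, assuming power really does scale with the square of the clock and the workload parallelises perfectly: one core at 4 GHz costs about 4^2 = 16 units of power for 4 units of work, while four cores at 2 GHz cost 4 x 2^2 = 16 units of power for 8 units of work. Same power budget, roughly twice the throughput.)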

      And ARM did become faster. The ARM2 (used in the first Archimedes) ran at 8 MHz, the ARM3 (used in the A5000) at 25-33 MHz, the ARM710 (used in the first Risc PC) at 40 MHz, and the StrongARM (used in later Risc PC models) at 200+ MHz. XScale (Intel's version of StrongARM) upped this to 400+ MHz. But by 2000 it was clear that ARM was targeted at mobile devices, so low power usage came at the cost of not being at the bleeding edge of performance. ARM has since broadened its scope to include servers and supercomputers as well as desktop PCs (Apple) and mobile devices. But most of these now use the 64-bit instruction set, which RISC OS cannot handle, being written in 32-bit assembly language.

      I originally bought my Archimedes for its hardware, but later came to appreciate it more for its software -- RISC OS was far ahead of the competition, and there was good software for word processing and graphics. And its font manager used the same rendering engine for print and screen (which didn't happen until much later on Mac and Windows), so you really got what you saw. But RISC OS these days is more or less stuck in the 1990s, with little new software and a lack of compatibility. I don't think there is a Libre Office port, nor Chrome, Opera or Firefox.

      1. Liam Proven (Written by Reg staff) Silver badge

        Re: was that it supported multiple CPUs

        [Author here]

        > I don't think there is a Libre Office port, nor Chrome, Opera or Firefox.

        There _was_ a Firefox port.

        http://www.riscos.org/support/firefox/index.html

        A chap called Peter Naulls had an ambitious Unix Porting Project.

        https://groups.google.com/g/comp.sys.acorn.announce/c/MqZ3BNjUB1Y/m/fKALUCsVd7sJ

        It could have transformed the platform, but RO users didn't see the value and complained about relatively trivial UI issues.

        Naulls got a job in Silicon Valley somewhere, and largely disappeared from the community. Damned shame and a great waste.

        These days, some lone genius coders are working on running NetBSD or whole other OSes on a second core in parallel with RO, and there's an effort to make RO re-entrant so it can be parallelised. It's very impressive, but I can't help but feel they'd be better off virtualising the whole thing under something else, and replacing RO with Oberon/A2 or something.

        There'd be a poetic symmetry to that. Acorn tried to write ARX in Modula-2, but it wasn't ready. Oberon is what M2 evolved into. It would close a sort of circle.

        1. anonymous boring coward Silver badge

          Re: was that it supported multiple CPUs

          "but RO users didn't see the value and complained about relatively trivial UI issues"

          Oh, yeah, the users are wrong. THAT's how you popularise a platform...

  12. Mr D Spenser

    Flashbacks

    Amazing the flotsam that arises from reading these articles. In my case, my actual first work PC was an IBM 3270 PC.

    You could toggle between three 3270 mainframe sessions and a single DOS session. Wish I had kept the old slipcase manuals.

  13. Anonymous Coward
    Anonymous Coward

    Jokes...Or Maybe Not Jokes.....

    PS/2 -- Piece of S**t / 2

    OS/2 -- half an operating system

    !!

    1. JoeCool Silver badge

      PS/2 : do you mean

      The architecture or

      The technology or

      The machines?

      The technology and the actual machines were excellent. The architecture changes are an open discussion, but they were partially motivated by trying to move past the AT PC's shortcomings.

      Functionally, OS/2 3.0 and NT 3.1 were roughly comparable, but development tools for OS/2 were a lot nicer.

      1. Liam Proven (Written by Reg staff) Silver badge

        Re: PS/2 : do you mean

        > Functionally, OS/2 3.0 and NT 3.1 were roughly comparable

        [Laughs hollowly] Yeah, right.

        I deployed NT 3.1 in production.

        It was exceptionally stable, it networked _well_ to Windows for Workgroups, *and* Netware, *and* DEC VAX, out of the box. It ran DOS stuff and it ran Windows 3 apps better than Windows 3.

        (No 64 kB heap limit. Win3.0 had one 64 kB heap for all apps and the OS. Win3.1 had three of them. Under NT, each 16-bit app got its own. So my clients could run 16-bit Excel on a 1280x1024 screen and load 11 large spreadsheets at once. That screen size _alone_ would crash WfWg 3.11.)

        > but development tools for OS/2 were a lot nicer.

        They ought to be, for $3000.

        https://www.theregister.com/2024/02/20/microsoft_os2_2_0_beta/

        MS ones were maybe limited but they were basically free.

        1. JoeCool Silver badge

          Re: PS/2 : do you mean

          And I developed telecoms products on both.

          Not deploying into a desktop environment, not needing account management, not needing windows support.

          If those are needed for the sanity of your job, then yes it's a significant gap.

          Although I recall OS/2 3.0 was also very good at running Windows apps, including giving them larger memory spaces and dedicated process space.

          And as a product platform OS/2 was a very stable, easy to use development & deployment system.

          Workgroups I'll give you (although Netware was still the best LAN user login system at that point), but the NT 3.1 LAN Manager was pretty frustrating to use, including the crude settings GUI and the poorly documented device configuration management (see: modems and serial ports).

          Netware died with DOS (I was a developer using that too), or more specifically, with the arrival of NT.

          I recall OS/2 had excellent integration with Netware; that was something we lost going from OS/2 to NT, as the Netware client was a Windows program that didn't work so well on NT.

          VAX? DECnet? Yes, for both of the developers who cared (and I was one of them). But that's like claiming that OS/2 had the advantage of copious MVS connection options.

          At that point anyone still on VMS was looking at how to escape.

          That $3000 cost got addressed by the time of OS/2 3.0. The compiler and dev env cost a few hundred dollars (and I wasn't paying out of my own pocket in any case), and for about $500 you could order the entire IBM dev docs library on paper. Ten volumes, as I recall. That's invaluable in a dev environment.

          The MS tools and admin interfaces were indeed limited, and time is money.

    2. anonymous boring coward Silver badge

      Re: Jokes...Or Maybe Not Jokes.....

      OS/2 was really nice. Especially compared to the MS alternatives of that time period.

      MS only abandoned it for business strategy reasons.

      1. Liam Proven (Written by Reg staff) Silver badge

        Re: Jokes...Or Maybe Not Jokes.....

        > MS only abandoned it for business strategy reasons.

        If you call "not going bankrupt" a business strategy reason, yeah.

        1. anonymous boring coward Silver badge

          Re: Jokes...Or Maybe Not Jokes.....

          They abandoned it in the early days. If they had thrown resources behind it, it would have developed nicely into something great. So you can't extrapolate from what OS/2 became to what would have happened if MS had stuck with it.

  14. TReko Silver badge
    Coat

    Intel 432

    Liam can't list all the 32-bit failures, but the Intel 432 deserves a mention.

    https://www.theregister.com/2004/02/17/who_sank_itanic/

    The T9000 Transputer chip probably does as well. Not to be confused with the T900, which got Terminated.

    1. Liam Proven (Written by Reg staff) Silver badge

      Re: Intel 432

      > the Intel 432 deserves a mention.

      It _got_ a mention. I wrote about it last year. See the Bootnote here:

      https://www.theregister.com/2023/05/25/intel_proposes_dropping_16_bit_mode/

      1. TReko Silver badge

        Re: Intel 432

        my apologies

  15. Alan Bourke

    Ah Concurrent DOS

    Fond* memories of setting up cheapo star networks with Wyse terminals hanging off a 16-port serial board. And of course C\DOS didn't run a lot of MS-DOS software, like WordStar, properly.

    *not fond

  16. anonymous boring coward Silver badge

    Love the Talking Heads reference.

  17. anonymous boring coward Silver badge

    "In the 1990s, the hitherto dull and businesslike Intel computers, which had been office equipment up until then, about as exciting as a staple remover, caught up. They gradually gained the fancy graphics and multimedia features that had previously been limited to proprietary systems."

    What proprietary systems were those?

    I'm trying to recall what those would have been, apart from simple gaming consoles (with unimpressive graphics).

    1. Liam Proven (Written by Reg staff) Silver badge

      [Author here]

      > What proprietary systems were those?

      The Amiga and the ST mainly. Secondarily, Acorn Archimedes and RISC OS.

      1. anonymous boring coward Silver badge

        Oh, those "fancy" graphics. I was thinking of 3D accelerators, etc., which I had in the early 90s on my PC.

  18. Plest Silver badge
    Pint

    Here's to my dad!

    I'll be forever grateful to my Dad, just a sports centre tech manager by trade, who took the family on a little car ride to Manchester in Oct 1987 to go and get an Amstrad ECD 1640 PC. I was happy enough with my Amstrad CPC, but I quickly learned how PCs worked, started hacking stuff and learning x86 assembler. Five years later I was making three times what my dad was making, working as a PC techie. When most people had barely started using computers, we had a PC at home with DOS 3.2, and I was learning valuable skills I still use to this day, skills that have allowed me to stash away all that great pay and will hopefully see me retire before I'm 60.

    My dad is still alive and kicking, aged 84, he still builds his own PCs, I'll be forever grateful for his foresight in bringing this vital tech into our family's lives.

  19. HammerOn1024

    Forte'

    "And you may ask yourself, 'How do I work this?' And you may ask yourself, 'Where is that large computer?'"

    Clearly, 'Once In A Life Time' is not your forte'.

    :-)

  20. Andrew Scott

    netware

    Netware 286 was a bit scary to get running. Netware 3 and above were cinches by comparison. There were new things to get used to, but it was really pretty easy compared to v2.15 etc. The 286 versions also had a limit due to the segmented architecture of the 286, and depending on the hardware you used, you might find the service processes were limited due to the heavy use of a 64K segment. I understand that it was fixed in 2.2, but I never used that.

  21. AlanSh

    PCs are in my blood

    I've been "into" PCs ever since the first IBM PC with twin floppies. And the Compaq luggable (I had one of the first in the UK). Who remembers replacing the CPU with the NEC chip that ran 20% faster?

    I've tinkered with Command.com, done wizard stuff with DOS internals, and written some pretty good s/w for my company (DEC) when they had PATHworks on VMS and it ran really slowly due to following MS restrictions. One of those was that, for a .BAT file, command.com opened the file, read a command, closed the file, executed the command, then reopened the file, found the next line, read it, and closed it again. Open and close on a VMS system (where the networked .BAT files were stored) was very, very slow. So, in a networked PATHworks environment, it could take 10-20 minutes for the clients to execute their login commands. I wrote some code and got that 20 minutes down to about 5 seconds, just by keeping the files open.
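
    For anyone curious, here is a minimal C sketch of the idea, not the actual command.com or PATHworks code; the file name and the execute() stand-in are made up for illustration. The slow version mimics reopening the batch file for every line, which is cheap on a local disk but a network round trip per command on a remote volume; the fast version keeps one handle open for the whole run.

      /* Minimal sketch only: the slow path reopens the batch file for
         every command, the fast path opens it once and keeps it open. */
      #include <stdio.h>

      static void execute(const char *cmd)      /* stand-in for "run this command" */
      {
          printf("exec: %s", cmd);
      }

      static void run_batch_slow(const char *path)
      {
          long offset = 0;
          char line[256];
          for (;;) {
              FILE *f = fopen(path, "r");       /* open...                          */
              if (f == NULL)
                  return;
              fseek(f, offset, SEEK_SET);       /* seek back to where we left off   */
              if (fgets(line, sizeof line, f) == NULL) {
                  fclose(f);                    /* end of file: done                */
                  return;
              }
              offset = ftell(f);                /* remember position for next time  */
              fclose(f);                        /* ...and close, every single line  */
              execute(line);
          }
      }

      static void run_batch_fast(const char *path)
      {
          char line[256];
          FILE *f = fopen(path, "r");           /* one open              */
          if (f == NULL)
              return;
          while (fgets(line, sizeof line, f) != NULL)
              execute(line);
          fclose(f);                            /* one close, at the end */
      }

      int main(void)
      {
          run_batch_slow("login.bat");          /* N opens/closes for N commands */
          run_batch_fast("login.bat");          /* 1 open, 1 close               */
          return 0;
      }

    Over a fast local filesystem the two behave almost identically; the difference only shows when each open and close is a network operation, which is what the 20-minutes-to-5-seconds improvement above was all about.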

    And yes, my son has had his own PC since he was 6, in 1990. He would play games (Sonic the Hedgehog was a favourite) while I coded next to him.

  22. MDMAok

    My company built PCs back when it was possible to import components from Taiwan, build PCs and sell them to major corporates. I remember the IT manager of one such company telling me that, although our machines were more reliable than IBM's or Compaq's (we did a 48-hour soak test on each before we shipped; not difficult), he had been instructed by head office to buy only IBM or Compaq at about a 50% premium. Oh well. Game over.

    At home, I had a PS/2 Model 80 with 4 MB of RAM and a Micro Channel graphics and network card. I thought it was miles ahead of Windows. But nobody cared.
