Sweet 16 and making mistakes: More of the computing industry's biggest fails

Welcome back to The Reg FOSS desk's roundup of the slip-ups and missteps from the dawn of the microcomputer industry onward – at least those that are most memorable to us. In part 1 of this feature series, we took a look at some of the missed chances of the early era of mass-market microcomputers: the eight-bit machines. This …

  1. Doctor Syntax Silver badge

    "it was clear that what buyers really wanted was not multiple apps at once, but the point-and-click ease of a GUI."

    Or possibly they wanted reliable storage. Sinclair had got away with cheaping out on build quality with the Spectrum and obviously thought they could do that again.

    1. Flocke Kroes Silver badge

      Re: QL storage

      I did not have any problems with microdrives, but part of that was that I barely used them. The data interchange standard of the time was the 3½" disk, and you could get a memory expansion combined with a floppy disk controller that would work reliably if you upgraded the 5V regulator and wedged in a heat sink for it.

      1. An_Old_Dog Silver badge

        Re: QL storage

        ...a floppy disk controller that would work reliably if you upgraded the 5V regulator and wedged in a heat sink for it.

        Soldering on your comp's PCB (and likely voiding the warranty) may be an acceptable solution for you and me, but not for mom-and-pop consumers.

        Were there retailers/repair shops which offered such a service to the public? And, would the general public accept the need for such mods?

    2. TonyHoyle

      In the sinclair market most people just wanted games.

      They tried with the QL.. it had passable word processors etc., and you could get versions with proper keyboards and phone integration (ICL One Per Desk / Merlin Tonto), but they just weren't seen as business computers... everyone just bought a PC instead.

      1. werdsmith Silver badge

        In the sinclair market most people just wanted games.

        I think it was the C64 that became the de facto games console with a keyboard.

        Sinclair users who had started with the ZX81 were more often programmers, because the ZX81 was absolutely hopeless at games, so programming was its only real use. Those ZX81 users who moved on to the Spectrum also included a lot of programmers, hence the massive cottage industry that sprang up and then collapsed. The Spectrum was, of course, more a games machine, absolutely not optimised for any business use.

        The QL would have been a decent office machine with its Psion software suite, if only they hadn't compromised the keyboard. A monitor was required of course, but an office machine wouldn't use a TV RF input anyway. But it was the Amstrad that stole that market in the UK up to 1990. Good keyboard, monitor included, working storage media.

        1. gw0udm

          The QL has a TV modulator and RF output. Quite a strange choice for a business machine, although it does have an RGB port too

      2. mdubash

        Friend of mine (at the time) wrote an entire book with a QL. In fact, he waited until he could get one - about two years - before he started actually writing it.

    3. Dan 55 Silver badge

      Well you couldn't even get multiple apps at once on a QL, not in its released form. Yes it could do multitasking if you wrote the assembly language program to do it but then again you could do something approximating multitasking on the Spectrum too if you wrote the assembly language program to do it.

      It had higher resolution graphics but it wasn't supplied with a monitor and users could get by with lower resolution graphics on a TV. It didn't have a GUI or mouse. It came with decent office software, but you could also find similar software for the 8-bits. It had a better BASIC but then again BBC BASIC was also pretty good. There were 8-bit computers with a better keyboard if you wanted one for office work.

      So it didn't offer that much new over the 8-bits, apart from the built-in tape loop shredders. And Sinclair compromised quality by pushing to get it released in 1983, even though (since the Spectrum) machines were no longer named after their year of release. In the end they released a compromised computer in 1984, and people could see it was late and had problems with ROMs... Not a compelling purchase.

    4. Michael Strorm Silver badge

      Multitasking or the lack of a GUI had nothing to do with the QL's failure. No-one expected a GUI at that price (£399) back then, and the Mac was US$2500 in 1984 money (many times the price) when launched. They weren't competing.

      The Mac itself wasn't a success early on (in part due to its price), and is likely completely irrelevant to the failure of the QL which flopped due to its own flaws.

      The QL failed because it was announced way too early, rushed to market, put on sale before the design and OS was even complete and launched full of bugs and flaws, sealing its reputation.

      It failed because they aimed it at businesses, which would have been less tolerant of its unreliability and lack of PC compatibility, rather than at hobbyists.

      Duplication and supply issues with Microdrive carts also made commercial support a problem.

      The QL failed for many reasons, but the Mac wasn't one.

    5. Liam Proven (Written by Reg staff) Silver badge

      > Sinclair had got away with cheaping out on build quality with the Spectrum

      It *did* get away with it. I had an Interface 1 and Microdrives on my Spectrum and it was so much better than cassette tape, it was transformative.

      I think people forget that a floppy interface cost _more_ than a whole low-end computer costs these days. And then a drive was as much again.

      1. munnoch Bronze badge

        The thing that was really transformative about the Interface 1 which I never see mentioned is the network interface, ZX Net. You could LOAD and SAVE programs over the network to another spectrum, in fact up to 64 of them could be daisy chained. The syntax was a bit ugly but it worked. I believe the QL had the same port?

        My first development rig was two spectrums side by side, one ran the assembler (source saved to microdrive) and once you were ready you squirted the binary over to the other one to run it. Later on we started using PC-based assemblers with a custom parallel link to do the squirting because, you know, reliable storage...

  2. HorseflySteve

    Sinclair QL wasn't 16 bit

    It used the 68008, a 32-bit internal MCU with an external 8-bit data bus, which a colleague of mine described as a bit like an eight-litre engine with a 1/4-inch carburettor.

    It was called the Quantum Leap because it leapt from 8bit directly to 32bit internally.

    So, take your choice, it was either 32bit or 8bit but not 16bit.

    Either way, it was still a flop, though lots of interesting things came about because of it.

    1. werdsmith Silver badge

      Re: Sinclair QL wasn't 16 bit

      The Sinclair advertising in the 80s made big news of its 32 bit internals, comparing some computing tasks (which were not RAM bound) and claiming huge gains.

      The QL lives on; the hardware has been re-engineered and there are several versions. They are manufactured and released in sporadic batches, for example the Q68 SBC.

      For the original QL there are a number of hardware mods still available including interfaces for SD card storage.

      The operating system continued to be developed independently and now manifests as Minerva and also SMSQ/E, available on hardware and emulators. There is an active community now around the QL legacy.

      Ultimately Sinclair's pitch was wrong, but as for needing a GUI, Amstrad seemed to do OK, managing to string out Z80-based 8-bit machines for a few more years simply by having an integrated keyboard/monitor offering with a text interface. Over 3 million sold. There was a market for a machine aimed at office-type tasks.

      1. keithpeter Silver badge
        Windows

        Re: Sinclair QL wasn't 16 bit

        Are we talking the green screen Amstrad PCW?

        Absolutely, wordprocessing/spreadsheeting (the spreadsheet was 3rd party, I dimly recollect) with a package including printer. We had rooms full of those in college libraries. A relative ran a theatre box office off one. Those funny 3-inch disks in cases were expensive but reliable. This was the first small (i.e. non-mainframe) computer I used a lot, although I never bought one.

        A *product* with a clear use.

    2. 45RPM Silver badge

      Re: Sinclair QL wasn't 16 bit

      It leapt from 8bit to 8bit - hence the name Quantum Leap (reflecting that it was the smallest possible change - from quantum, the minimum amount of any physical property involved in an interaction). There are advantages to the 32bit nature of some of (not all of) the 68008's internal components, but speed isn't significant amongst them - and most (if not all) can be replicated through clever programming and paging on a 'true' 8 bitter.

      Still, no doubt Linus Torvalds learned a lot from his ownership of one, perhaps we can indirectly thank the QL for Linux - which was a hell of a leap, and nothing quantum about it.

      1. John Brown (no body) Silver badge

        Re: Sinclair QL wasn't 16 bit

        "which was a hell of a leap, and nothing quantum about it."

        Linux wasn't revolutionary or any real sort of leap, other than that it was free and open source[*]. Linux was at best an evolutionary step. It's made many more of those steps since, but it was by no means a leap of any technological kind at the time. I tried various Linuxes back in the day, but eventually settled on FreeBSD, which I still use now, so I'm not dissing Linux, just being realistic about what it was and has become.

        [*] That in itself has become a huge factor over time, but even back then it wasn't all THAT revolutionary. There was already Public Domain software, which had been around for years and often came with, or at least let you request, the source code. Not to be confused with Shareware, a whole other beast, but often all lumped together to the extent that casual users didn't know the difference.

    3. Michael Strorm Silver badge

      Re: Sinclair QL wasn't 16 bit

      The QL claimed to be 32-bit, which it was (partly) internally.

      And to be fair, if one considers it 8-bit due to the data bus, then remember that the early versions of the "16-bit" IBM PC all used the 8088, which also had an 8-bit data bus, so the same applies there.

      1. AndrueC Silver badge
        Boffin

        Re: Sinclair QL wasn't 16 bit

        The Z80 had 16-bit registers and was capable of 16-bit arithmetic.

        ..programmatically only though. Internally it was most definitely 8-bit. Except that it had a 4-bit ALU so perhaps that makes it a 4-bit CPU?

        Only joking.

    4. Liam Proven (Written by Reg staff) Silver badge

      Re: Sinclair QL wasn't 16 bit

      > Sinclair QL wasn't 16 bit

      Oh my.

      Yes, and this is also why the Atari ST was so branded: *S*ixteen/*T*hirty-two.

      I should have realised I'd resurrect 1980s platform advocacy, shouldn't I?

      1. HorseflySteve

        Re: Sinclair QL wasn't 16 bit

        Not so much platform advocacy, but it depends what you mean by 8/16/32 bit.

        From a processor internal architecture point of view, the QL was 32-bit, but I could argue that, as the Z80 organised its 8-bit registers into pairs for some operations (even the A and F registers, though that was 'undocumented') and had 2 'prefix' instructions to enable alternative instruction sets, making them effectively 16-bit instructions, the ZX Spectrum was a mixed-mode 8/16-bit computer by that definition.

        Both had a physical data bus that was 8 bits wide, to allow use of the cheaper memory devices of the time.

        My point was that nobody described the QL as 16-bit, as neither its processor architecture nor its data bus width matched that description.

        1. Liam Proven (Written by Reg staff) Silver badge

          Re: Sinclair QL wasn't 16 bit

          > My point was that nobody described the QL as 16bit

          Yeah they did. And the ST, and the Amiga, and the Mac.

          I googled and found half a dozen links on the first page.

          The 68000 is generally regarded and described as a 16-bit chip, and machines based on it (and its variants) as 16-bit computers.

          The 68020 was the first fully-32-bit 680x0 in my book and I suspect in the books of the few of us old enough to remember and spoddy enough not to have grown out of caring.

          BTW, it may amuse you to know that one of my colleagues on El Reg read this and messaged me to say "hey, the QL was 32-bit!" in the last hour. :-)

          6800, 6809, Z80: 8-bit.

          68000, 80286: 16-bit.

          68020, 68030, 68040: 32-bit.

          80386, 80486 etc.: 32-bit. Yes, even the 80386SX. No, it is not fair. Life is not fair.

          1. eldel

            Re: Sinclair QL wasn't 16 bit

            You forgot the iAPX 432, Intel's first 32-bit processor. I was trying to write software for that in 1982. No actual compilers were available, just an assembler. A complete and total dog.

            1. Michael Wojcik Silver badge

              Re: Sinclair QL wasn't 16 bit

              The '432 was in some respects a good idea — a capability architecture, which offers significantly better security and robustness (because of reducing undefined behavior) for software. It just wasn't feasible to implement at the time, and Intel decided to make too many radical departures from what it knew how to do. Capabilities were good, but a pure stack architecture with no registers didn't make much sense unless you were courting the Forth market; doing fancy memory management at the microcode level is clearly (with hindsight) the wrong approach; and so on.

              1. Torben Mogensen

                Re: Sinclair QL wasn't 16 bit

                (About the 432). It didn't help that Intel decided to use bit-level addressing.

          2. Anonymous Coward
            Anonymous Coward

            Re: Sinclair QL wasn't 16 bit...nope, wrong on 68k family

            This is how it actually was. Working on the bare iron.

            As long as all address pointers were 32-bit clean (no sticking stuff in the top 8 bits like the original MacOS 64K ROM Memory / Resource Managers did), code written for the 68000 in 1984 could run unmodified on all future 68K processors and emulators. As all i/o was memory mapped, the main difference between the 68000 and 68008 was that the 68008 took twice as many clock cycles to read / write. That's all. Assuming the standard number of wait states for the RAM on the motherboard, of course.
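            (For the curious: "sticking stuff in the top 8 bits" looked something like the sketch below. The names are made up, but it's the trick the old 24-bit-era Memory Manager played, and exactly what stopped working once all 32 address lines meant something.)

                #include <stdint.h>

                /* 24-bit-era pointer tagging, sketched: with only 24 address lines
                   the top byte of a 32-bit pointer was "free", so flags hid there.
                   Once the full 32 bits hit the bus, the masking becomes corruption. */
                #define FLAG_LOCKED 0x80000000u            /* hypothetical flag bit */

                static uint32_t tag(uint32_t addr)     { return addr | FLAG_LOCKED; }
                static uint32_t untag(uint32_t tagged) { return tagged & 0x00FFFFFFu; }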

            For 68k asm programmers, the CPU model you wrote for on the 68040 was exactly the same as on the 68000. For 99%+ of your code. Same regs, same instructions. The only real difference was the added FPU/MMU instructions, and low-level exception handling was much more complex. Although you could roll your own soft-coprocessor with the '20 and later. So add your own instructions. The other use of F-traps. Not just MacOS API calls.

            Unless you were doing system-level code, the 68040/60 instruction set was pretty much the same as the 68000's. The initial instruction set was so well architected.

            So.

            68000 was 32-bit address regs, 32-bit data regs / 16-bit data bus, 24-bit address bus

            68008 was 32-bit address regs, 32-bit data regs / 8-bit data bus, 20-bit address bus (22-bit in the PLCC package)

            68010 was 32-bit address regs, 32-bit data regs / 16-bit data bus, 24-bit address bus - handled address exception continuations correctly, had barrel shifter

            68020 was 32-bit address regs, 32-bit data regs / 32-bit data bus, 32-bit address bus - full support for coprocessors, MMU etc

            68030 was 32-bit address regs, 32-bit data regs / 32-bit data bus, 32-bit address bus - on-board MMU, instruction cache

            68040 was 32-bit address regs, 32-bit data regs / 32-bit data bus, 32-bit address bus - on-board MMU / FPU

            The width of the data bus only affected data move clock cycle counts, and the 24/32-bit address bus was only of interest to hardware designers.

            Based on way too many thousands of hours in 68k asm land from 1984 to late 1990's.

        2. joeldillon

          Re: Sinclair QL wasn't 16 bit

          Early 68ks - I assume the 68008 too - had a 16-bit ALU. Yes you can do 32-bit integer operations on them, but it'll take twice as long; internally they are actually and for real 16-bit, even if the ISA isn't. This did get fixed later on, of course.

          1. Anonymous Coward
            Anonymous Coward

            Re: Sinclair QL wasn't 16 bit...DIV's and MUL's in 68000/08

            The 68000/08 wasn't a 16-bit ALU. It was a 32-bit ALU.

            You could do 32-bit ADD/SUBs, but for MULs you could only do 16-bit * 16-bit, and for DIVs you could only do 32-bit / 16-bit. For obvious reasons. So it was ADD.L and SUB.L but only MUL.W and DIV.W. Which is why the MacOS had a LongMul and LongDiv trap in 1984. Quite separate from the Fixed Point and SANE floating point calls.

            There again, with the 68000 the cost of MULs and DIVs was so expensive, up to 120+ clock cycles, that you just made sure all multiply / divide ops were by powers of two so you could bitshift rather than use MUL/DIV. It was not till the 68020/30 became really common, with one result every < 10 clocks (give or take), that you stopped using bitshifts for multiply / divide to get huge speed gains in mul/div-intensive code. I once rewrote some Bezier curve display code for the 68000 that went from one frame per 5+ secs to 1/10+ sec per frame just by replacing all the muls/divs with bitshifts. A trick that can still work if you need one result per clock rather than every other clock, which is still the case with some processors.
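            In C terms the trick is a one-liner; a minimal sketch (the function names are made up, the cycle figures are the 68000 ones from above):

                #include <stdio.h>

                /* Shift instead of MUL/DIV when the factor is a power of two.
                   On a 68000 a MUL or DIV could cost 120+ clocks; a shift, a
                   handful. Any decent compiler does this for you now, but in
                   hand-written 68K asm you did it yourself. */
                static unsigned mul_by_8(unsigned x)  { return x << 3; } /* x * 8  */
                static unsigned div_by_16(unsigned x) { return x >> 4; } /* x / 16 */

                int main(void)
                {
                    unsigned x = 12345;
                    printf("%u * 8  = %u (shift gives %u)\n", x, x * 8,  mul_by_8(x));
                    printf("%u / 16 = %u (shift gives %u)\n", x, x / 16, div_by_16(x));
                    return 0;
                }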


            1. jotheberlock

              Re: Sinclair QL wasn't 16 bit...DIV's and MUL's in 68000/08

              No it really wasn't - http://www.easy68k.com/paulrsm/doc/dpbm68k1.htm 'Three Arithmetic Units' section.

              The fact you can do 32 bit operations in the ISA does not mean they are actually being done all in one go. Microcode is driving the operation through the ALU twice, hence it takes twice as long to do those operations. It's in the section I just linked - 'The MC68000 also operates on 32-bit data. This is usually done by taking two passes of 16-bit data, one for the lower word and one for the upper word. This is reflected in the execution time of many 16- and 32-bit instructions.'

              1. Anonymous Coward
                Anonymous Coward

                Re: Sinclair QL wasn't 16 bit...DIV's and MUL's in 68000/0.. we are talking real world here

                It was a 32-bit ALU if you were actually writing 68K code for the 68000 in the 1980's. Which I guess you weren't. You know. Writing ADD.L and SUB.L instructions.

                Shipped a product with a good 100K plus exe of hand coded 68K asm for the 68000. In 1986. It did a lot of 32 bit integer math stuff. Being a compiler. And yes the mul and div language prims did the > 16 bit test for numerator / denominator to decide if it was simple asm or call the ATrap.

                I don't know what doc you linked to, but this is what we actually used at the time. The official Moto manual.

                https://archive.org/details/M68000_16_32-Bit_Microprocessor_Programmers_Reference_Manual_4th_Edition/mode/2up

                Still have my really beat-up copy somewhere. A freebie from Apple. In 1984.

                I still have my work copies of the 68020/30/40, 68881/2, and 68451/851 programmer manuals too. All used at some time or other to write shipped code. Keep them for old times' sake. Given just how much time I spent digging through them (mainly for instruction timings) from the mid 1980's to the mid 1990's.

  3. Pete 2 Silver badge

    Honourable mention

    Since we're talking about 16-bit computers, I feel the need to mention the PDP-11.

    Though I wouldn't call it one of computing's mistakes. I did make my fair share of computing mistakes on several, though.

    1. Liam Proven (Written by Reg staff) Silver badge

      Re: Honourable mention

      Well, I mean, the PDP-11 led to C...

      And that's the single greatest mistake in the history of computers.

      [DARFC]

      1. jake Silver badge

        Re: Honourable mention

        C was first built on and for a PDP7, not a PDP11.

        1. Liam Proven (Written by Reg staff) Silver badge

          Re: Honourable mention

          > C was first built on and for a PDP7, not a PDP11.

          Aww. Spoilsport. :-(

          1. Michael Strorm Silver badge

            Re: Honourable mention

            Even if it appears he's correct that it was originally *built* "on" a PDP-7, that's as far as it goes and doesn't tell us anything beyond how they bootstrapped the build/compilation. Nor does it make your original claim wrong.

            If this post is correct it was cross-compiled on the PDP-7 "for" (i.e. built to run on) the PDP-11 in the first place.

            But that's still a minor detail. The important thing is that- from the observations I made here (cribbed, in turn, from the Wikipedia article)- it's pretty clear that C *was* designed "for" the PDP-11- i.e. with that specific machine in mind- *because* B (the predecessor to C) couldn't take advantage of the PDP-11's new features.

        2. Michael Strorm Silver badge

          C was designed for the PDP-11, not the 7

          No, Liam is correct in this case.

          Whether or not it was first built on a PDP-7, C was designed specifically to take advantage of the new features of the PDP-11 that its predecessor B didn't support.

          I remember this because I looked it up and already posted a reply pointing this out to you when you made the same claim less than a week ago(!):-

          https://forums.theregister.com/forum/all/2024/08/23/build_your_own_pdp11/#c_4918391

          Ironically, this all stemmed from me quoting Liam's original claim from an article years ago... The same one he made above which *was* correct in the first place.

          1. This post has been deleted by its author

          2. John Brown (no body) Silver badge

            Re: C was designed for the PDP-11, not the 7

            "Whether or not it was first built on a PDP-7, C was designed specifically to take advantage of the new features of the PDP-11 that its predecessor B didn't support."

            Was there ever a D? ISTR an E, but that was entirely unrelated AFAIK. Or did they just get bored with that naming convention and go directly to C+, C++, C# etc. :-)

            EDIT: Just found there is a D programming language, but it's based on C++ and not directly related to C or its devs.

        3. Doctor Syntax Silver badge

          Re: Honourable mention

          The first Unix was certainly written for the PDP7 but was that in a high level language of any sort as opposed to assembler - or even raw machine code?

          1. Peter Gathercole Silver badge

            Re: Honourable mention

            Not only was the PDP7 version of UNIX written almost completely in assembler, so were very large parts of the original PDP11 version of UNIX.

            This was re-factored in C around Edition 5/6 (IIRC), but even then there were sections of Edition 6 (particularly the hand-crafting of the "init" process and various kernel structures during startup) that were done in assembler. There are references to model-specific code in m40.a and m70.a to cope with the differences in systems with the 18-bit and 22-bit Unibus hardware, and the Separate I&D features found in 11/70 and 11/44 systems. Most later systems were based around F11 and J11 microprocessors, and I don't know for certain whether they handled 22-bit addressing the same as the 11/70, or whether it was different because of the Qbus.

            I had to pull this all apart and hack it a bit when getting Edition 6 running on 'my' SYSTIME 5000E, a PDP11/34E-based system but with the 11/70's 22-bit addressing grafted on by SYSTIME. By the time Edition 7 came out, the amount of non-C code was very much reduced (although not eliminated), which along with the Portable C Compiler meant that UNIX became much easier to port to other hardware.

        4. Dan 55 Silver badge

          Re: Honourable mention

          It seems that the PDP-7's B compiler was ported from assembly to B, altered to be a cross-compiler and generate PDP-11 machine code, and then modified into the C language. So it was built on a PDP-7 for a PDP-11.

          Source: DMR's C history - the paragraph above the "Embryonic C" heading.

      2. Bebu
        Windows

        Re: Honourable mention

        Well, I mean, the PDP-11 led to C...

        And that's the single greatest mistake in the history of computers.

        Funny you should mention that.

        The deciding consideration in whether I purchased a PC or an Amiga was whether I could get an inexpensive C compiler in AU for either (back then almost all software in AU was imported, markups were high and a 33% sales tax applied). The compilers for the Amiga were difficult to obtain (direct import) and expensive, but I could get a copy of Walter Bright's Datalight C from a local vendor at a reasonable price (one that included the runtime library sources), which sealed it for the PC (a clone with a NEC V20 CPU, I think).

        Oddly the first compiler I purchased was a Modula-2 compiler (Dave Moore's FTL, [Čerenkov Software*]) which I used on work machines to build some tools.

        Pretty much the history since - the software, no matter how crappy, dictates the systems and hardware that is acquired.

        * A droll reference to their Faster Than Light (FTL) M2 compiler. A lot more physicists in the game back then.

        1. joeldillon

          Re: Honourable mention

          My first compiler was Modula-2 as well (and looking on my bookshelf, it was indeed FTL Modula-2 for the Atari ST) - because it cost 50 quid and commercial C compilers cost twice that. Remember when compilers cost money? :)

      3. Locomotion69

        Re: Honourable mention

        See it in a positive way: there is room for another set of articles: OS mistakes, application mistakes, ....

        Keeps you busy :)

      4. Dan 55 Silver badge

        Re: Honourable mention

        And that's the single greatest mistake in the history of computers.

        People needed operating systems, and no other language around between 1970ish and 1995ish was as portable or made it as easy. The closest was Pascal, and it lost that battle.

        As for me, HiSoft C on the Spectrum and SAS C on the Amiga got me to uni and through uni.

        1. Richard 12 Silver badge

          Re: Honourable mention

          I'm pretty sure Pascal lost entirely due to braces.

          Begin ... End takes longer to type and uses four times as much memory as {...}.

          Back then, both of those things really mattered.

          1. Ken Shabby
            Trollface

            Re: Honourable mention

            It was the semicolons (that and the ability to do anything useful)

            1. John Sager

              Re: Honourable mention

              I wrote a whole comms concentrator in Pascal in the 80s, along with an RSX11 driver in assembler. That worked for about 20 years, I think. RSX11T was a nice embedded platform. But then I moved on to programming in C/C++ and I've never touched Pascal again.

          2. richardcox13

            Re: Honourable mention

            > I'm pretty sure Pascal lost entirely due to braces.

            And its need for non-standard extensions to do anything useful.

            Standard Pascal cannot do something as simple as prompting for a filename and then reading that file.
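            For contrast, that everyday task has always been expressible in portable C with nothing beyond stdio; a minimal sketch:

                #include <stdio.h>

                /* Prompt for a filename and dump the file - the task the post
                   above notes standard Pascal could not express portably. */
                int main(void)
                {
                    char name[256];
                    printf("File to read: ");
                    if (scanf("%255s", name) != 1)
                        return 1;

                    FILE *f = fopen(name, "r");
                    if (f == NULL) {
                        perror(name);
                        return 1;
                    }

                    int c;
                    while ((c = getc(f)) != EOF)
                        putchar(c);
                    fclose(f);
                    return 0;
                }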

            1. Richard 12 Silver badge

              Re: Honourable mention

              Back then that was fixable; it's not as if C was actually standardised at the time either.

              Even while I was at Uni we were taught a toolchain-specific dialect of C because despite there being a Standard by then, nobody actually fully complied.

              That said, I never used Pascal in anger until Delphi, by which time it was already long dead. I don't think I even have access to that toolchain anymore, not without setting up a VM, anyway.

              1. Peter Gathercole Silver badge

                Re: Honourable mention

                IIRC, the main part of C that was defined in K&R edition 1 was pretty well implemented in almost all compilers (at this level, C is actually a pretty simple language). What was really missing was the variety of library and system calls that came along with UNIX. Many of these just weren't there in C implementations on non-UNIX platforms. And as these make C useful without having to re-invent the wheel, this meant that programs written in C were not as portable as people thought.

                I believe that the DECUS C compiler came with an implementation of the C library just to allow code to be more portable.

            2. ButlerInstitute

              Re: Honourable mention

              I used Pascal for work from 1985 to 2001. (Oregon Pascal-2)

              Yes Pascal needed extensions for all real work, as it had been designed for education.

              And as there were no standard extensions, all implementations were different, and differed in subtle ways, making translation difficult. Eg the precise definition of type compatibility between overlapping integer subrange types.

              All C was similar enough that it didn't bite you as much.

          3. Peter Gathercole Silver badge

            Re: Honourable mention

            Not in my memory.

            In my mind, it was Pascal's strictness, which is something that Wirth himself insisted on. Pascal as designed was supposed to be really strict with its data-typing and prototype declarations to aid the teaching of 'good' programming techniques, so much so that trying to write anything serious in standards-compliant Pascal was a pain in the neck! Trying to access addresses such as memory-mapped registers without there being a declared pointer type or the ability to cast a datatype was made deliberately difficult.

            We had OMSI Pascal on RSX-11 as a teaching language, and this was so standards-compliant that it was very difficult to use, much to the upset of the students who were using Turbo Pascal outside of their college work. OMSI did have the ability to embed some PDP-11 assembler in a program, and published in the documentation some methods of accessing memory using this, and of course you could link in code written in other languages using the standard RSX link protocols, but this was not part of standard Pascal, and bypassed almost all of the compile-time type checking!

            Of course many versions of Pascal had extensions which made it more useful, but these were almost all incompatible with other compiler's extensions.

            C, on the other hand, was designed to be fast-and-loose with its data typing to make the best use of the slow CPU speeds that early systems were inhibited by, which is now regarded as a handicap but at the time was a real strength. It also helped that C mapped nicely onto the PDP-11 ISA, making C -> machine code compilation a relatively simple operation in many cases.
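            To make the register-poking point concrete, here's a minimal C sketch; the address is hypothetical (a real one comes from the hardware manual), and volatile is the modern spelling of an idiom K&R-era compilers allowed implicitly:

                #include <stdint.h>

                /* Conjure a pointer to a fixed hardware address with a cast -
                   exactly the thing strict standard Pascal deliberately forbade. */
                #define STATUS_REG ((volatile uint8_t *)0xFF80u) /* hypothetical address */

                int device_ready(void)
                {
                    return (*STATUS_REG & 0x01) != 0; /* poll bit 0 of the status register */
                }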

      5. Anonymous Coward
        Anonymous Coward

        Re: Honourable mention

        Paraphrasing an old USENET quote from dusty memory: "There are two notable things to come out of UC Berkeley: the C programming language, and LSD. We don't find that to be a coincidence."

        1. Anonymous Coward
          Anonymous Coward

          Re: Honourable mention

          I remember it as BSD and LSD ..... I think C came from Bell Labs which was on the other side of the country in Noo Joisy

          1. Doctor Syntax Silver badge

            Re: Honourable mention

            Maybe the LSD was responsible for the confusion.

          2. jake Silver badge

            Re: Honourable mention

            BSD initially also came out of Bell Labs. All BSD was, was a series of patches and additions to UNIX.

            LSD initially came out of Switzerland.

            1. eldel

              Re: Honourable mention

              From the mouth of someone (not me) who claimed to have been involved in it - BSD was basically a collection of masters' theses compiled into an OS. As I was spending much time trying to get inter-process communication to run reliably at the time, I could well believe it.

              1. Peter Gathercole Silver badge

                Re: Honourable mention

                I remember looking through the BSD 2.8 (IIRC) release tape trying to decide which of the toys in it would be fun to play with, and which I could actually use on a non-I&D PDP11 (vi was one I couldn't because it was just too big!)

                I did get Ingres (the main reason we had the tape) working. The rest was just a bonus. Such things as more, vsh, curses, strings and a proto NLS system (you could extract the text strings into an indexed message file, primarily to save space at the expense of speed and a file descriptor for the message file, which in theory would allow you to have messages in more than one language for your program), a Pascal compiler (and probably some other languages I've forgotten), ex and many more I've since forgotten, some of which are now standard tools, but were not present in Unix Edition 7.

                The list was very extensive, and if it was the result of one or more PhD theses, then what did it matter?

                The one that was most useful (after Ingres) was the overlay loader (something that RSX11M did out of the box), and I actually did get some of the Ingres processes compiled with it, allowing me to reduce the number of processes and the amount of IPC that was required to get Ingres running. This enabled us to run a full lab of 12 terminals (all we had at the time, although we did have 17 RS-232 lines - one dedicated to the console) of Ingres sessions on a system that should only really have been able to support 2-3, but we did have faster CDC SMD disks and 2MB of memory, compared to the 256KB of a similar system we had when I was at Uni a few years earlier.

                One interesting aside is that in order to get the DZ-11 compiled in to the kernel to support 8 of the TTY lines into the system, I had to generate the kernel without the MT-11 tape driver, because even after shifting the kernel block device buffers out of the kernel address space (the Keele mods), there was just no space in the 56KB kernel address space to have them both! How things have changed.

          3. IvyKing Bronze badge

            Re: Honourable mention

            With respect to C, the first comment I heard about it at Cal was "an abomination in the eyes of the Lord", from someone who was fond of Pascal (FWIW, Wirth got his PhD at Cal). Another comment about UNIX on the PDP-11 at Evans Hall was "If Bell Labs hadn't invented the transistor, the phone company would still be using vacuum tubes." Some of the folks in the CS and EECS departments then went on to develop BSD, with a license that was very similar to the license used for distributing SPICE.

    2. jake Silver badge

      Re: Honourable mention

      My first home computer was a 16-bit LSI11-based Heath H11 ... in 1977.

      I had wanted a computer at home since I joined the Homebrew Computer Club in '75, and started saving my money. I knew I didn't need/want a MITS Altair 8800, IMSAI 8080, or similar. My Dad (who started working with computers in the 1950s) advised me against purchasing a PET, TRS-80, Apple II or any of the other toy 8-bit computers ... said they weren't very useful for my needs. So I made do with the Teletype Model 33 that Dad used for work, and I used to access ORVYL, the Stanford timeshare system, with the blessing of Dad's company (they were paying the bill for the extra telephone line).

      Then one day Dad came home from work with a back copy of Interface Age which had an advert for a 16-bit Heath H-11 and said "Now THIS is a worthwhile home computer!". So I bought a kit from a local guy who was fronting them for Heath (perk of living/growing up in the proto-SillyConValley). Xmas present to myself.

      We built most of it in my apartment in Mountain View, but for reasons I can't remember (better fume extraction?) we boiled the boards on Mom's stove ... she still hasn't forgiven us, even though I always left the kitchen cleaner than it was when we started ... and used my own pot.

      The entire world of software was freely available through DECUS, which I could access via a totally unofficial library at Stanford. Including most of the programming languages of the day. There were plenty (? ... subjective) of games, but I was never a gamer, so the few I copied were for friends and family to use.

      IMO, DEC kit was, and remains, the single best teaching environment for learning the concepts of computing and networking. To this day I use the concepts I learned from building and using (and upgrading) that box virtually every time I troubleshoot a computer. Shame the franchise was squandered away.

      1. GlenP Silver badge

        Re: Honourable mention

        We used LSI11-based computers for our small systems course at Uni (when the main teaching resource was an IBM 370!). They were a good introduction to assembly languages and to learning about just how things work, at a level that I don't think has been taught for many years. Professionally I've very rarely programmed low-level stuff - a few driver mods on PDP11/23 machines and that's about it - but the knowledge and understanding have stood me in good stead ever since.

      2. werdsmith Silver badge

        Re: Honourable mention

        I wasn't aware at the time, but when I was scraping together pocket money to buy discrete components for my radio projects, the owner of the little sole-trader electronics shop where I got my stuff was a well-known name within the Altair world in the UK. I found some information about him in some archive material during covid.

        Of that era there were people doing the UK101, which was a UK-licensed version of an Ohio Scientific machine. I put together a 6800-based SBC with 7-segment displays using pilfered components in my first job.

      3. ICL1900-G3 Silver badge

        Re: Honourable mention

        If only I had known... Never realised the Heathkit was DEC compatible. Bit late in the day now (!) but thanks for sharing. I really enjoyed those early 'micro' days, especially coming from a mainframe systems programmer background... compare and contrast!

      4. Doctor Syntax Silver badge

        Re: Honourable mention

        Heath - now you're really going back to a golden age. I'm in danger of breaking out in tears remembering the great years of Tottenham Court Road with Stern Clyne.

        1. Fr. Ted Crilly Silver badge

          Re: Honourable mention

          Proops of old, an Aladdin's cave of good things to be discovered and hurried home to tinker with...

          Tell the young 'uns about it (wistfully ofc) and they don't believe you.

      5. Martin Gregorie

        Re: Honourable mention

        I did something similar. Since at work I was programming and administering ICL 1900s, mostly running George 3, I wanted something a bit more capable than the micros available in 1979. So, since I was already handy with a soldering iron, I bought a kit containing a set of PCBs, a heap of chips, a nice case, two 5.25" floppy drives, an EEPROM programmer, a keyboard and a green monochrome monitor, and soldered them together to make a 64KB SS-50 bus based microcomputer running the Flex-09 OS on a 2 MHz 6809 with a 16x32 screen, and fitted with the two disk drives.

        I managed to debug the hardware well enough to run successfully, using only a multimeter and a 1-bit logic probe: I had previously done various electronic projects, including building my own transistorised stereo amp and a single-channel radio control transmitter/receiver pair, so I already had all the necessary tools for the job.

        Being used to 24 x 80 green screens at work, I almost immediately replaced the original 16 x 64 display card with one capable of displaying 80 x 24 on the same green TV, recoding its character generator to handle character graphics and rewriting its bootstrap code to match. I also made sure I had fitted enough RAM and EPROMS to entirely fill the 6809's 64K address space.

        I've still got the box: it booted OK the last time I turned it on

        This system ended up with compilers/assemblers etc. for Basic, C, COBOL, Forth and PL/9, and an excellent 6809 assembler. They all worked well. However the COBOL compiler is dog slow, though it seems quite robust. I usually programmed this box in 6809 assembler, PL/9 and C: I taught myself the latter from the original K&R "C" book.

        I eventually added another pair of floppy drives just because I could.

        1. Pete 2 Silver badge

          Re: Honourable mention

          > a 1 bit logic probe

          Also known as an LED.

        2. fromxyzzy

          Re: Honourable mention

          You might enjoy this: https://flexemu.neocities.org/

      6. Jamie Jones Silver badge

        Re: Honourable mention

        > ... and used my own pot.

        Shhh. It wasn't legal back then!

        1. jake Silver badge

          Re: Honourable mention

          a) The statute of limitations has long since passed.

          b) I never used the stuff, beyond sampling it once or twice in my late teens. All it does is knock me out, and I don't need help to take a nap.

          Would that be an example of me panning pot?

    3. Dagg Silver badge

      Re: Honourable mention

      Oh, I LOVED programming the PDP-11: 8 registers that you could do so many things with. MOV -(PC),-(PC) was one evil instruction.

      1. Peter Gathercole Silver badge

        Re: Honourable mention

        Nice, regular instructions that could be applied to any register.

        Even the PC and the stack pointer could be manipulated. Indeed, if you looked at how such things as PUSH, POP, JSR and RET were implemented in the generated machine code (if I have the mnemonics correct), they were just specific versions of the general register-manipulation instructions! Genius.
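        A C analogy for that point (a sketch, not PDP-11 code): push and pop are just the autodecrement and autoincrement addressing modes applied to SP, which map straight onto C's *--sp and *sp++ - hardly a coincidence, given the machine C grew up on.

            #include <stdint.h>

            /* The PDP-11 had no dedicated stack instructions: MOV R0,-(SP) and
               MOV (SP)+,R0 are ordinary MOVs using the general addressing modes. */
            static uint16_t stack[64];
            static uint16_t *sp = &stack[64];              /* the stack grows downward */

            static void push(uint16_t v) { *--sp = v; }    /* MOV v,-(SP)  */
            static uint16_t pop(void)    { return *sp++; } /* MOV (SP)+,Rn */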

    4. Scene it all

      Re: Honourable mention

      The big thing I noticed during this time was that the Intel 8088 and 8086 architectures (the instruction set that the programmer sees) were very clumsy. As though they were designed by hardware people who were not programmers. The DEC PDP-11 clearly was designed with programmers in mind, most noticeably in its clever addressing modes based around truly general-purpose registers. And the Motorola 68000 clearly copied some ideas from that. Then later the DEC VAX line instruction set was *heavily* designed for the convenience of programmers and also for small code size. (I know - I was there.) This was before the modern ideas around RISC designs that did simple things very fast.

      1. Bitsminer Silver badge

        Re: Honourable mention

        The 8080 was very clumsy indeed. I remember trying to wrap my head around the fact that it had no overflow bit for 8-bit arithmetic. Until I realized that it was an 8-bit binary computer, not an 8-bit twos-complement machine. Duhh.
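        For anyone who never had to do it by hand: with no overflow flag, a signed-overflow test has to be synthesized from the operand and result signs. A minimal C sketch of the check (the 8080 version would be a few instructions poking at the sign bits):

            #include <stdint.h>

            /* Signed 8-bit addition overflows iff the operands share a sign bit
               and the sum's sign bit differs from theirs. */
            static int add8_overflows(uint8_t a, uint8_t b)
            {
                uint8_t sum = (uint8_t)(a + b);
                return (uint8_t)(~(a ^ b) & (a ^ sum) & 0x80u) != 0;
            }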

      2. Peter Gathercole Silver badge

        Re: Honourable mention

        The Nat. Semi. 16032 ISA was almost a direct rip-off of the VAX-11 instruction set, by design. I don't know how they got away with it!

        But it could be that DEC decided that it was not really a threat, because of the number of bugs in the early implementations.

  4. Hans Neeson-Bumpsadese Silver badge
    FAIL

    My abiding memory of the QL was the construction of its keyboard. Back in the day, shops used to have examples of all of the current microcomputers on display, hooked up to monitors, for spotty little teenagers such as myself to go in and play with. I remember going into a local department store with a bunch of school mates and looking at the QL. One lad picked it up to look at the underside, and the keys literally fell out of the keyboard - we were scrabbling (no pun intended) to gather up the keys and reinstate them before any of the staff spotted us.

    1. werdsmith Silver badge

      It's actually quite hard to get the keycaps off them.

      1. Hans Neeson-Bumpsadese Silver badge

        This was when it first came out. I suspect they did something in later builds to rectify the problem.

        1. Liam Proven (Written by Reg staff) Silver badge

          > I suspect they did something in later builds to rectify the problem.

          I recall this problem, and its resolution, being discussed in relation to the ZX Spectrum Plus... but then, that used an adaptation of the QL keyboard design.

          1. DJV Silver badge

            Yes, the first Spectrum Plusses certainly had that issue. I was working in a small computer shop in Norwich when the first model arrived. While unpacking it, I turned it over to remove part of the packaging underneath and, when I righted it, many of the keys were still lying on the table!

        2. Peter Gathercole Silver badge

          ..later builds

          Probably glue!

  5. jonsg

    Ah, the Acorn Communicator!

    I was a member of that team in Acorn's Custom Systems division, and did the Econet code for it. (I have war stories!)

    The Communicator was a lovely piece of kit. It sold in surprising numbers to an unexpected niche: travel agents.

    At that time, a lot of travel bookings were done using Prestel, a dial-up viewdata service run by British Telecom. The Communicator was cheaper than PCs with (often ropey) modems, needed negligible maintenance, was hard to break, could be networked cheaply, and included word processing and a spreadsheet for not a penny extra. Slam dunk!

    Unfortunately, it didn't see much business outside the travel agent trade, and was discontinued. Perhaps it was the slightly last-tin-of-paint-in-the-shop case colour...

    1. Belperite

      Re: Ah, the Acorn Communicator!

      Ah, Econet, my first experience of computer networking as a child in school. A room full of Beebs, and a chunky Filer(?) and printer in the corner.

    2. Liam Proven (Written by Reg staff) Silver badge

      Re: Ah, the Acorn Communicator!

      > I was a member of that team in Acorn's Custom Systems division, and did the Econet code for it. (I have war stories!)

      This is quite interesting -- over on Mastodon a friend of mine offered that _he_ worked on the Communicator, too.

      That machine had a lot of effort in it. I bet it cost Acorn a packet.

      Thanks for the fascinating tidbit about usage in travel agents.

      About 25 years ago, when I was in a Prominent UK Travel Agency Shop, I fixed a flakey PC for them so they could book my travel for me.

      The thing logged into Netware 4 using the admin account... *ouch*

      ... using a hardcoded password...

      *bigger ouch*

      ... which was written directly into AUTOEXEC.BAT.

      *major ouch*

      Anyway. The woman I spoke with had been trained that if it went wrong, press Ctrl+Alt+Del. Long before the IT Crowd. There's worse advice.

    3. f4ff5e1881
      Meh

      Re: Ah, the Acorn Communicator!

      I seem to recall Acorn toyed with this kind of thing previously - the 'Merlin' - which was basically an Acorn Electron linked to a communications pack. I gather they were used in retail shops, with InterFlora being a notable user of the system (flowers were big back then, you know).

      I'd imagine the Communicator was just a natural evolution of the technology - pity the machine didn't reach a wider audience.

      1. jake Silver badge

        Re: Ah, the Acorn Communicator!

        Any relation to the Merlin Tonto, a rebadged ICL OPD (One Per Desk)? Basically a Sinclair QL without the 8049 peripheral controller but with a POTS communications system grafted onto it. They were built by ICL and sold by BT. In Oz it was known as the Telecom Australia Computerphone.

        The name Merlin seems to have come from the same place as BT's Merlin M4000 line (rebadged Logica VTS-2300 Kennet), although they are in no way related hardware wise. No relation at all to the contemporary AT&T Merlin systems.

        Tonto was short for "The Outstanding New Telecoms Opportunity".

        All the above was early/mid 1980s. The only reason I know they even existed is because a friend in England's company hired me to beat them into submission.

        Note that this has nothing to do with the mid-1970s TONTO, The Original New Timbral Orchestra, which is an entirely different subject, and some would say a much more interesting one.

        1. f4ff5e1881
          Boffin

          Re: Ah, the Acorn Communicator!

          A wayward cousin of sorts, as it were – the Electron-based incarnation was the BT Merlin M2105 - a dedicated communications terminal which was designed and manufactured for BT by Acorn, according to Chris’s Acorns. It seemed BT had a penchant at the time for this kind of microcomputer/communications pack funky fusion.

          https://chrisacorns.computinghistory.org.uk/Computers/BT_MerlinM2105.html

    4. Doctor Syntax Silver badge

      Re: Ah, the Acorn Communicator!

      "It sold in surprising numbers to an unexpected niche: travel agents."

      This goes back to what I said in the comments on part 1 - it was unknown territory for computers in this price range so nobody quite knew what would sell into what market.

  6. Antony Shepherd

    The Atari ST was the last pre-PC computer I had. I even had the fancy special black-and-white monitor that let you use the high-res (for its day) 640x400 mode. Think it was probably the BW monitor that helped a lot with the 'Jackintosh' nickname.

    After that it was just a bunch of generic beige box PCs until I switched to Macs in the early noughties with the G4 'dome' iMac.

    Some days I really miss the excitement and diversity in the olden days before everything became so homogenized.

    Some days I wish something other than Windows had become the mainstream standard.

    1. 45RPM Silver badge

      The 80s were a glorious rainbow of computers and operating systems. By the late nineties it was practically a monoculture, with MacOS the only holdout - and that was hanging on by its fingertips. Nowadays I think that we're in the healthiest situation since the 80s. We have Windows (and a plethora of PCs ranging from cheap and crap through to expensive and interesting), we have Linux, Android, macOS, iOS, we have Raspberry Pi, we have Haiku, we have ChromeOS - and that's before we consider all the really niche machines that are also available these days, like the Spectrum Next or The C64 Maxi.

      1. Belperite

        A large and interesting range of PDAs in the late '90s / 2000s as well, until everyone just ended up with generic-looking smartphone slates.

        1. Paul Kinsler

          A large and interesting range of PDAs

          I still use my old Zaurus as an alarm clock; I like the little chirp it makes. The battery (its second) is a bit long in the tooth, though... but I might have a spare spare somewhere....

          1. Doctor Syntax Silver badge

            Re: A large and interesting range of PDAs

            I'd forgotten that. I have one somewhere. Must dig it out and see if it can still take a charge.

      2. John Brown (no body) Silver badge

        "By the late nineties it was practically a monoculture, with MacOS the only holdout "

        There is something to be said for that though, at least in terms of the hardware architecture: compatibility. Back when it was the wild west, with not just various CPUs but entirely different and incompatible implementations, there was always the chance that what you bought would be the next casualty in the battle, leaving you with no support, no new software (unless you wrote it) and a big hole in the bank account. Remember, there were companies that seemed established, launching new products, often late, and then going bankrupt almost before the ink was dry on the cheque you wrote (well, maybe not quite that often, but certainly some went bust even before the warranty period was over :-)

        I'm primarily talking about SME business users here, but it affected the home market at least as much, if not more so; it's just that there was less of an issue if a home user couldn't get new games, as they could still play what they had (and probably that sort of support would tail off over 12 months or so). Business users would want at least 5 years of life (and support), since a computer was often still a capital purchase.

    2. Plest Silver badge

      I had an Amiga and an Atari ST, as I'd just started working my first tech job, was still living at home and had money to burn. I really liked the ST; it was a good little system. I got into the game-cracking scene and dabbled, as I'd done game protection removal on 8-bits, but I couldn't keep up as I had a full-time job. I even dabbled with MIDI and a Casio keyboard for a while. I do remember buying a whopping 10MB hard disk for my ST (about £400 at the time, which is about £1000 today!) and it came in an industrial-type metal container about 1ft square by about 3 inches deep, mains powered, with a huge data cable, and it weighed a flipping ton!

      I finally sold all my Amiga and Atari kit around 1992 and was seriously into PCs by then. We'd had a PC at home since 1987 and my dad told me the PC was the future, so that's where I concentrated my serious efforts, learning about databases with DataEase and dBase, and even sold a few apps to local companies as a sideline.

      Good times.

  7. Andy 73 Silver badge

    The QL.. and other failures

    In retrospect it's easy to point out singular decisions that led to machines being failures, but the reality was that at the time the picture was far muddier.

    There were lots of reasons the QL didn't make it, including Sinclair's obsession with low prices, the use of microdrives and the incredibly messy launch and production delays.. but in the context of the time none of these things were particularly unusual. Everyone was making a range of compromises in order to make computing affordable and available, before mass adoption made winners out of certain technologies and dramatically reduced their price.

    It's worth noting that the lack of GUI on the QL came in part from Sinclair being quite upset that his most successful product to date was seen as a toy to play games on. The QL quite deliberately had restricted graphics (technically fewer colours than the Spectrum) and terrible sound (there wasn't even direct enough control of the beeper to pull off some of the multi-channel tricks the Spectrum had begun to use). Sinclair wanted it to be a serious machine and made decisions that actively went against the grain of increasingly capable rivals that could present a GUI and the beginnings of digital media.

  8. Chris Gray 1
    Happy

    Amiga all the way

    Back in the 8-bit days I was drooling over the Apple II, for the simple reason that it had a proper keyboard. Typing class in school and computers at the University meant that there was no way I was going to be happy on chiclet keyboards. (Like the friggen horrors on many of today's slim laptops!) But I never could afford it.

    I eventually was able to buy an early-ish Amiga 1000. The store I bought it from was not able to get the Amiga monitors right away, so they loaned me a small green-screen monitor. I kept the system in my work office (can't recall why!), and one thing it ran a lot of was a Mandelbrot program written in Basic. It took a looong time to produce a screen (in green) with that, but we did.

    A friend and I started a software company, which became an official Amiga developer. He got his Amiga through that program. That company didn't stay on Amigas long, however. I eventually went through an A2000 (then an A2500), an A3000 and an A4000T. In fact I ordered the A4000T a couple of days *after* hearing that Commodore was going under - I was in the middle of all things Amiga (Draco, Empire(?)) and I had no interest in changing. Eventually I went straight to Red Hat Linux.

  9. Fading

    Always envious of my friends'

    Atari STs or Amiga machines. From having access to a wide selection of 8-bit machines, the 16-bit era in my childhood home was an Amstrad 1640 and my little brother's Megadrive. The console didn't interest me, so I mainly stuck with an aging Amstrad 6128 (writing school work on WordStar). I ended up taking the 1640 with me to university as my childhood home upgraded to 386s and later 486s (the latter being where I wrote autoexec menus to give different memory allocations depending on whether you were playing games or loading up Windows; something like the sketch after this paragraph). The 1640, though, served for many years, and from its lowly beginnings with a single 360KB 5.25in drive ended up with a second 3.5in 720KB drive, a 30MB hard drive, an AdLib sound card and a dual joystick card.
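
    For anyone who never had to do this: a minimal sketch of that sort of boot menu, assuming MS-DOS 6.x multi-config syntax (the menu names, drivers and paths here are invented for illustration). CONFIG.SYS offers the menu:

        [MENU]
        MENUITEM=GAMES, Games (maximum conventional memory)
        MENUITEM=WIN, Windows
        MENUDEFAULT=GAMES, 10

        [COMMON]
        DEVICE=C:\DOS\HIMEM.SYS
        DOS=HIGH,UMB

        [GAMES]
        REM No EMS page frame - every last KB for the game
        DEVICE=C:\DOS\EMM386.EXE NOEMS

        [WIN]
        REM Keep expanded memory available
        DEVICE=C:\DOS\EMM386.EXE RAM

    ...and AUTOEXEC.BAT then branches on the chosen block via the %CONFIG% variable:

        @ECHO OFF
        GOTO %CONFIG%
        :GAMES
        LH C:\DOS\MOUSE.COM
        GOTO END
        :WIN
        C:\WINDOWS\WIN.COM
        :END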

    It wasn't all rosy in 8086 land back in the late 80s, with Amstrad's Sinclair PC200/Amstrad PC20 failure (think beige-box PC in a C128-style case) - built to compete with the Amiga and ST, but it only had an 8086 with CGA graphics. I remember talking my dad out of buying one at the Earl's Court Personal Computer Show in late 1988.

    1. Nugry Horace

      Re: Always envious of my friends'

      The PC200 was definitely a misstep by Amstrad -- they reused the chipset from the PPC portable, meaning they were targeting the home gaming market with a chipset designed to run WordStar and Lotus 1-2-3 on an LCD panel. I think a successful PC200 would have needed, at the very least, MCGA graphics (even if only the 320x200x256 mode) and a sound chip.

  10. Anonymous Coward
    Anonymous Coward

    Yes...I Know...The Article Is About Failures....Mostly In The Late 1980's.....

    ....but I have to observe that Sun and Apollo and HP were shipping hugely powerful workstations...used by "quants" in the City to help make millions (billions?) of folding.

    And at the same time ICL were pushing PERQ (sourced in the US)....who knew????

    As an aside.....the Motorola 68000 was pretty spiffy....but no one remembers that either!!

  11. Anonymous Coward
    Anonymous Coward

    >"the canceled predecessor machine, the Apple IIX."

    So was that pronounced "Apple two ex" or "Apple eight"? (Maybe that's why it went away?)

    1. John Brown (no body) Silver badge
      Happy

      Well, two ecks obvs, because Roman 8 is VIII :-)

      Still upvoted though cos' made I smile :-)

    2. A____B

      Reminds me of an occasion back in the early 90s when an HR person interviewing a candidate was told about his extensive experience with UN9.

      The technical interviewer saw through the bull and wanted to reject the candidate there and then, but the HR lady kept saying "if he's a UN9 expert, perhaps he'd be useful -- what is UN9?".

      You may have guessed by now that UN9 was Unix - the bluffer had obviously seen one of the books on the shelf in the office.

      Still makes me smile (but then, I'm easily amused !!)

  12. Tim99 Silver badge

    PART 1: 8 bits; Part 2: Sweet 16 and making mistakes; etc?

    Assuming this is about PCs - maybe we're seeing an anthropomorphic trend here?

    • 32 BIT: Getting more done, in partnership, useful developments, spawning child processes?
    • 64 BIT: Everything you thought would ever be needed; but bloated, resulting in many layers of management, and still slow?
    • 128 BIT PCs: I wrote FORTRAN science stuff in 1970, ready for recycling?...

  13. The Central Scrutinizer

    Ah the Amiga

    I went from a C64 to an A500 and then an A4000 in fairly quick succession. It had a 120 meg hard drive and 4 meg of RAM. The power!

    The Video Toaster came bundled with LightWave, which was used to create the Babylon 5 models and various effects: the station, the jump gate, the Starfuries, etc. Babylon 5 inspired me to get into 3D graphics, and when NewTek started selling standalone LightWave, I bought my first copy of it and was hooked.

    I still have my March 1993 copy of Amazing Computing magazine where Ron Thornton and Paul Beigle-Bryant talk about setting up Foundation Imaging to create the B5 universe.

    They really were heady days for those of us getting into 3D graphics, but eventually of course, Commodore screwed it all up.

    I always get a bit misty eyed when I think back to those early days of 3D and how good the Amigas were.

    1. Anonymous Coward
      Anonymous Coward

      Re: Ah the Amiga

      Some really quite fantastic software was spooned out by magazines for pocket-money prices, as it was by the PD libraries of the time.

      AMOS, MED and Imagine were by far the most used programs on my A1500. Cobbled-together 3D renders were used to make sprites; dump them onto the blitter and you could make a homebrew shoot-em-up in a weekend.

      The fact that a 12-year-old of above-average ability could figure out how to do this on basically zero budget is definitely a big part of the compelling memories of how good that system was.

      Commodore's failures are well documented, and nowhere more so than in the criticisms from the head of Commodore UK. Strange how an outfit that by 1991 was making most of its sales in Europe was still being force-fed horseshit from the by-then clueless US corporate HQ: the HQ that thought developing an 8-bit successor to the 64 while it was ALREADY selling A500s was a good idea.

      Yeah, that.

      1. Michael Strorm Silver badge

        Re: Ah the Amiga

        Not that Commodore's management didn't deserve to be condemned on countless other counts, but as I mentioned elsewhere, Dave Haynie claimed that the C65 was essentially the pet project of one engineer no one else was interested in working with (i.e. not management-driven).

        Though I agree, regardless, that releasing an 8-bit machine (even a much-improved one) against the A500 wouldn't have made an ounce of commercial sense by that point; I've said much the same myself and defended its cancellation on that basis.

      2. CowHorseFrog Silver badge

        Re: Ah the Amiga

        Commodore failed because they took too long to improve the Amiga, and the updates were minimal. If there's one thing we always see in this industry, it's that everything doubles in power or performance very quickly; the Amiga barely moved for nearly 10 years, and by the end even the A600/A1200 were so far behind other machines it was sad.

        1. Anonymous Coward
          Anonymous Coward

          Re: Ah the Amiga

          As an A1500 owner at the time (basically a 1987 A2000 with two floppies and a meg of RAM), I found there was almost nothing significant that wouldn't run on it, even after the launch of AGA.

          The few forays into titles that HAD to have a 68040 were commercial disasters, for obvious reasons, when most of us still had a 68000 or at most a 68020. AGA didn't do that much more than OCS did, and it definitely did not address the by-then dated processor.

          A 486 DX2/66 with a 250MB hard drive could be had in the year of Commodore's demise for less than a well-kitted-out A4000. To say nothing of X-Wing, SC2000 and Doom. Which is, of course, where a lot of us went.

          1. CowHorseFrog Silver badge

            Re: Ah the Amiga

            Spot on.

            The A500 was a 7MHz machine and the A1200 was double the speed. But AGA screens need a few extra bitplanes, and bitplane DMA steals CPU cycles, so the final speed increase for the CPU was barely double in practice. AGA games were more colourful, but barely more impressive in performance terms, and performance is what 3D needed. To do 3D, the A1200 had to cut back on bitplanes so the CPU didn't have cycles stolen. The real problem, of course, was that the Amiga used bitplanes, which are great for 2D but terrible for 3D; that meant the Amiga was dead compared to the PC, whose blocky "chunky" graphics suited 3D far better (see the sketch below).
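
            A minimal sketch of that difference in C (illustrative only: the plane count, resolution and layouts are assumptions, not any particular chipset):

                #include <stdint.h>

                #define WIDTH  320
                #define HEIGHT 200
                #define PLANES 5                 /* 5 bitplanes = 32 colours */

                /* Chunky framebuffer (PC VGA mode 13h style): one byte per pixel. */
                static uint8_t chunky[WIDTH * HEIGHT];

                /* Planar framebuffer (Amiga style): one bit per pixel in each plane. */
                static uint8_t planar[PLANES][(WIDTH / 8) * HEIGHT];

                /* Chunky plot: a single byte write -- ideal for 3D fill loops. */
                static void plot_chunky(int x, int y, uint8_t colour) {
                    chunky[y * WIDTH + x] = colour;
                }

                /* Planar plot: one read-modify-write per bitplane -- five memory
                   round-trips to set a single pixel. */
                static void plot_planar(int x, int y, uint8_t colour) {
                    int offset = y * (WIDTH / 8) + x / 8;
                    uint8_t mask = (uint8_t)(0x80 >> (x % 8));
                    for (int p = 0; p < PLANES; p++) {
                        if (colour & (1 << p))
                            planar[p][offset] |= mask;
                        else
                            planar[p][offset] &= (uint8_t)~mask;
                    }
                }

                int main(void) {
                    plot_chunky(10, 10, 31);     /* one store */
                    plot_planar(10, 10, 31);     /* five read-modify-writes */
                    return 0;
                }

            Five dependent read-modify-writes versus one store, for every single pixel of every polygon: that is the whole 2D-versus-3D argument in miniature.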

  14. geoff61

    Atari UNIX

    Followed the link to the atariunix.com site and it brought back some happy memories of having a TT running SVR4 as my personal workstation for three years when I worked for UniSoft (who did the SVR4 port) in the 1990s. It had a fabulous 1280x1024 monochrome monitor, whereas the machine I had after that was a Dell PC (also running SVR4) which could only manage 800x600 (but with colour). SVR4 was installed on an 80MB hard drive and used almost all of it; I had a huge (in physical size) 100MB external SCSI drive for my files. Part of my job was to run the X/Open XPG3 test suite on it (an ancient ancestor of The Open Group UNIX test suite that is used today to certify AIX, macOS, etc.).

  15. RobDog

    ‘….not a viable business model’

    They could have been, if buyers had been braver than always buying IBM.

  16. David Hicklin Bronze badge

    Atari 800s

    I had an Atari 800XL as the successor to my ZX81, which is where my computing career started out. The ZX81 had lots of add-on modules, a keyboard and so on from the Maplin kits - happy days.

    I loved the 800XL as it had a proper keyboard, twin floppy drives and a serial link to the Brother thermal typewriter that I used as my printer, plus a basic "Office" program with a simple word processor and spreadsheet - all 32 x 20 cells of it!

    Machine code on the 6502 was lovely and simple after the Z80. I eventually sold the lot for an IBM Model 30 PC, just before the market crashed totally.

    Always wanted an Atari ST but could never quite afford one at the time, and by the time I could things had moved on.

  17. steelpillow Silver badge

    R is for RISC, C is for Communic ... err, no, wait...

    As I recall (so I am probably more wrong than Liam), Acorn's RISC interest grew from some theoretical work on RISC v CISC processor efficiency, done by a prof at their local uni, Cambridge. Perhaps this was what stimulated the "how do they do it, then?" visit Stateside.

    The old tale, again, is that Acorn found that unleashing the potential of the ARM would make the kit too expensive for the educational market, and they couldn't afford to do both, so they had to choose. But I never heard how the Communicator fits into (or blows away) that story.

    And on a point of order Mr. Vulture, Sir, the standard ARM architecture for many a year was 24-bit.

    1. ThomH

      Re: R is for RISC, C is for Communic ... err, no, wait...

      *cough* 26-bit.

      Only 24 bits were used for the program counter, but code had to be four-byte aligned, so byte reads and writes occurred within a 26-bit address space. You had to attempt to access a byte or word at or beyond 2^26 = the 64MB barrier to trigger the relevant address exception, such an address being inexpressible on the address bus.

      ... and I'm about the trillionth person to point it out, but the genius of the original ARM design, including ARM's original support chips, is the heavy use of DRAM page-mode addressing to get huge bandwidth relative to the underlying RAM. The CPU helped a lot here by indicating whether each access was sequential rather than random, and because it's a load/store RISC CPU, accesses are more likely to be sequential (a toy model of why that pays off is sketched below); by the time ARM went to 32-bit addressing it was already in the modern age of having an intermediary cache.
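
      A toy model of that in C, purely to show the shape of it (the page size and cycle costs are invented; real page-mode DRAM timings varied):

          #include <stdint.h>
          #include <stdio.h>

          /* Page-mode DRAM model: an access within the currently open row
             costs only a CAS cycle; switching rows costs RAS + CAS. */
          #define PAGE_BITS    10   /* assumed 1KB DRAM page */
          #define CAS_COST      1
          #define RAS_CAS_COST  3

          static unsigned cycles(const uint32_t *addr, int n) {
              unsigned total = 0;
              uint32_t open_row = 0xFFFFFFFFu;    /* no row open yet */
              for (int i = 0; i < n; i++) {
                  uint32_t row = addr[i] >> PAGE_BITS;
                  total += (row == open_row) ? CAS_COST : RAS_CAS_COST;
                  open_row = row;
              }
              return total;
          }

          int main(void) {
              uint32_t seq[8];                    /* a burst, LDM-style */
              uint32_t rnd[8] = {0, 4096, 8, 12288, 16, 20480, 24, 24576};
              for (int i = 0; i < 8; i++)
                  seq[i] = 4u * (uint32_t)i;
              printf("sequential: %u cycles\n", cycles(seq, 8));  /* 3 + 7x1 = 10 */
              printf("scattered:  %u cycles\n", cycles(rnd, 8));  /* 8x3 = 24 */
              return 0;
          }

      The same eight words cost 10 cycles fetched in order and 24 scattered across rows, which is why a CPU that can say "this access is sequential" gets RAM bandwidth well beyond what the raw cycle time suggests.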

      1. steelpillow Silver badge
        Windows

        Re: R is for RISC, C is for Communic ... err, no, wait...

        *cough* 26-bit.

        Memory's still not bad for a septuagenarian who makes this icon look cool, though.

        1. CowHorseFrog Silver badge

          Re: R is for RISC, C is for Communic ... err, no, wait...

          Correct: the combined PC/status register kept six flags at the top (N/Z/C/V plus the I and F interrupt masks) and two mode bits at the bottom, leaving a 26-bit address space for the PC. A sketch of the packing is below.
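
          A minimal sketch in C of that packing (bit positions as I remember ARM2's; treat it as illustrative):

              #include <stdint.h>
              #include <stdio.h>

              /* Decode an ARM2-era combined R15: flags and mode bits share
                 the register with the word-aligned program counter. */
              typedef struct {
                  int n, z, c, v;   /* condition flags, bits 31..28 */
                  int i, f;         /* IRQ/FIQ disable, bits 27..26 */
                  uint32_t pc;      /* word-aligned PC, bits 25..2  */
                  int mode;         /* processor mode, bits 1..0    */
              } R15Fields;

              static R15Fields decode_r15(uint32_t r15) {
                  R15Fields out;
                  out.n    = (int)((r15 >> 31) & 1u);
                  out.z    = (int)((r15 >> 30) & 1u);
                  out.c    = (int)((r15 >> 29) & 1u);
                  out.v    = (int)((r15 >> 28) & 1u);
                  out.i    = (int)((r15 >> 27) & 1u);
                  out.f    = (int)((r15 >> 26) & 1u);
                  out.pc   = r15 & 0x03FFFFFCu;   /* 24 bits of word address */
                  out.mode = (int)(r15 & 3u);
                  return out;
              }

              int main(void) {
                  R15Fields r = decode_r15(0x8C00002Au);
                  printf("N=%d I=%d F=%d pc=%#x mode=%d\n",
                         r.n, r.i, r.f, (unsigned)r.pc, r.mode);
                  return 0;
              }

          Everything (flags, mode and PC) travelled in one register, which is exactly why the usable address space stopped at 2^26.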

  18. Blackjack Silver badge

    Oh yeah, I lived this era... while still having a Master System; at least I also had a Game Boy.

  19. ecofeco Silver badge
    Windows

    Amiga, Amiga, Amiga *sigh*

    I remember the first time I saw the Video Toaster and Lightwave and Amiga's Workbench and Intuition UI. I said, yeah, this is the future!

    And then Commodore shot themselves in the foot, lit themselves on fire and drove into an oak tree at 100mph.

    Wankers.

    Me, crying in my beer. -------------------------------->>>

    1. The Central Scrutinizer

      Re: Amiga, Amiga, Amiga *sigh*

      The whole thing was a great computing experience. LightWave was incredible for its time and relatively affordable compared to the likes of Softimage et al.

      And ... Fred Fish disks.

  20. DissemblyCoder

    Apple II SWEET16

    In the 70s, I built some aircraft data-logging systems based on the Apple II, which collected ARINC and digital and analog data using a card bolted to the Apple's top cover. My exact reasons for using the Apple II are lost in the mists of time, but involved cost, availability, and just wanting to play with it. The base code was Apple BASIC, with drivers written mostly in assembly. I remember finding out about Steve Wozniak's SWEET16, which was just the thing for dealing with data buffers with minimal code. This nifty little virtual machine saved a LOT of space and time coding (for the unfamiliar, there's a sketch of the idea below). I might add that Steve was very helpful in resolving some of the hurdles in adapting the Apple II to an 'off-label' application.
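
    SWEET16, for anyone who hasn't met it, was a 16-bit virtual machine interpreted by 6502 code: sixteen 16-bit registers and very compact ops. A minimal sketch of the idea in C -- the opcodes and encoding here are invented for illustration, not Woz's actual byte format:

        #include <stdint.h>
        #include <stdio.h>

        /* Toy SWEET16-style interpreter: 16 16-bit registers over byte
           memory, with auto-incrementing indirect loads and stores --
           the ops that made buffer-shuffling code so small. */
        enum { OP_SET, OP_LD_IND, OP_ST_IND, OP_DCR, OP_BNZ, OP_RTN };

        static uint16_t reg[16];
        static uint8_t  mem[65536];

        static void run(const uint8_t *prog) {
            uint8_t ip = 0;
            for (;;) {
                uint8_t n;
                switch (prog[ip++]) {
                case OP_SET:                     /* Rn = 16-bit literal */
                    n = prog[ip++];
                    reg[n] = (uint16_t)(prog[ip] | (prog[ip + 1] << 8));
                    ip += 2;
                    break;
                case OP_LD_IND:                  /* R0 = mem[Rn]; Rn++ */
                    n = prog[ip++];
                    reg[0] = mem[reg[n]++];
                    break;
                case OP_ST_IND:                  /* mem[Rn] = R0; Rn++ */
                    n = prog[ip++];
                    mem[reg[n]++] = (uint8_t)reg[0];
                    break;
                case OP_DCR:                     /* Rn-- */
                    reg[prog[ip++]]--;
                    break;
                case OP_BNZ:                     /* if Rn != 0 goto target */
                    n = prog[ip++];
                    if (reg[n] != 0) ip = prog[ip];
                    else ip++;
                    break;
                case OP_RTN:                     /* back to "native" code */
                    return;
                }
            }
        }

        int main(void) {
            /* Copy 5 bytes from 0x1000 to 0x2000 in a four-op loop. */
            const uint8_t prog[] = {
                OP_SET, 1, 0x00, 0x10,    /* R1 = source pointer      */
                OP_SET, 2, 0x00, 0x20,    /* R2 = destination pointer */
                OP_SET, 3, 0x05, 0x00,    /* R3 = byte count          */
                OP_LD_IND, 1,             /* loop body (offset 12)    */
                OP_ST_IND, 2,
                OP_DCR, 3,
                OP_BNZ, 3, 12,
                OP_RTN,
            };
            for (int i = 0; i < 5; i++) mem[0x1000 + i] = (uint8_t)('A' + i);
            run(prog);
            printf("%.5s\n", (const char *)&mem[0x2000]);   /* prints ABCDE */
            return 0;
        }

    The real SWEET16 reportedly fit in a few hundred bytes of 6502 and ran at roughly a tenth of native speed; the trade was exactly the one above -- a four-op copy loop instead of pages of 8-bit pointer juggling.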

  21. philstubbington

    No mention of the Atari Transputer Workstation

    I got to play around with one of these - I worked for a Xerox-sponsored ITEC at the time; no idea how or why we got one, but it was an interesting bit of kit.
