The real significance of Apple's Macintosh

Apple launched the original 128 kB Macintosh around 40 years ago, and in so doing changed the computer industry in ways that a lot of people still don't fully understand. The original Macintosh was such a pivotal machine that, naturally, The Register has given it a lot of space. When it turned 20, we looked at how it was a …

  1. Sparkus

    If anyone wants to see what a Xerox 8010 looked like...

    There is a reasonable emulator available at:

    https://github.com/livingcomputermuseum/Darkstar

  2. 45RPM Silver badge

The original Macintosh was groundbreaking, just like Cugnot's fardier à vapeur or Karl von Drais's Draisine. And, like those, whilst they would become great they were also all complete rubbish to begin with. It took over 100 years for the fardier à vapeur to become a useful vehicle, and nearly 80 for the Draisine to become something approaching a modern bicycle. The Macintosh became great in only two years (original launch date to launch of the Mac Plus - the first really usable Mac).

The Mac wasn't necessarily inventive - the idea of a mouse-driven user interface was already over a decade old - but it was hugely innovative (as discussed in the article) and widely copied. Imagine Windows without title-bar widgets or pull-down menus. Imagine any modern GUI without those things. Networking wasn't a new idea either - but it wasn't common outside of academia or big business - and yet every Mac since day one has come with some kind of networking interface. Originally LocalTalk, then Ethernet, and now Wireless Ethernet (Wi-Fi). Multiple monitors, with a single desktop spanning them all. I don't know for certain, but that might genuinely have been a Mac first.

My uncle got his first Mac in 1985 - a Mac 512. I thought it was rubbish. What was the point of a computer which couldn't run CP/M or DOS applications? What a stupid idea! I thought that the Mac II graphics were beautiful (I enjoyed the review in PCW at the time), and I liked the industrial design - but the fundamental point about compatibility, my main objection, remained. My uncle, on the other hand, was delighted with his purchase - and could never see the point of colour. He upgraded through a Plus to an SE/30 (and then died before Apple cut off his future black and white upgrade path). And I still dug my heels in.

I was very happy with my DOS PC, a Dell 486. But then I needed to learn 68000 assembly, and any machine without a built-in hard drive didn't interest me. So I bought a cheap, second-hand Mac SE (which still works, and is sitting next to me as I type this - I've just finished replaying "3 in Three" on it, with occasional forays into Arkanoid, Gauntlet and Spectre). Suddenly I got it - software compatibility is a nice-to-have, but not essential as long as you have compatibility with your data, your work (which the Mac obviously does)! In fact, rigid adherence to software compatibility could prevent you from innovating. The 486, powerful though it was, really represented a backward step vs. the Mac - well, for me at least. And I've been Mac-first ever since, 32 years and counting.

    So let's raise a glass to the Mac. Whatever computer you use, if it runs an operating system written since about 1985, we all owe the Mac - and the team that created it - a debt of gratitude.

    1. Dan 55 Silver badge

      You could draw a line in the sand on the 24th of January 1985 but there were two other 16-bit machines which came out that year. You certainly couldn't see the Mac's release in January and then develop a new GUI for a new computer in about five months.

If the Mac had never launched but the Amiga and ST still had, would the rest of the industry (meaning PCs, of course) still have got Windows 2-5 years later? Of course it would; it was the direction the industry was going.

      1. 45RPM Silver badge

Of course - it very likely would. But would it have had drop-down menus? Widgets in title bars? Desktop spanning multiple monitors? Well maybe, but perhaps it would have had something else instead. Maybe even something better* - who knows? But that’s not the history we’ve got.

Remember, Android was in development concurrently with iPhone. iPhone was released first - and took such innovative steps as being all screen and ditching the keyboard. At the time, Android was following the status quo and being developed, like Windows CE, with a keyboard-first design. iPhone comes out, no hardware keyboard becomes ‘just the way things are done’ - the rest is history.

It’s probably easier to imagine the alternative history for the phone - because that was still less than twenty years ago. Remembering back 40 years is somewhat harder, but nascent GUIs did exist before Mac - and wow, they were hard work! It’s very hard to imagine a GUI now without Apple’s innovations.

        *if I could jump between realities, I’d bet that the ST wouldn’t have had a GUI and that the Amiga might have had a mechanism for pulling down the entire screen in a tile and revealing a table of options. In fact, I remember using a SunOS** based CAD package back in the day that worked something like that.

        **no, not Solaris.

        1. Liam Proven (Written by Reg staff) Silver badge

          [Author here]

          > Remember, Android was in development concurrently with iPhone. iPhone was released first

          Yes indeed.

          Android essentially started out as a Linux+Java rip-off of the Blackberry. The early screenshots show this well.

          https://betawiki.net/wiki/Android_htc-29386.0.9.0.0

          https://android.fandom.com/wiki/Android_1.0

          1. ldo

            Re: IPhone

And the iPhone looked suspiciously like the LG Prada, which was first shown publicly a few months before.

            1. Handy Plough

              Re: IPhone

Are you actually suggesting that Apple designed, developed and built the complex supply chain in a shade under 2 1/2 months? Pssst, your fandroid is showing...

              1. ldo

                Re: IPhone

Well, Google has been accused of being very quick in turning Android around to work more like the iPhone after seeing the latter. I wonder why a virtue should be made of Apple's slow lead time to market ...

            2. ianbetteridge

              Re: IPhone

              Ha! I reviewed the LG Prada. God, was that a terrible piece of junk. I can't remember if the screen actually was resistive, or whether it was such a terrible capacitive screen that it just behaved like a resistive one. It's probably in one of my boxes of awful old technology, somewhere.

        2. Boring Bob

          A friend of mine who was a developer for Sagem Mobile in France told me that they had developed a touch screen phone before the iPhone, but some high-up director insisted a telephone had to have a physical keyboard and dumped the project.

      2. Liam Proven (Written by Reg staff) Silver badge

        [Author here]

        > 24th of January 1985 but there were two other 16-bit machines which came out that year.

        Not completely sure of your point here.

        Mac: January 24, 1984

ST: June 1985 <- 17 months later

Amiga: July 1985 <- 18 months later

        I mean, you're right, you couldn't do it in 5 months. They didn't. It took a year and a half.

        No DR GEM would mean no ST.

        GEM launched February 1985. Just over a year to write a GUI for existing PC hardware based on one you've seen? Yes, that's doable. GEM was very small and simple. The GEM Desktop app shows off more or less all it can do.

ST TOS is a blend of GEM (ported from 8086 to 68000), plus CP/M-68K (also a port) mixed in with some of the DOS-compatible kernel of what would later become DR-DOS. And, remember, no CLI, so no external programs and commands: GEMDOS was _just_ the kernel.

        Get it roughly working and port it to a new platform while development continues? Sure, that's doable.

        1. James Turner

          Amiga had the famous boing ball demo at CES in January 1984. Their financial issues that resulted in being bought by Commodore might have delayed the launch until the next year, but they had a working GUI then so it wasn’t something written in the time since the Mac came out.

          1. 45RPM Silver badge

            They had a proof of concept of some of the technologies then, but not much more. They didn’t even have hardware - they were using a bunch of Sage machines to emulate some Amiga concept functionality. Which doesn’t, by the way, detract from the fact that the Amiga was a great computer. But financial issues were the least of the reasons that it didn’t launch at the same time as Macintosh - let alone before it.

            1. user555

              Um, the relevant part is " ... they had a working GUI then so it wasn’t something written in the time since the Mac came out."

              1. 45RPM Silver badge

                There’s no evidence for that that I can find. I’ve searched. I love an old OS, but I can’t find any evidence that anything much other than a breadboard, a stack of SAGE IVs and the boing demo was shown at CES 1984. Even if the GUI was demo’d, I doubt it was anything like the Amiga GUI that we’ve come to know and love.

                But, I don’t think they needed to show a GUI - if I’d seen boing at 1984’s CES my eyes would have fallen out of my head! Real 3D (even though it wasn’t really, but it looked it, and that’s good enough) animated in real time. Yes, the Mac had its own boing demo (Vanlandingham) but that was a pale shadow of the Amiga’s majestic boing!

          2. Liam Proven (Written by Reg staff) Silver badge

            [Author here]

            > but they had a working GUI then

            [[Citation needed]]

            Remember that the Amiga was designed as a games console.

            "Working graphics" are *not* the same thing as "a working GUI".

            The fact that it could run a graphics demo does not mean that it was ready. The fact that it was 18 months from launch pretty clearly indicates that it wasn't.

            AIUI it was a CBM decision to bolt a desktop onto this console they'd bought and turn it into a general-purpose computer.

        2. Dan 55 Silver badge
          Facepalm

          Ah crap, I misread 1984 as 1985.

Well... this is what a pre-release Workbench looked like in 1985; that could probably have been done in a year.

I still believe it was the direction the industry was going, simply because there had to be a reason for people to buy more powerful, more expensive computers.

        3. Shred

Yes - and don’t forget that some developers were given access to prototype Mac development systems (on Lisa hardware) before the launch of the original Macintosh. Microsoft was one such developer, hence Windows being “Copyright 1983…”. Even back then, Apple knew better than to launch a new machine with no third party applications in the pipeline.

  3. Anonymous Coward
    Anonymous Coward

    Old Mac photo: MOUSE and GUI matter!

Linux Wayland mouse lag has not been fixed for years. Also, the file copy/paste UI has been inconsistent across Linux desktops.

    Then everyone wonders why oh why Windows still dominates the market. This IS why. Or maybe because Linux developers are command line gurus, who do not use the mouse at all.

    1. Gene Cash Silver badge

      Re: Old Mac photo: MOUSE and GUI matter!

      > Or maybe because Linux developers are command line gurus, who do not use the mouse at all.

      Bingo, we have a winner. I spend at least 50% of my time in xterm or emacs. My window system doesn't even have widgets, it's all function key driven. F3 to make a window full-height, for example.

      Then you have Windows. That consistently ignores Ctrl-C to copy. If I had a dime for every time Windows pasted an old clipboard item, I'd have retired long ago.

      1. Anonymous Coward
        Anonymous Coward

        > consistently ignores Ctrl-C

        My old keyboard Ctrl-C keys became shiny from the frequent usage, but I never experienced the issue in Windows or Linux. NOT A SINGLE TIME

        1. Michael Wojcik Silver badge

          Re: > consistently ignores Ctrl-C

          Apparently you don't use Windows "Command Prompt" windows, then, since they don't use ^C for copy. To name just one example.

    2. ianbetteridge

      Re: Old Mac photo: MOUSE and GUI matter!

      Betteridge's Second Law: In the comments of any article written by Liam, someone will pop up to complain about Wayland.

  4. karlkarl Silver badge

    Behind that interface is an even cooler thing. A really portable approach to dealing with memory safety in C. Their pointer tombstone approach was quite inspired for the time.

    (that said, it likely violated strict aliasing rules).

    1. Liam Proven (Written by Reg staff) Silver badge

      [Author here]

      > A really portable approach to dealing with memory safety in C.

      Can you explain what you're thinking of here? Classic MacOS wasn't written in C.

      It's a mixture of Pascal and 68000 assembly code. I mean, yes, a valid approach to dealing with the problems of C is "don't use C at all" and I wish more people took that line, but still...

      1. karlkarl Silver badge

Sure, it's relating to this:

        https://en.wikipedia.org/wiki/Classic_Mac_OS_memory_management

        "To solve this, Apple engineers used the concept of a relocatable handle, a reference to memory which allowed the actual data referred to be moved without invalidating the handle. Apple's scheme was simple – a handle was simply a pointer into a (non-relocatable) table of further pointers, which in turn pointed to the data."

        (You are right though, this doesn't *have* to be C but there was an API that provided this at the time. It is a technique that can be used by most languages utilizing pointers)

        Basically, pointers went "through" a validation table containing "validity" records (formerly called "tombstones") to check memory errors.

        Think along the lines of struct Employee ** pointing to a struct Allocation * and thus passing through to the first member (void *).

struct Allocation {
    void *data;
    size_t size;
    int valid;
};

        I attempted to implement my own standalone approach here:

        https://www.thamessoftware.co.uk/forge?repo=stent

        It has been moderately successful with a websocket proxy / web server I wrote using it here:

        https://www.thamessoftware.co.uk/forge?repo=wsstent

        But... can't seem to find a way to comply with strict aliasing rules.
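
For anyone who wants the shape of it in one screen, here is a minimal, self-contained C sketch of that tombstone pattern - not Apple's Memory Manager API and not stent, just double indirection plus a validity flag, with all names invented for illustration:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* The "tombstone": every access goes through this record, so the
   allocator can move or free the data and mark the record invalid. */
struct Allocation {
    void *data;
    size_t size;
    int valid;
};

/* Allocate a block plus its tombstone; callers keep only the handle. */
static struct Allocation *alloc_handle(size_t size) {
    struct Allocation *a = malloc(sizeof *a);
    a->data = malloc(size);
    a->size = size;
    a->valid = 1;
    return a;
}

/* Checked dereference: a stale handle yields NULL, never a dangling pointer. */
static void *deref(struct Allocation *a) {
    return a->valid ? a->data : NULL;
}

/* Free the data but keep the tombstone, so stale handles are caught. */
static void free_handle(struct Allocation *a) {
    free(a->data);
    a->data = NULL;
    a->valid = 0;
}

int main(void) {
    struct Allocation *h = alloc_handle(16);
    strcpy(deref(h), "hello");
    printf("%s\n", (char *)deref(h));
    free_handle(h);
    printf("deref after free: %p\n", deref(h)); /* NULL, caught */
    return 0;
}

Keeping the tombstone alive after the data is freed is what makes stale handles detectable; the cost is exactly the extra indirection the Wikipedia quote above describes.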

        1. Anonymous Coward
          Anonymous Coward

          In the early versions of MacOS it just blew up..

          Pass a Handle to a trap expecting a Pointer arg or a Pointer to a trap expecting a Handle arg and you would find yourself in Macsbug very quickly. Or a Bomb Box with a DS Error number if Macsbug not installed.

Things got a little better when the various flags were moved from the top 8 bits of the address. 32 bit clean. But passing the wrong type as an arg for traps was a common bug in code for quite a while. As was not locking or unlocking handles, and then getting a heap compaction in the middle of your code. Those were fun bugs to track down. In other people's code.

          1. karlkarl Silver badge

            Re: In the early versions of MacOS it just blew up..

Indeed. Though I would have thought that in a "debug runtime" build, expensive checks could have been done to alert the programmer to this error. That wasn't done, and I am slightly surprised it wasn't.

            Same with C++ actually, in debug builds, similar to Microsoft's debug iterators we could:

            - Lock operator-> to prevent use-after-free of "this"

            - Lock operator[] to prevent invalidation of elements during access

            - Return some lock passthrough on operator T& to prevent use after free during use as a parameter reference

            Yes, it would run slowly (possibly slower than ASan) but in this kind of debug build who cares? It would be a really good compromise.
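
In C terms, the kind of debug-runtime check meant here could look something like this sketch - purely hypothetical, nothing the classic Toolbox ever shipped - where the check costs cycles in a debug build and compiles away in release:

#include <assert.h>
#include <stdio.h>
#include <stdlib.h>

struct Allocation { void *data; size_t size; int valid; };

#ifdef DEBUG_RUNTIME
/* Debug build: every dereference pays for a validity check. */
#define DEREF(a) (assert((a) != NULL && (a)->valid), (a)->data)
#else
/* Release build: a plain pointer read, zero overhead. */
#define DEREF(a) ((a)->data)
#endif

int main(void) {
    struct Allocation a = { malloc(8), 8, 1 };
    puts(DEREF(&a) ? "valid" : "?");
    a.valid = 0;          /* simulate a freed block */
    (void)DEREF(&a);      /* aborts under -DDEBUG_RUNTIME, silent otherwise */
    return 0;
}

Build with -DDEBUG_RUNTIME and every DEREF() traps on a stale handle; build without it and you pay nothing.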

            1. Anonymous Coward
              Anonymous Coward

              Re: In the early versions of MacOS it just blew up...counting clocks cycles

Back in 1984 we wrote a lot in 68K asm. And I could tell you the clock count for every instruction and addressing mode. And you never ever MUL'ed or DIV'ed. You made sure you could bit-shift instead: /(2^N) or *(2^N). And I could tell you how many 68K instructions were generated by a particular expression in the C compiler. The first MacOS product used a C cross-compiler that could target 68K.

              So no. We did not waste instructions on anything but the most basic error checking. So pass in garbage as a trap arg and you usually got an address error pretty quickly. Within 10 or 15 instructions. After the ATrap interrupt wrapper code.

Just looking at the original 64K ROM Memory Manager trap memory map: the whole trap code was about 3K. Plus there was about another 2K or so of support subroutines for heap management and master pointer blocks etc.

; ==============================================================================
; MEMORY MANAGER    start address $402AB0    length 2558
; ==============================================================================
$402AE6 _SetAppBase  $A057
$402B4C _InitZone    $A019  PATCHED! 2698
...... [another two dozen traps in here]
$403256 _MoreMasters $A036  PATCHED! 2578
$4033FA _BlockMove   $A02E  PATCHED! 24EC

The 64K ROM started at $400000 after the bootstrap remapping. So Memory Manager traps were $402AE6 through $4033FA, with MM support code filling up the rest of the address space until the next managers in ROM: the Sound Manager and OS Event Manager.

It's amazing what you learn when you spend a few thousand hours of your time in MacsBug.

              (most of the patches in the 64K ROM were crosspatches for other traps)

          2. ldo

            Re: As was not locking or unlocking handles.

Bearing in mind the handle you were locking might already have been locked by some outer routine. So the proper sequence was:

save_state := HGetState(hdl);
HLock(hdl);
... safe to access block at hdl^ here ...
HSetState(hdl, save_state)

Perhaps you might want to do “HLockHi” (move block as far up as it can go before locking) instead of “HLock”, to try to minimize application heap fragmentation.

            Later under MultiFinder, I discovered that the system heap was resizable. So I would often do large or variable-size allocations in that. Because the alternative was the tiresome process of telling users to quit your app, adjust the application’s heap size, and launch it again. The drawback was that you had to remember to explicitly free such allocations before your application quit. Otherwise that memory would stay leaked until you rebooted.

          3. Liam Proven (Written by Reg staff) Silver badge

            Re: In the early versions of MacOS it just blew up..

            > Things got a little better when the various flags were moved from the top 8 bits of the address.

            I must admit that before these comments and digging into the memory management a bit, I didn't realise that Apple used the same approach -- keeping flags in the top 8 bits -- that Acorn used about 3 years later in Arthur, the OS that grew into RISC OS.

            If I'd known, I would have sneaked this in as another influence!

      2. Anonymous Coward
        Anonymous Coward

        He's referring to Handles

There were two sorts of blocks in the Memory Manager: non-relocatable blocks and relocatable blocks. Pointers and Handles. Unless you had a very good reason you always called NewHandle() rather than NewPtr(), as the heap manager could move Handle blocks around to optimize memory usage. If you passed your own memory ptr to NewWindow() etc it was always a Pointer block.

        One of the classic bugs made by less experienced programmers was to forget to HLock() handles before you derefed them to their current memory location in the heap. And HUnlock() when finished.

One other minor quibble. There were two layers to QuickDraw. The high level traps DrawLine(), DrawOval() etc. And the low level Std call traps: StdLine(), StdRect() etc. The high level trap stuff was 68K. The Std calls were just hand optimized PASCAL compiler output of Bill's revised Lisa code. Just look for LINK A6 instructions to find the PASCAL code in the ROM. That was for the PASCAL local var frame.
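
To make that handle-locking bug concrete, here is a toy C sketch (invented names, nothing to do with the real Toolbox) of why holding a dereferenced pointer across a heap compaction went wrong:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Toy relocatable heap: one master pointer the "heap" is free to move. */
static char *master;
static int locked; /* the HLock()-style flag; never set in this sketch */

static char **new_handle(const char *s) {
    master = strdup(s);
    return &master; /* the handle: a pointer to the master pointer */
}

/* Simulate a heap compaction: unlocked blocks get relocated. */
static void compact_heap(void) {
    if (!locked) {
        char *moved = strdup(master);
        free(master);
        master = moved; /* the handle still works; saved raw pointers don't */
    }
}

int main(void) {
    char **h = new_handle("hello");
    char *raw = *h;  /* the classic bug: deref without locking, keep pointer */
    compact_heap();  /* block moves behind our back */
    printf("via handle: %s\n", *h);  /* fine */
    printf("via raw: %s\n", raw);    /* use-after-free: crash or garbage */
    return 0;
}

The HGetState/HLock/HSetState sequence posted above is exactly what pins the block so a saved pointer stays valid.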

    2. Gene Cash Silver badge

      > pointer tombstone

      Citation please? This is the first I've heard of such a thing, and Google just returns a couple pages trying to explain it that are barely written in English.

      1. karlkarl Silver badge

        I had a few very good fundamental ones I used as part of my thesis. I will try to dig them out. However for a quick immediate example, this article explains it quite well:

        https://oberoidearsh.medium.com/dangling-pointers-tombstones-and-lock-and-keys-f6bd0791810f

        This is quite good too, following this reference leads down a more academic path / approach.

        https://books.google.com/books?id=To3xpkvkPvMC&pg=PA392

    3. ldo

      Pointers To Pointers

The relocatable “handles” concept was actually proof that the OS was not written in C. Because whereas in Pascal you could easily do double-dereferencing:

      theobj^^.thefield

      In C, this became a bit of a pain to write:

      (**theobj).thefield

      In the later colour Macintoshes, I once wrote this expression in some Pascal code:

      theGDevice^^.gdPMap^^.pmTable^^.ctSeed

      Imagine doing that in C ...

      1. Anonymous Coward
        Anonymous Coward

        Re: Pointers To Pointers...well in MPW C 1.0 in 1987 you could..

        .. do this...

        (*(*(*theGDevice)->gdPMap)->pmTable)->ctSeed

        ..and we did. And in Think C too.

        Plus no one ever wrote in dev code I ever saw....

        (**theobj).thefield

        They always wrote..

        (*theobj)->thefield

        At least all the guys on my teams did. And any other MacOS C code I saw in the next 25 years. Which was an awful lot.

The PASCAL format for the API for MacOS was inherited from the Lisa. About 1/4 of the first loose leaf version of Inside Mac in early 1984 was very Lisa. But once MPW shipped, PASCAL was totally dead for MacOS dev. Not even MacApp could save it. It wasn't helped by the fact that MPW 1.0 shipped with the best C compiler for the 68K shipped by anyone, from Green Hills. You had to work very hard to write hand tuned 68K asm that was tighter than the GH C final 68K code-gen.

I'll say one thing for PASCAL. Passing args on the stack and getting the result back on the stack was a lot cleaner than the various C ways of doing it at the time. Which were mostly reg based. Except when they weren't.

It will always be MOVE.L D0,-(SP) or MOVE.L (SP)+,D0 for me. It's a 68K thing.

        1. ldo

          Re: Pointers To Pointers...well in MPW C 1.0 in 1987 you could..

          Interesting, isn’t it: my Pascal version

          theGDevice^^.gdPMap^^.pmTable^^.ctSeed

          is 38 characters, while your C version

          (*(*(*theGDevice)->gdPMap)->pmTable)->ctSeed

          is 44 characters. C is supposed to have this reputation for being a terse, even cryptic language, yet here it is positively verbose.

          1. Anonymous Coward
            Anonymous Coward

            Re: Pointers To Pointers...well in 1987 you could..huh? Bloody stupid

The PASCAL version is a few chars shorter than the C code? Using field names that were in the Apple supplied header file?

            Almost the dumbest comment I have ever read in TheReg. In several decades.

So I take it you have absolutely no idea how the PASCAL and C compilers worked, and what the executable output looked like. In 1987, using the MPW PASCAL and C compilers, my "verbose" C code would have generated half as many 68K instructions as your "concise" PASCAL code. Because the PASCAL compiler did serious register thrashing, no reg coloring, and every damn routine had MOVEM's and LINK/UNLK wrappers. For simple getters / setters most of the final code was dead wrapper code.

So lots of:

MOVE.L field0(A0),A1
MOVE.L A1,A0
MOVE.L field1(A0),A1

Instead of:

MOVE.L field0(A0),A0
MOVE.L field1(A0),A0

..plus the PASCAL code now had an extra MOVEM to save / restore A1.

In 1987 the easiest way of making MacOS PASCAL code execute twice as fast (usually faster) and shrink the executable size by at least 50% (often a lot more) was a simple mechanical transition into C. Did it several times. That's how bad PASCAL was back then. And why it died and C took over.

            C was never terse. When well written. Now APL, that was terse. By design.

            1. ldo

              Re: Pointers To Pointers

Interesting how you’re now changing the argument to complain about the code quality of Apple’s compilers.

              Note that there were other compilers available, not just Apple’s ones.

  5. Marty McFly Silver badge
    Pint

    Read the stories

    Head over to https://folklore.org/

    Be prepared for a lot of first-hand history reports on the genesis of Macintosh. Someone had the foresight to record the stories before they were lost to the ages. Good stuff.

    1. W.S.Gosset
      Thumb Up

      Re: Read the stories

Andy Hertzfeld, IIRC.

      Somewhere in there is the story of Burrell Smith swapping graphics cards on a _running_ machine, timing it so perfectly based on his knowledge of the code+hardware that the machine didn't crash, just kept on running: oblivious. The man was a god. Very unfairly treated.

    2. Liam Proven (Written by Reg staff) Silver badge

      Re: Read the stories

      [Author here]

      > Head over to https://folklore.org/

      I did link to a few folklore.org stories in the article itself...

  6. J. R. Hartley

    Apple reality distortion field

    Good to see the ARDF is stronger than ever. The Mac, and Apple in general, were a bit of a joke. £8k for a computer that could only do black & white. Lol.

    If only Commodore didn't fuck themselves up...

    1. Anonymous Coward
      Anonymous Coward

      Re: Apple reality distortion field...Amiga Alert..Amiga Alert

£8k? Really. Which orifice did you pull that number out of?

      In the summer of 1984 you could walk into a computer shop on Baker St (just up from the tube station) and for just under £2k walk out the door with a shiny new 128K Mac. About the same price as one of the early IBM PC clones. And much cheaper than a DEC Rainbow.

So I'd guess you must be a member of the Cult of Amiga. Which even in 1985, soon after its launch, was already garnering a reputation as a coven of anorak wearing Nigels. Even in the US. I remember a few Amiga stands at MacWorlds in San Francisco around that time and the people on the stands had that weird EST/Scientology cult vibe thing going. There again it was California and it was the 1980s. So who knows. It was that kind of decade. Like the time EST and Modula-2 destroyed the leading word-processor of the time.

      But that's another story.

      1. J. R. Hartley

        Re: Apple reality distortion field...Amiga Alert..Amiga Alert

        There's this thing called 'inflation'. You might want to Google it and get back to me.

        1. ianbetteridge

          Re: Apple reality distortion field...Amiga Alert..Amiga Alert

          It might shock you to discover that this applies to every other computer, too. You should see how much an IBM 5150 costs in today's money… and that didn't even include a monitor. “What a joke”.

  7. Jason Hindle

Before OSX, the Mac didn't appeal to me

After OSX, the idea of an appliance that “just worked” started to become a bit of a joke. The spiritual successor to the Macs of old? I'd argue it's the Chromebook.

    1. Crypto Monad

Re: Before OSX, the Mac didn't appeal to me

System 6 was the peak of simplicity and reliability. System 7 onwards were as crashy as hell. OSX, with its proper kernel underpinnings, was the start of a return to stability.

      1. Jason Hindle

Re: Before OSX, the Mac didn't appeal to me

And also vastly more complex. It's perfect for an engineering workstation and should mostly work for non-technical users who don't know what the terminal is, but the extent to which it just works any more than Linux or Windows is open to debate.

        1. ianbetteridge

Re: Before OSX, the Mac didn't appeal to me

          That, though, is down to both Linux and Windows becoming a lot better and more reliable over time. We're in the lucky era of computing where it's actually got a lot harder to terminally mess up your OS, with kernels and low-level stuff that is incredibly reliable compared to what we had to use even twenty years ago.

I'd argue that in all three cases, though, we're subject to “it works... until it doesn't”. And when it doesn't, fixing it is incredibly hard. Less so with Linux, where at least almost every “going wrong” scenario is a well-trodden path where a bunch of nerds online have almost certainly written a solution that's a Google search away. But Windows and Mac? Boy, when they go wrong, they really go wrong.

          Case in point: last year, I managed to mangle an install of MacOS (or rather, macOS TM) so badly that it needed reinstalling from scratch. There's now no way to do this by simply downloading an ISO and reinstalling: you have to use Apple's recovery tools. Which of course did not work.

          After a few weeks of messing around, including getting help from Howard Oakley (former MacUser technical bod, help expert and the man who probably knows more about the internals of Macs than anyone outside Cupertino) I had to admit defeat and send it off to Apple for them to do it.

      2. Liam Proven (Written by Reg staff) Silver badge

Re: Before OSX, the Mac didn't appeal to me

        > System 6 was the peak of simplicity and reliability. System 7 onwards were as crashy as hell.

        You have a point -- they were not stable OSes. The old joke was that if you owned a Mac, at least something in your life would go down on you every single day.

But I spent a few days, some 15Y ago or so, getting my Classic II dual-booting both System 6 and System 7.

6 being 6.0.8, and 7 meaning 7.6.1, the latest that machine can run -- and it needs RAM Doubler to do it, as the machine maxes out at 10MB and 7.6 wants 12MB.

        Sure, a bare install of System 6 was lighter, but once you added a bunch of the essentials, such as Windowshade, After Dark, a menu bar clock, a hierarchical Control Panel add-on, MacTCP and a network stack, etc. etc. etc. in order to get to the basic level of functionality we all took for granted in the System 7 era, System 6 was no longer lightweight and fast. Actually, there wasn't much difference, and System 7 still did more, multitasked better -- not well, but better than Multifinder -- and it networked better too. It was barely able to access the web: the top left corner of Slashdot.org took about 25min to render, and that was over SCSI Ethernet and ADSL broadband. But it could do it, and you could get files onto it and off it again much more easily.

        My System 6 nostalgia didn't last long. System 7 was in its way the Windows 95 to System 6's Windows for Workgroups 3.11: yes, sure, the older OS was significantly quicker, but the new one did more and had a better UI and overall it was worth the (really quite modest) level of bloat.

        1. ianbetteridge

Re: Before OSX, the Mac didn't appeal to me

          I was running Blue (what became System 7) on my Mac Plus in 1990, as I had access to the first widespread internal release (I was working in IS&T at Apple at the time). Even though that version was sloooow as hell, it was better than System 6 in almost every way.

  8. Doctor Syntax Silver badge

Reading some of the linked stories, it's amazing that the Mac survived Jobs' interference. No wonder it took them a couple of revisions to get it usable.

  9. ldo

    Monster Mac

    Even before the Mac Plus, people found ways to boost the spec of a basic Mac. The “Monster Mac” upgrade of 1985 boosted the RAM to 2MiB. It also added little fangs to the Macintosh icon that appeared at boot time, just to show it was installed.

    Some company even found a way to fit a hard drive inside the case. So the machine that was designed, on the orders of Steve Jobs, not to be upgradeable, in fact could be upgraded in many creative ways.

    1. PickledAardvark

      Re: Monster Mac

Big screens for tiny Macs were a third party option before the Mac SE, which was designed to support them. The Monster Mac and similar RAM upgrades also offered a 68020 processor.

  10. sarusa Silver badge
    Unhappy

    It changed everything but changed one thing in a bad way...

    The Mac definitely validated the idea of mouse and windows, and WYSIWYG (What You See Is What You Get). They'd all been seen before, of course, but the Mac was the one that nailed and popularized it.

    The one terrible thing it did, however, was make black text on white background the standard. Before that time we did white on black (or orange on black, or green on black, etc) because it was easier on the eyes.

And if you have a nice soft monochrome display like the original Mac did, then black on white is no problem, like it's not a problem with printed books. You can stare at that all day. However the entire rest of the industry (including Macintoshes once they introduced the color Macs) were on color CRTs and then LCDs, LEDs, etc. And on a bright emissive screen, black on white is far harder on the eyes than white on black (or yellow on dark blue, your dark theme of choice). In the daytime it doesn't matter so much, but at night the bright white backgrounds just stab your eyes with many many fine tiny knives.

And it took 30 years of this crap before people finally started making dark mode a thing again (around 2015), just because 'Apple did it so it must be cool' - even though Apple had long forgotten why they did it (the screen technology matters) and Microsoft never even knew why in the first place, just blindly copied it.

    1. ldo

      Re: black text on white background the standard

      No it didn’t. That was already recognized as the right way to do things. That’s why we don’t print white text on black paper, for example.

      1. VicMortimer Silver badge
        Flame

        Re: black text on white background the standard

        Exactly. The screen technology had nothing to do with it.

        I absolutely despise 'dark mode' and will turn it off if I'm having to touch somebody else's computer or phone. It's awful to read white (or worse, green) text on a black background, and I refuse to go back to 1982.

      2. sarusa Silver badge

        Re: black text on white background the standard

        Like I said in the original comment, that is only if you have no idea about the difference between emissive and reflective mediums.

        But hey, there's at least 6 kneejerk AppleBros who can't read and think that black on white on a CRT or OLED is the same as black on white on a page of paper and angrily hit downvote, so you're not alone there.

        1. sarusa Silver badge

          Re: black text on white background the standard

          But I'm sure I'm mistaken and this is genius up there with the puck mouse, my apologies.

    2. Michael Wojcik Silver badge

      Re: It changed everything but changed one thing in a bad way...

> The one terrible thing it did, however

      Huh. I had already assigned WIMP UIs and WYSIWYG to the "terrible things" category.

  11. jake Silver badge

    History.

    "An example of the machine's versatility is that it could run other OSes, including Xenix, Microsoft's early Unix."

    Not Microsoft's. SCO's.

    SCO[0] was the company that ported AT&T's bog-stock PDP11 UNIX Version 7 source over to the Lisa. I should know, that's what runs on mine.

Microsoft was just the middle-man ... AT&T wasn't allowed to sell UNIX in the commercial market for anti-trust reasons. Bill Gates saw a financial opportunity, and convinced Ma Bell to allow Microsoft to become a re-seller of AT&T's source, but it was other companies that did the porting. The Xenix name came about because $TELCO's lawyers decided it would be prudent to jealously guard the UNIX trademark. About all the "Xenix" coding that Microsoft ever did was to add a Microsoft trademark line to a few header files, near as I can tell.

    [0] The real SCO, not the latter-day zombie SCO of insane litigation fame.

  12. jake Silver badge

    Myths and reality distortion fields.

    "Everybody" knows that Bill Gates said something about 640K being enough ... unfortunately, he never said that. It's a myth.

However, I personally remember Steve Jobs saying that "128K should be enough for home users!" at a meeting of the Homebrew Computer Club in late 1983, as he was demonstrating the original 128K Mac, just before the public unveiling. At the time, he had a point ... people were running flight simulators in 64K![0]

    [0] On bare metal, no OS ...

    1. Anonymous Coward
      Anonymous Coward

      Re: Myths and reality distortion fields.

Gates denies saying it now, but I remember, some years ago, somebody who'd heard him say it saying that Gates was lying.

First I'd heard of the Jobs quote, but I'd believe that too. Burrell had to sneak in the 512K support circuits, making it fairly easy to upgrade a 128K Mac.

      https://folklore.org/Diagnostic_Port.html

      1. jake Silver badge

        Re: Myths and reality distortion fields.

        https://quoteinvestigator.com/2011/09/08/640k-enough/

        tl;dr version: There is absolutely zero proof that Gates said it, just stories from a friend of a friend and other forms of rumo(u)r. Note that the 640K limit was a hardware limit, and already set in stone by IBM by the time Gates heard about the project. Microsoft was working with the computer they had been given, and had no input as to memory limitations.

        The Jobs quote I heard with my own ears. Ask anyone who was at that meeting of the Homebrew Computer Club. No, I am not going to out myself in this forum, but many of the folks there that evening are still alive. (My lizard hind-brain suggests that it was Todd Fischer who called Jobs out on the lack of upgradability.)

    2. ldo

      Re: Myths and reality distortion fields.

      One thing they obviously still believe, is “26 drive letters ought to be enough for anybody!”

  13. PickledAardvark

Hierarchical File System

    Liam Proven: "These [Mac Plus] things, in turn, required a larger, 128 kB ROM, both for device drivers and the new Hierarchical File System."

    Not entirely correct. HFS had been introduced with the Hard Disk 20 for the Mac 512K, requiring an INIT (system patch) on the boot floppy. HFS worked on a Mac 512K for the HD20, 800K floppy drives and some third party hard disks (Corvus?).

    1. Liam Proven (Written by Reg staff) Silver badge

Re: Hierarchical File System

      Fair enough. But I think the reason for putting it into the firmware was to enable booting from it, you see...

      1. ldo

Re: Hierarchical File System

        As I recall, the “HD20 Startup” file (together with System 3.x) was sufficient to allow a 1984/1985-vintage Mac (with the old 64K ROM) to boot off an HFS volume. I remember seeing that message “HD20 Startup” appearing during boot.

      2. Somone Unimportant

Re: Hierarchical File System

        Yep - that's my recollection.

I started 1985 with a Mac 512k that had been brought home for me to use for my final year of school, allowing me to type up essays in MacWrite (disk version) before printing them out painfully slowly on an ImageWriter.

        Then an HD20 arrived at the end of 1985.

        I remember hooking it to the floppy port, starting the Mac, popping in the HD20 boot floppy, then the boot disk ejected and HD20 took over.

        It was like all my Christmasses had come at once.

        I don't have that Mac anymore, but I still have an SE/30 that works nicely.

  14. John Brown (no body) Silver badge
    Trollface

    20, 30, 35, 40th anniversaries

    "When it turned 20, we looked at how it was a mold-breaking computer. When it was 30, we devoted two separate installments to key points in the machine's history; and at 35"

You'd think a techie/IT rag would be celebrating nice round binary or hexadecimal number anniversaries rather than boring divisible-by-5-or-10 decimal ones!

    Admittedly, the downside is that they become ever further apart and many of us might be dead before the next "round" number anniversary :-)
