
Architecture and Morality.....anyone?
How come the nice architecture of the Motorola MC68000 has been consigned to history... while the ugly 8086 (and its descendants) have thrived?
Apple has given its blessing to the release of source code for its first mouse-powered marvel – the Lisa – to mark its 40th birthday. The Computer History Museum scored the code and released it here along with a written celebration of the machine. As The Register wrote when lighting candles for Lisa's 30th birthday and, oddly …
The IBM PC. And WinTel.
Which only happened because the IBM PC dev team's hardware lead reused another project's motherboard design to save time. Which used an x86 processor. No other reason.
Plus the original design was to use the 68K, but Moto said they could not deliver in quantity in the time-frame the IBM PC dev project needed to hit the release date, whereas Intel said - no problem. Of course Intel could not deliver in the quantities required either. But hey, that's Intel for you.
The 68K architecture lives on, as Freescale ColdFire / DragonBall etc. for embedded. Still going strong.
Still the best processor architecture (with ARM as an honorable runner-up). When people ask me how they should learn assembly programming I tell them - learn 68k first, and then learn x86. Because you learn how it's done right on the 68k, and then you can apply that to try to force x86 to be not totally stupid. Often impossible. But if you learn x86 first you just learn nothing but bad habits and really, really stupid stuff. If they don't need to ship anything, I tell them ARM is a good way to go. ARM is quite nice too.
That's a great question. Back in the day (late 70s & early 80s) I wrote assembly code on the Zilog Z80 microprocessor. Life was good, but very limited. Zilog had announced the Z8000 and that looked promising, especially with the amount of memory available and how it was addressed. Zilog's approach was very much like Motorola's and I figured either one would come out on top. I never expected Intel to surface & dominate.
The company I worked for was stuck on IBM ("nobody has ever been fired for buying IBM gear" being the theory) and I took some training in 8086 architecture and coding. It didn't take long to realize that the original x86's indirect way of addressing memory - segment and offset combined to reach into physical memory - would lead to many bugs. At that point I decided to drop out of writing assembly language and went to C for stability reasons.
I did have to get back into assembly language later, and since I'd learned Intel's 8080 code way back then it was easy to get back into the flow of things. Memory addressing has improved and you have much more processor to work with. That said, I think Intel's early x86 architecture cost me many late nights and probably 10 years of my life. Between that and Windows reboot time I would probably be 20 years younger. :-)
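For anyone who never had the pleasure, here is a minimal sketch in C (purely illustrative - the helper name phys() is mine, not anything from the toolchains of the day) of why the real-mode segment:offset scheme was so bug-prone: the 8086 forms a 20-bit physical address as segment * 16 + offset, so many different segment:offset pairs alias the same byte, and naive far-pointer comparisons lie to you.

#include <stdio.h>

/* Real-mode 8086 address translation: shift the 16-bit segment left by
 * four bits and add the 16-bit offset, giving a 20-bit physical address. */
static unsigned phys(unsigned segment, unsigned offset)
{
    return (segment << 4) + offset;
}

int main(void)
{
    /* Two different segment:offset pairs that name the same byte. */
    printf("1234:0005 -> %05X\n", phys(0x1234, 0x0005));  /* 12345 */
    printf("1000:2345 -> %05X\n", phys(0x1000, 0x2345));  /* 12345 */

    /* Comparing the raw seg:off bits says these "far pointers" differ,
     * even though they point at the same location - one of the classic
     * sources of real-mode bugs. */
    return 0;
}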
The 68000 is just a more-or-less revamped PDP-11, though this is not to belittle it, as the PDP-11 is one of the greatest (if not the greatest) CPU designs ever IMHO.
The 68000 hit the streets in 1979/80, already behind Intel's 8086/8088, which had come on stream from 1978 and were therefore more widespread.
It's a shame, as the Intel legacy has been, and continues to be, such a millstone that nobody can get rid of. Intel tried twice to replace it, once with the iAPX 432 and then again with Itanium. Both failed miserably, having cost huge amounts.
Worse still, the x86 family didn't get any better until the 386. The 186 was the all-in-one system-on-a-chip that should have been the obvious choice for building PCs, but it was as flawed as the 286, which IBM did use in its PC-AT. Its virtual memory model was broken and only fixed to become usable in the 386 - the days of memory extenders were finally over.
But by that time the software investment was too great to switch - only the non-MS world could take advantage of better processors or build their own - Apple, Apollo, Sun, Stratus, Tandem, PERQ to name but a few, now all fallen by the wayside.
Although the 68000 family did eventually get a proper virtual memory co-chip, its arrival was too late to stem the wave of the IBM PC and its clones.
The path to hell is paved with good intentions …
"The 286 could switch to protected mode.....but had no way to switch back. "
IBM came up with a hack to work around that ... they essentially used the keyboard controller to reset the CPU, with a shutdown flag stashed in CMOS and a resume vector parked in low memory, so that on the way back up the BIOS skipped the full POST and jumped straight back into your real-mode code (the same 8042 controller also gated the A20 line). Ugly, ugly, ugly, but a very functional hack ... IF you used IBM's proprietary hardware. It also worked on a few clones, but was hit & miss in clone-land.
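From memory the dance went roughly like the sketch below. Treat it as a hedged illustration only, not the actual BIOS code: the port numbers and CMOS shutdown byte are as documented for the PC/AT, but outb() here is just the Linux-style helper from <sys/io.h> standing in for what the BIOS did in assembler, the function name return_to_real_mode() is mine, and the exact shutdown-code value varied.

#include <sys/io.h>                /* outb(value, port) helper - illustration only */

#define CMOS_INDEX     0x70        /* CMOS/RTC index port */
#define CMOS_DATA      0x71        /* CMOS/RTC data port */
#define CMOS_SHUTDOWN  0x0F        /* CMOS shutdown status byte */
#define KBC_COMMAND    0x64        /* 8042 keyboard controller command port */
#define KBC_CPU_RESET  0xFE        /* 8042 command: pulse the CPU reset line */

/* Leave 286 protected mode the only way possible: reset the CPU and
 * arrange for the BIOS to jump back to us instead of re-running POST.
 * Assumes the real-mode resume address has already been stored at
 * 0040:0067 in the BIOS data area. */
static void return_to_real_mode(void)
{
    /* Mark this reset as a deliberate "return to caller" reset. */
    outb(CMOS_SHUTDOWN, CMOS_INDEX);
    outb(0x0A, CMOS_DATA);         /* one of the "jump via 0040:0067" codes */

    /* Ask the keyboard controller to yank the CPU's reset line. */
    outb(KBC_CPU_RESET, KBC_COMMAND);

    for (;;) { }                   /* spin until the reset takes effect */
}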
Why, in this day & age, do you care what instruction set your CPU is speaking and whether it's pretty or not? C/C++, Java, Python (barf!) code all look the same irrespective of what CPU architecture you are targeting, unless you're the poor blighter who is writing the compiler backend or JVM. Is anybody out there still handcrafting x86 apps in assembler?
@Sceptic_Tank
Have you heard of Spectre and Meltdown? Those vulnerabilities are rooted in the "architecture"... so yes, it might be useful to "care what instruction set" or to care what "CPU architecture" is under the hood of your nearby (or not so nearby) computer.
Some of us are still using assembly language. The 68000 matches the architecture of Unix and C. People who grew up on Unix and C like to use an architecture that matches their expectations: partitioned memory (as on IBM mainframes) does not match that expectation, and causes cognitive dissonance.
Because when you're building a house you don't start with wobbly foundations...
If your processor has an elegant instruction set, addressing modes and so on, then your lower-level code has to do less flapping around behind the scenes to give you the impression of a stable foundation on which to build your device drivers, OS and application-layer code than it would on something whose instruction set was clearly designed by a sadist who hated programmers.
And for those of us still working in sectors where our highest level code remains rather closer to the metal than anything the average desktop coder will ever write, having a processor which isn't fighting the compiler at every step of the way is all the more important in being able to produce good quality, efficient code. Some of us really do still care about this sort of thing, and earn a decent living out of being able to write code that performs well on resource-limited hardware.
..when compared with the 64K ROM code documented in the Inside Mac 3 ring binder / "Telephone Book". There is a System 6/7 source code tree out in the wild if you want to do a comparison.
Apart from QuickDraw in the Lisa Workshop docs I don't remember much of anything on the Lisa making much sense at the time. And I had read Adele Goldberg's Smalltalk-80 book. Whereas Inside Mac - the loose-leaf binder, the telephone book and then the final 3-volume set - made perfect sense. Shout out to Caroline Rose for writing the best tech docs ever. The revamped version of IM about a decade later lost all the clarity of the original.
For those trying to make sense of the files: it's written in Pascal with some 68k asm. Unless you have a full Lisa Workshop toolset (running on an emulator), the "unix.txt" files are the body of the source. TK (Toolkit) is where most of the meat is. If you know your way around Smalltalk-80 the class organization will make some sense. Apart from the QuickDraw code, the ability to recite Inside Mac from memory will be of little help in working out what the code is trying to do in most places, as it's all very Smalltalk-y. And even then it was only the Lisa QuickDraw Std calls code and below that made it into the Mac 64K ROM. All the high-level QD API wrappers in MacOS were written in 68k.
So the Lisa code. It looks inside like what it looked like outside: a bit of a mess. An honorable first attempt, but the MacOS team did a much better job - a fantastic job - of creating a usable API which us developers could use to create all those very innovative applications of the next decade or so. And the MacOS team's code was much cleaner too. When the whole 64K ROM disassembly was IL'ed to another machine using MacsBug, with IM in hand, it was soon pretty easy to see exactly what was going on line by line. Except in the small areas of hand-optimized Pascal code. Like the Dialog Manager. And the QD guts. Useful to know when single-stepping in MacsBug through the mysterious land above $400000. Which I spent many hundreds of hours doing back then.
Saying that, all our Lisa Workshop docs binders joined the Lisas and ProFiles in a skip out back of the office building in the summer of 1986, when MPW was stable enough to build our complete product code base and we junked the Lisas as build machines for our Mac product, with a huge sigh of relief. A very happy day.
I'm afraid the only good thing to say about the Lisa is that all the mistakes made during the Lisa dev project, and the lessons learned, are what made the Mac that shipped in 1984 such a great product. And such a joy to develop for and on.
A worthy epitaph in itself.
People keep hinting that Apple stole it from Xerox. However, Xerox had invented these things but didn't quite know how to market them effectively, so they made a deal with Apple: Apple would see demonstrations of it all, and in return Xerox would be allowed to buy a big juicy stake in Apple before the upcoming Apple IPO and make a lot of money.
During the demos Steve Jobs raved to both the Apple and Xerox people about how great it was and what could be done with it, and made it clear he intended to do exactly that. There was no covert theft, just an open business deal that Xerox got somewhat less out of, especially because they gave the technology away (well, sold it for the right to buy cheap shares) rather than make a licensing deal, and because they didn't keep the Apple shares.
I seem to recall that Lisa was the first WIMP machine to hit the market; Xerox had not understood the value of their research, and never really cared about their workstation anyway as it was not their core business.
Also that the M stood for Mouse not Menu - one of the Apple "things" that they pushed hard on the subsequent Mac was that you didn't need menus any more, you just clicked icons which opened more windows with more icons... and for that you only needed one mouse button, period.
But poor Lisa was not a going proposition. One of its less amicable features, that Microsoft would take on board in a very big way, was that it barely had enough RAM to run the OS. It didn't really matter whether there were any apps available, 'cos you couldn't run them anyway. It was really just a statement of intent, sold as a taster of the future to a few well-heeled punters/researchers who gagged for that taste.
No, the PERQ was earlier:
https://en.wikipedia.org/wiki/PERQ
"In June 1979, the company took its very first order from the UK's Rutherford Appleton Laboratory and the computer was officially launched in August 1979 at SIGGRAPH in Chicago. It was the first commercially produced personal workstation with a Graphical User Interface.
(I wasn't the miserable sod who downvoted you!)
Thanks, I guess my title reflects on me as well. I forgot that Xerox finally woke up and smelled the coffee with the Star. But it was too expensive and too late.
I am not sure in what sense the Alto and PERQ were "personal". Yes there was only one user at a time but the furniture was built around them, you couldn't just pick up the bits and put them somewhere else. Much like the PDP-11 I used during the mid '70s. You could have cut that in two at desktop height, stuck the halves either side of a keyhole desk and plonked the line printer on top. But nobody would have described it as "personal". IMHO that is retro fanbois bending the meaning of the word a little too far.
Don't worry about the downvoter, there's always one. I sometimes wonder if some saddo I once upset is following me round and downvoting everything of mine he finds. I hope not, there are better things to do with one's life. Anyway, you seem to have picked up two of them.
I saw Xerox Stars in the wild, and Altos at trade shows, but I honestly never remember seeing a PERQ anywhere. Or anyone even paying it any attention. Must have been an academic thing.
The only mention I can find in contemporary magazines is a SIGGRAPH report in Creative Computing from March 1982 mentioning that the PERQ had been announced a few years previously but was only now starting to be actually delivered. In early 1982. So the Star shipped first, and the Lisa was less than 12 months later. Byte magazine was pretty thorough in its coverage of the more arcane areas of computers back then and even they seemed to have missed it.
At the time, late 1970s / early 1980s, when it came to GUI software Smalltalk-76 and Smalltalk-80 were doing all the running. I think I remember Smalltalk-76 getting some coverage in places like Creative Computing, Byte etc. around 1978, but the big breakout was in 1980, when GUIs and object-oriented programming in Smalltalk (and other languages) were everywhere. 1980 was the year of GUIs and object-oriented programming. The special issues of both Creative Computing and Byte that year give a good feel for what was going on at the time. But the first really complete discussion was in Adele Goldberg's Smalltalk-80 book. I remember picking up my copy in Foyles right after it was published in early 1983.
Heady times. We'd come a long long way from the Altair 8800 on the cover of Popular Electronics in 1975.
And the problem with the H11 when it came out was that all you basically had was almost, but not quite, a PDP-11/03 CPU unit. It was all the other stuff in the rack that made the PDP-11 so useful.
I remember talking about this very subject in 1979 at a research institute that was a Data General house - DG Novas and Eclipses, though a lot of the rest of the kit was DEC, DECwriters etc. I'd been DEC since 1975, PDP-8s, 11s etc., and asked about the costs of going all DEC, mentioning the H11 as a sign of the future. It was a very educational discussion with the guy who ran everything that had 220V running through it. For me. The CPU unit might have been the single most expensive item in the catalogue, but when you set up the full installation the researchers needed it was maybe 10% of the total cost. The H11 might be in the running for a remote site, and the QBus support was nice, but by that stage early PDP-11s were starting to come onto the used market at reasonable prices, and if you were an institutional buyer you could usually swing a good deal with one of the brokers.
So the H11 was incredibly niche. And I remember its sales numbers at the time reflecting that. It was like the 370 "mainframe" ISA cards for the IBM PC: amazing technology but desultory sales. Now the Apple II I first got my hands on in December 1977, that was something that looked like it had a huge future. Even if it soon sprouted a whole bunch of custom in-house cards, which meant the top lid could no longer be attached and the 9-inch monitor had to be perched on a box of line-printer paper off to one side. A pain in the neck, literally.
"So the H11 was incredibly niche."
You're looking at it wrong. The H11 was a home computer in the days before home computers. I used it to learn PDP11 assembler, Fortran and COBOL (and later C). At home. In my own time, whenever I had a hankering. In a period when all my peers had to go to the school to manage a few tens of minutes of keyboard time once per week (twice if they were lucky). And they didn't have hardware access, nor ability to use the Monitor. I was allowed to crash it to my heart's content, then fix it, and break it all over again. It was a learning tool, and a good one.
Did I mention I bought it in kit form? I laid out the traces and boiled the boards as required, and then installed and tested every component of that system. I know how it works, down to component level. (Yes, "works". She still runs. Not too bad for a near 50 year old home-built pseudo-hobby box.)
I already owned a Model 33 ... But yes, I paid extra for a glass tty. Yes, I paid extra for more RAM. Yes, I paid an arm and a leg for dual 8" floppies. Yes, I paid extra for a card reader/punch. Yes, I paid extra for a paper (mylar) tape reader/punch. Yes, I paid extra for DECTape. Yes, I paid extra for the I/O cards required. I even added a HDD and mag tape to it, eventually. Etc. Etc. Etc. Most of the above was bought used, but in excellent condition, for pennies on the dollar.
I also had free access to everything that DECUS offered ... and the minds at the Homebrew Computer Club.
IMO, it was the best computer education I could have possibly received in that time ... I still use the basics I learned with that system every time I walk into a modern datacenter. To this day I'm still of the opinion that that era's DEC kit remains the best platform to teach real-world general purpose computing. Shame they squandered the franchise.
But you were very much niche. No matter what a fantastic education it was for you personally.
The numbers for the S-100 kit guys told an interesting tale. At places like IMSAI and North Star the pre-assembled units soon outsold the kits at least 10 to 1, and usually by a lot more, with the kits discontinued not long after. That market was pretty much gone by the end of 1978. Because you might have got the hang of soldering, but most people did not. I had access to great electronics kit and to people who had been doing it as part of their job for decades (family), the best teachers, but honestly, I wanted to write programs, not do physical assembly work. So building from kits had zero attraction.
As for learning about how the hardware worked: that was done using the chip docs and spending many hours poring over motherboard schematics working out what did what. Pretty straightforward, as most of the chips on the boards were 74LS parts and the CPU support chips were very basic. Although the Apple II schematic was a real head-scratcher, given Woz's eternal quest to save chips and use every last pin, no matter how strange the trace contortions. That foundation in what did what on the motherboards of the early S-100s etc. means that not only do none of the modern CPU hardware manuals hold much mystery, but neither do the northbridge / southbridge chips and related support-chip manuals. Just many orders of magnitude more complex, but the same basic building blocks.
I was always a software guy. Who just knew how the bare iron worked, all the way down to the gate level. So I started with PDP-8s / 11s (with total access, not timeshare), then DG minis and micros like the Apple II. Apart from futzing around with an 1802 board, my micro world was fully assembled PCs from late 1977 onwards, starting on one of the very first Apple IIs in Europe. Some IBM PC related stuff when it came out, but still mostly Apple II until 1984, when I tagged along to an Apple launch party for the Mac with a technology journalist, as his guide to what exactly he was seeing. And after a few minutes of using the Mac (unlike with the Lisa) I was certain this was the future. The future we were promised in 1980. And my future. And it was. And two years later I was wandering around Cupertino on De Anza looking for the right building to meet the MacOS team. Or, as it turned out, what remained of the original team.
Interesting times.
"But you were very much niche."
Again, you're looking at it wrong. You're pretending that "niche" is somehow a bad thing ... The reality is that ALL personal computers were niche in that era. It wasn't until Compaq released its first Deskpro (late 1983?), followed shortly by everyone+dog entering the market for a slice of the pie (with a nod and a raised glass to Phoenix Technologies), that microcomputers (now generically called "PCs") came into more general use.
I'm talking the actual microcomputer / PC business from 1975 to 1983. When the Lisa shipped. The business I've been involved with or worked in pretty much from the start. Which was a completely separate universe from the mini world back then. Starting with the base price point. $10K v $1K.
By the time Compaq entered the business with the first really compatible PC clone in early 1983, the business had been through at least three or four generations. The first generation was self-assembly, mostly S-100; it ended when the first Apple IIs, Commodore PETs and TRS-80s started shipping in late '77 and kicked off the second. But the third generation only really got going with VisiCalc in 1979, when ordinary people started buying microcomputers because they were really useful rather than because they were fun to play with. I did not really understand what a big change had happened until the day I was in a computer store, late '79, when a guy walked in - a small business owner - who asked which computer ran VisiCalc. The computer shop guy pointed at an Apple II, the business owner asked how much, and wrote a check for around $3K on the spot. Yeup, we were no longer in hobbyist / experimenter territory any more. The fourth generation very much started with the IBM PC, when the business split into the home and business markets in 1982. And the Lisa foretold the next generation, which started the following year in 1984 with the Mac, as a viable software / hardware peripheral market.
So the H11 was very niche, because the PDP-11 was very much early / mid 1970s and from a world very far from the one in which micros lived. By the late 1970s VAX was where the action was in mini-land. So I remember reading the first review of the H11 in 1978 and thinking, that's cool. But that's all. I was a little surprised that DEC allowed it to be released at that price point, so my guess was Ken Olsen did not think it was going to ship many units. DEC was always touchy about potential cannibalization of sales.
To be perfectly honest the PDP-11 was kind of old hat by then. The Am2900 bit-slice was where the CPU hardware action was at the time, and it was dev boards for that processor that kids like me were lusting after. And the computer we all wanted that year was the Exidy Sorcerer. The Apple IIs I used were more practical, but the Sorcerer was what I was trying to swing that year. It did not work out. Pity.
In the 90's I had a couple of PERQ computers. They had a GUI, and a pointer driven interface, running a version of Unix.
They were dated 1979 (and no, I didn't get them new! I got them for about 50 quid from Manchester University in the early 90s)
https://en.wikipedia.org/wiki/PERQ
"In June 1979, the company took its very first order from the UK's Rutherford Appleton Laboratory and the computer was officially launched in August 1979 at SIGGRAPH in Chicago. It was the first commercially produced personal workstation with a Graphical User Interface.
Ha ! I remember those - I was an undergrad there at the time.
They turned up from the Science Research Council (before Engineering crept in) and were followed by a couple of Sun-1 workstations and a Sirius PC.
I think they were originally billed as Smalltalk machines, though I am not sure how they ended up after ICL got to them. Though it could be that the Computer Science department wanted them for different stuff than the other users did …
(Bit late to the story, but...)
> I think they were originally billed as Smalltalk machines
PERQ = Pascal Engine Running Quickly
They were built from the Pascal engine hardware derived from work on UCSD Pascal (which Apple ][ users may be familiar with) and its p-code.
I worked on one for a year in the early 80's and everything we had on it was in Pascal. Pretty sure that included the OS (may have been Unixy but not actually Unix).
Lisa had exactly the same problem that the Apple III had. There were too many "proper programmers" and "proper engineers" on the teams. Lots of people with advanced academic qualifications and no real practical product experience.
The "proper" programming language used by "proper" programmers was Pacal. So lets write it in Pascal. Even though resulting pcode executable was huge and slow and the compilers completely undependable.
Programmers with real-world experience back then wrote in asm. A few years later, when we had more memory and faster processors, it was C. And they knew all about code overlays and table languages and CPU bugs etc. etc. And knew the clock-cycle execution time of each instruction.
Lisa software was written in Pascal. MacOS software was written in 68k asm, except for some borrowed code, which was hand-optimized Pascal compiler output. Easy to spot in the ROM because it always started with a blanket MOVEM and a LINK A6. When you counted instruction clock cycles you used instructions like those sparingly.
The difference between the Lisa hardware and the Mac hardware was exactly the same story. The "proper engineers'" end result was a mess - I always remember lots of jumpers on the Lisa boards. Burrell Smith's result was a work of art. If it had not been for the dodgy flyback transformer (due to a supplier issue) on the analogue board it would have had a perfect track record.
I've seen far too many dev projects ruined by people who read far too many academic books like Jackson's and have little or no real-world practical expertise. Many projects. Whose failure killed several products. And a couple of companies.
Beware of academics bearing software panaceas. Because they never ever work in real life. Guaranteed.
Let's have a look at the single biggest Rust project's market share since 2012? Catastrophic decline. Just correlation, or is it causation?
If someone can use Rust to get a project out and shipped, good for them. But it's just another stupid language created for the usual very stupid reasons. Because the people involved did not have the technical expertise to refactor and rewrite a very successful product's codebase. I know, let's create a new language because we don't know how to use existing languages to solve the same old problems that happen with large conglomerate codebases. Spaghetti time - so let's build a new kitchen. But look, there are such cute little knobs on the front of the stove. So sleek. So Moderne. So UpToTheMinute.
Let's see. Recent languages I have signed up to because they solve real-world problems. Swift: great, no more Objective-C stupidity, and it's quite sane too. I'm there. Dart and Flutter: cross-platform mobile, a bit big for a JS framework but not quite complex enough for native; they look like they could work for some projects. And so on. Rust? Who cares. Same goes for Kotlin. And so on.
Life's too short to waste time on yet another crank language. Which is most of them.
So yes, I have Opinions on the subject of Rust. Which I may have aired on occasion in the past.
In my experience the book is utterly practical. There is no heavy jargon or pseudo-intellectual nostrum making. It really does provide principles for problem analysis and solution synthesis or design. Code becomes more economical, less buggy or even bug free, clearer and more maintainable. It is language independent.
If it helped you become a better programmer and produce better results then that's a real win. You take them where you can.
But I just remember it as yet another book that gave absolutely no useful guidance for the type of software I was writing then. And now. Which is mass-market shrink-wrap end-user software on desktop (and now mobile) platforms. Which ships in very large volumes and will be run on a very wide range of heterogeneous hardware.
If you are writing a wordprocessor or desktop publishing program, or a video editor, or a video game, etc, books like the Jackson book are worse than useless. In my experience. For my end of the business. Which is writing the software that most people use on desktops and mobile platforms. These are architecturally complex application codebases running in very complex heterogeneous runtime environments. A programming universe for which none of these type of books are of much utility. The subjects they cover are mostly, well, academic. A minor part of the world I have to deal with.
It's just like Knuth's "The Art of Computer Programming" books, which academic types just love. Not once in the last four-plus decades have I found anything of practical use in those books. Not even a good kicking-off point, a germ of an idea, for how to solve a real-world problem. Whereas books like Sedgewick's, as well as Binstock / Rex etc., have been a life-saver on a number of occasions over the years. Books like the Sedgewick books are worth their weight in gold. I've done a serious amount of DSP-related work as well, and about 90% of the books on the subject are very academic and pretty much worthless. The other 10%, again, pure gold. To someone trying to write real-world code that works. Which ultimately is the whole point of the exercise.
So it very much depends on what type of software you have to write, how complex it is, what environment it has to run in, and just what problems you have to solve to get it out the door. At least in my world those kinds of books have not proved helpful in the past, and the people who do take them seriously are, in my experience, the sign of a project in serious trouble and unlikely to ship. And if it does ship, the result will be lackluster and unsuccessful.
Which is basically the story of Lisa v MacOS. The academics v the hard-scrabble pragmatists. The pirate flag on the Bandley building was more than symbolic.
I remember reading reports (or rumours) that you could drag the computer icon into the wastebasket, at which point the machine would crash.
I saw my first Lisa a few months later at an exhibition. It was being used in a control application.
So with my hand on the mouse I asked "Is it true that if you drag the computer into......?"
At which point I was physically restrained and almost thrown off the stand.
Hah, once at a show I was poking about at a thrilling new NeXT machine, with its Mach microkernel and all that interesting stuff. There was a normal non-privileged shell open for having a look around.
There were many processes with PID -1 and interesting names, obviously the privileged microdaemons that did all the kernel work. I couldn't help myself: I typed "kill -9 -1" and nothing ever happened ever again.
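(For anyone wondering why that was so terminal: in POSIX kill(2), a pid of -1 means "every process I'm allowed to signal", so from a sufficiently privileged shell it amounts to SIGKILLing the whole machine. A minimal illustration, not NeXT-specific code - the function name is mine:)

#include <signal.h>
#include <sys/types.h>

/* kill -9 -1 from a shell boils down to this call: pid -1 means "send
 * the signal to every process the caller may signal", and SIGKILL
 * cannot be caught or ignored. Run with enough privilege, that takes
 * out the window system, the daemons, everything. */
int nuke_everything(void)
{
    return kill(-1, SIGKILL);      /* please don't actually run this */
}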
Yes, the trash-the-Computer bug was true for the early-release Lisa. A bit had been set but not checked, from what I remember. As was the 63-region-transitions bug in the first release of MacOS in 1984. The only DeepShit Bomb Box crash from the Finder desktop I saw in 1984. (And we tried. Very hard.)
But my all-time favorite was the "DIR \ " bug in early versions of MS-DOS. At trade shows (and elsewhere), if someone was being an a*hole, typing those five keys locked the computer so completely that it had to be turned off and on again. Their befuddlement as they tried everything was a joy to watch. But it was only ever used on really unpleasant people with an attitude problem. Last used against a particularly obnoxious tool on a stand at Olympia in Earls Court in 1984.
So when Microsoft released some DOS source, they did it under the MIT license ("do whatever you want, just credit us").
When Apple let the Computer History Museum release the source code to Lisa OS 3.1, they wrote an original license that:
· Only lets you use and modify the software for educational purposes.
· Doesn't let you share it with anyone else, in any way, not even with friends or from teacher to student (although technically you could still distribute patches you make for it).
· Implicitly forbids you from running it on hardware you don't own.
· Forbids you from publishing benchmarks of it.
· Gives Apple a license to do whatever they feel like with your modifications, even if you keep them to yourself and don't publish them.
· Lets Apple revoke the license whenever they feel like it.
· Forbids you from exporting it to any nation or person embargoed by the USA (moot, since the license doesn't let you share the software in any way).
Why Apple feels the need to cripple the use of 40-year-old code is beyond me. Especially when they have released a lot of the code for their current OS and tools under the popular and well-understood Apache License 2.0 or their own APSL 2.0, neither of which impose these arbitrary restrictions.
So put Xenix on it instead.
No, that is not Microsoft code. It's AT&T's bog-stock PDP11 UNIX Version 7 source, as ported to the Lisa by SCO[0].
Runs great on mine.
[0] No, not the SCO of insane litigation. Not really, anyway.
My first view of a Lisa was greeted by an even more terrifying price tag, thanks to New Zealand's 60% import duty AND 40% sales tax on computers, plus "exclusive import licenses" meaning that companies could (and did) charge whatever they wanted as markup.
Even into the 1990s, long after both the punitive duty and the sales taxes were abolished (replaced with 10% GST), it was cheaper to fly from Auckland to Los Angeles first class (staying there for a week) AND buy a top-end Mac than it was to buy one in New Zealand.