Not for profit
I often wonder if the management of Xerox ever realized quite how special PARC was, or how much money other firms would make from their work.
Xerox is donating the iconic Palo Alto Research Center (PARC) in California to SRI International, a non-profit scientific research institute, apparently so the company can focus on other stuff. Xerox PARC has acquired an almost legendary status in the annals of the IT industry as it directly contributed to key technologies …
Similar to Philips' Natuurkundig Laboratorium (NatLab) research facility, it seems that as time went on and the company got bigger, shifting (or lacking) focus and diminishing budgets, coupled with a failure to properly monetize what came out of the lab, led to management failing to understand or keep what they had. The magic works until it doesn't, and once it's lost it's gone. I work in the area around Eindhoven, in a company with many, many direct ties (both in people and in what we do) to the NatLab, and many people are still nostalgic for it, similar to PARC. But given how both the NatLab and PARC ended up, I think it really is a case of the magic having died long ago, and attempts to revive it have not been very successful so far.
Philips Research Labs (PRL), formerly Mullard Research Laboratories, was the UK arm of the NatLab. I don't recall too much of use coming out of them. The great exception was the patent for radar doppler control of temporary traffic lights (Patent US4053887A). Every time I approach road works and the traffic lights quickly change to let me through, I say "thank you Ken!". He has saved all of us so much time.
The management certainly didn't realise how much money they could have made themselves. Once others were making the money I don't think they'd have dared realise how much they lost. The risk of shareholders suing them into oblivion would have been too great if anyone had put a number on it.
There was a lot of random innovation going on and no time travelers to say which one was right. Early GUIs needed expensive hardware and time-consuming coding.
The PARC/Mac GUI was ported to the Apple ][ series and it was a failure. That GUI needed huge investment and marketing to get going.
Oh, you mean the guys with a market-share of basically zero.
I remember looking at it around 1990. RISC OS 2. It looked and acted like pretty much every other X-Windows based GUI floating around at the time in the US. And there were quite a few. Not to mention the OS/2 Shell / WPS, and a bunch of other ones. Quite separate from the five or six with actual market-share.
Maybe to those with little exposure to the several dozen GUIs / shells kicking around from the mid 1980's to early 1990's, RISC OS might have been impressive, but to those of us who actually installed / used / wiped all the most important ones, about 15 or 20, RISC OS was no big deal. We were developing commercial GUI applications, and yet another GUI that was pretty much the equivalent of the Morgan car - well, who cares. Let's move on to something with a potential commercial future.
The most coherent GUI to develop for at the time was the MacOS. The most incoherent was Win 1.x/2.x/3.x by a wide margin. With OS/2, GEM, Amiga etc in between. There were some interesting ideas in the OS/2 WPS, and easily the best trade show demo GUI was NextStep. But don't try to develop non-trivial software for it. Pure Win 2.0 clown-show time. Although NextStep 3.0 was almost usable.
The GUI world back then was very different for those of us who had to write and ship application software. Not at all like the world of the software Nigels in their anoraks and their "Best GUI Ever" pontifications. Totally Comic Book Guy.
"I remember looking at it around 1990. RISC OS 2. It looked and acted like pretty much every other X-Windows based GUI floating around at the time in the US."
It really didn't. I was there too, doing UI research work in the UK (IBM UKSC and even at Martlesham!) and the US (ParcPlace! Interval Research! DEC WRL! HP labs!) and so on. Quite aside from anything else, RISC OS was significantly faster than any X-based UI, and was at least internally self-consistent - something none of the X-Windows systems seem to have mastered even today.
Well I wasn't doing research in the late 1980's. I was leading dev teams *shipping* GUI shrink-wrap mass-market application software for the MacOS. In the US. With direct involvement with the Win16 and OS/2 product dev teams. And reviewing all viable GUIs as port candidates. A very different world from the semi-detached-from-reality research universe.
As we were market leader, all the big guys wanted us to port to their better-than-sliced-bread shiny new GUI. None were viable. As for X-Windows land, I had a very good idea what was going on at the time, as one of the places I almost ended up working for - and where I already had many good friends - had pretty much every new workstation available at the time (both released and prototype). Again because everyone wanted their software - which was very graphics-intensive - ported to the shiny new boxes. I remember a good friend joking that the various workstations sitting on and under his desk cost more than his house. Hint: I got to hang out a lot at their great Friday afternoon pizza parties. In Sausalito.
RISC OS was faster - who cares? Its market share was zero. Unlike the various workstations at the time: smallish unit sales, very high revenue. The only people running big X-Windows applications at the time were on the fastest hardware, starting at $15K/$20K plus (up to the sky's-the-limit multi-$100K's), and thought nothing of shelling out another $10K/$20K just for a faster graphics card.
I'll agree with you on one thing. The X-Windows implementations are still the train wreck that made me laugh so much the first time I read the Burgundy book in the late 1980's. They took a book that thick to spec a windowing system that was basically just the MacOS Window Manager with remoting - whose spec / programming manual ran to less than 30 pages. And the only reason the MacOS Window Manager could not be easily patched to add remoting in the 1980's (like VNC later) was that AppleTalk in those days had reliability issues with large binary transfers. CopyBits would have been a performance killer. That's all. By the 1990's, no longer an issue, as it now had Ethernet.
So yeah, I think the Unix Haters Handbook got it pretty much right on the subject of X-Windows.
Apple thought it was necessary to update GS/OS all the way up until the Apple II GS's exit from the market, so it can't have been that much of a failure. The reason for GS/OS existing was probably that the Commodore 64/128 had GEOS, and the Amiga and ST cost about the same as an Apple II GS, so it really had to have some kind of GUI to capture hardware sales from people who were looking for a computer with one.
The Apple II GS was the much later (1986) 16-bit version of the Apple II though, and more in the ballpark of the Atari ST and Amiga. (*)
Not really the same thing as them getting it to run on the original 8-bit, 1977-level Apple II or anything close to it.
(*) Apparently it was hobbled to avoid competing with their own Macintosh, and neglected in favour of the latter, but it'd still have been *far* more powerful than the original.
> the Amiga and ST cost about the same as an Apple II GS
At launch the II GS was substantially more expensive than an ST* but cheaper than an Amiga, since the Amiga 500 wouldn't turn up until the following year. It's also worse than both of them at everything except audio, at which it is fantastic.
* the monitorless 256KB II GS launched after, and at the same price as, the monochrome-monitor-sporting 1MB 1040ST.
Half the world market for Apple was the US and by the time the GS was shipped it was just milking the Apple II market as it wound down.
I remember talking to some very angry Apple II developers at MacWorld around that time (1986/7) about the fact that Apple was telling Apple II devs with successful products to port to the Mac, as the GS was the end of the line for the Apple II product family. Sales of the GS continued into the early 1990's, but only because of the huge internal battles in Apple about shipping a low-cost Mac range vs keeping the high-margin models. The LC ended that battle in 1991, and the last of the GS's, IIe's, IIc's etc shipped soon afterwards. Although they had already completely fallen off the radar by 1988 - which is the last time I saw one at a trade show or in a computer store. I only saw them after that in company QA labs.
The US market was very different from Europe, and especially the UK. By the late 1980's in the US it was basically mostly IBM clones, with the Mac taking the rest. The cheap PC's (mostly Commodores) were for playing games and a pretty insignificant part of the market, revenue-wise. A complete reversal of the UK, where disposable income was so much lower, so very few people at the time had home PC's that were IBM clones, and Macs were as rare as hens' teeth. But it was wall-to-wall Commodores / Ataris etc. The next big change in the US was in 1994 when Sony shipped the PSX console, which turned consoles from a niche market into the dominant one very quickly and pretty much killed the cheap home PC market in the US. After Win 95 shipped it basically became the WinTel PC market, with the Mac as a minor afterthought. The last nail in the coffin for the Mac came in 1997.
Yes, the UK and Western European market were distinctly different to the US during both the 8 and 16-bit eras. In the UK at least, the Apple II was nowhere near as big, and the PC didn't *really* take over until the early 90s.
Amstrad's PC-1512 et al were the first really popular PC clones in Europe during the mid-to-late 80s. And even then the ST and later the Amiga remained more popular for non-business use until circa 1992-93 when rapidly-improving PC specs- along with the Mega Drive and SNES on the low end- finally displaced the latter.
As for the Apple II, while that wasn't unknown on the UK market- my Dad used one in his job in a hospital during the 80s- I get the impression it was never as big a deal here. Maybe it remained big in the US even after better and more affordable machines were out because it had already established a user base and support network early on?
Maybe the slight delay in launching, the mono-only PAL display (*), more local computer markets back then, import pricing issues(?) and less disposable income prevented it gaining that toehold in the UK/Europe?
Another difference was that the early 8-bit home computers were never displaced by the NES to the same extent they were in the US. (In Europe the NES was outsold by the Sega Master System, and even *that* didn't come close to the NES's US market dominance.) It wasn't until the 16-bit Mega Drive/SNES era that consoles dominated more.
(*) AFAIK, the trick that Apple used to generate "artifacted" colour on NTSC systems didn't work as well with PAL or SECAM, and European versions of the original Apple II line shipped as mono only.
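For the curious, the heart of the artifact-colour trick is a frequency relationship, which can be sketched numerically. The subcarrier figures below are the standard broadcast values; treating "integer ratio to the subcarrier" as the whole story is a deliberate simplification of what the hardware and the TV's decoder actually do:

```python
# The Apple II hi-res dot clock was locked to exactly twice the NTSC
# colour subcarrier, so a given bit pattern sat at a fixed phase offset
# from the colourburst, which an NTSC TV decoded as a stable hue.
NTSC_SUBCARRIER_HZ = 3_579_545           # NTSC colour subcarrier
PAL_SUBCARRIER_HZ = 4_433_619            # PAL colour subcarrier
PIXEL_CLOCK_HZ = 2 * NTSC_SUBCARRIER_HZ  # Apple II hi-res dot clock

ntsc_ratio = PIXEL_CLOCK_HZ / NTSC_SUBCARRIER_HZ  # exactly 2.0: phase-locked
pal_ratio = PIXEL_CLOCK_HZ / PAL_SUBCARRIER_HZ    # ~1.615: no phase lock

print(ntsc_ratio)           # 2.0
print(round(pal_ratio, 3))  # 1.615
```

With the integer 2:1 ratio, alternating pixel bits form a 3.58 MHz signal with a fixed phase relative to the burst; against the unrelated PAL (or SECAM) subcarrier the phase drifts continuously, so no stable colour comes out - hence the mono-only European machines.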
By 1984 there was little difference between the business PC market in the UK and US. It was pretty much all IBM-compatible, one way or another. You mostly saw IBM, ACT/Apricot, Victor PC's etc in the UK. Although the Apple II was more of a business computer in the UK than in the US, due to VisiCalc etc and the Apple II sitting at a business-PC price point in the European market from the late 1970's onwards. For example, there were a lot of Apple II's running booking / accounting software in European small businesses in the early 1980's.
As for Apple II's and color. The first Apple II I used, a low serial number one in late 1977, drove a 10-inch video monitor over the RCA composite output. As did all the other Apple II's I used at the time. Pretty good composite-to-RF converters were already for sale in Europe by 1978. I remember seeing a small PAL color TV showing some Apple II game at the time. But as most Apple II's in Europe were bought by businesses, a monochrome monitor on RCA was the norm.
As for the NTSC hack in Woz's original motherboard (the schematics in the back of the manual were a bit of a head-scratcher, as he used every pin on every TTL chip to do something), I think the Apple II+ in 1979 fixed that.
Amstrad was very much a bottom feeder at the end of the 1980's / early 1990's. The same kind of buyer demographic as for Yugos. It may be a lot of worthless crap, but it has 4 wheels and a steering wheel and an engine that mostly works. Sort of.
The game console market in the 1980's was very niche. Even in the early 1990's it really was not that big. In both the US and Japan the game arcade business had far bigger revenue. By a very wide margin. The Sega Genesis was the first console (at least outside Japan) that started to mainstream game consoles in the early 1990's, but it was the 32-bit revolution started by Sony in 1994 that made them mainstream. As for the low-end < $400 PC's, I really don't remember seeing them in most US stores after the late 1980's, outside of places like Radio Shack, Sears etc. Places like CompUSA and Circuit City really did not carry them. I remember trying to find an Atari ST in 1988 to test some software and it was not exactly easy to find. Some small store in Berkeley had a few. Again, very different from the UK, where you would find Atari / Commodore etc for sale on every High Street and in the Argos catalogue.
I don't think you can dismiss Amstrad PCWs and PCs so lightly when they had two thirds of the UK market and a fifth of the European market in the latter end of the 80s.
Their Hi-Fis were terrible and people actively avoided them, but their computers had a reasonable spec at a fraction of the price of other manufacturers'. Their PCs also came with both DR-DOS/GEM and MS-DOS/Windows.
They only lost the market when other PC manufacturers finally lowered their prices.
The PARC GUI was originally implemented on custom computers like the Alto (and the Dorado, Dandelion, DandeTiger, etc) as a deliberate approach to seeing what could be possible with The Computer of The Future. They spent money to step 5 years into the future. So yes, expensive hardware originally in order to be able to do something advanced.
As for the code, well yes and no. It was, after all, Smalltalk. The time consuming part was working out what to do, since it hadn’t been done yet. The code for it was actually pretty simple. It had to be because there wasn’t much memory to play with on an Alto. IIRC it was 128kb including the screen memory. You can play with that original code at https://smalltalkzoo.thechm.org/ and see just how simple it was/is.
Of course these days a Raspberry Pi 4 can run Smalltalk around 20,000 times faster than an Alto, and can have 64,000 times as much memory, for a price equivalent to a diner lunch.
The original Xerox Alto came out in 1973. There wasn't a cat's chance in hell they were going to be able to mass-market it- or *anything* with enough power to implement a halfway plausible GUI- at a remotely sensible price. People forget just how primitive- and expensive- computers were back then, when even a few years could make a huge amount of difference.
AFAIK, the Alto was essentially a minicomputer dedicated to one person. Even some of the limited number that made their way out in the late 70s (i.e. *way* further on in terms of technological affordability) still cost over $100,000 in today's money.
This was several years before the first mass-market microcomputers (i.e. 1977's Apple II, TRS-80, Commodore Pet) came out full stop and a decade before the Apple Lisa and Macintosh. (*) Even the Altair 8800 wasn't yet out then.
Wasn't going to happen in 1973.
(*) And those came out around that time because it was only then that the 16-bit computing power needed to make it worthwhile was becoming affordable.
I should make clear this *wasn't* meant as a criticism of Xerox for not commercialising what was clearly intended as a future-looking research project.
Quite the opposite; it's a rebuttal to the commonly-expressed criticism that they failed to commercialise it, when there's no way it would have been practical to do so at the time or for many years to come.
Later on, possibly. Although apparently they *did* commercialise it when they launched the Xerox Star circa 1981 and it failed. Most likely because even *that* was a bit too early, and it was still very expensive (c. $50,000 in today's money, and apparently you really needed more than one machine and a print server to make it worthwhile).
The thing about the Apple Mac is that it came out at about the right point- a few years later- for the technology it required to be *just about* practically affordable (though not cheap, and apparently it didn't do too well early on).
I noticed that with the iPod too. They released it at *just about* the time the underlying technology was becoming affordable enough to make a device like that practical and worthwhile.
Yes, other MP3 players *did* precede it by several years- you could have bought (e.g.) a 32 or 64 MB MP3 player circa 1998. This would have held about one hour or album's worth of audio which you'd have to transfer over a slow RS232 cable every time you wanted to change it. Slower than changing a cassette, and no point having skip or shuffle or a library under those limited circumstances.
The things that made the iPod possible in the form it appeared- the fundamental changes it offered over existing disc and cassette players, such as vast amounts of easily-accessed music at your fingertips- wouldn't have been doable at that price much before then.
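As a rough back-of-the-envelope check on those late-90s numbers (128 kbps is an assumed typical MP3 bitrate of the era, and 115,200 bps an optimistic serial-port speed, with framing overhead ignored):

```python
CAPACITY_MB = 64            # a high-end late-90s flash player
MP3_BITRATE_BPS = 128_000   # assumed typical MP3 bitrate of the era
SERIAL_BPS = 115_200        # optimistic RS-232 speed, framing ignored

capacity_bits = CAPACITY_MB * 8 * 1024 * 1024

minutes_of_audio = capacity_bits / MP3_BITRATE_BPS / 60
minutes_to_fill = capacity_bits / SERIAL_BPS / 60

print(round(minutes_of_audio))  # 70 - roughly one album
print(round(minutes_to_fill))   # 78 - longer than swapping a cassette
```

So filling the player over serial took longer than simply listening to what was already on it, which is why those early devices changed so little about how people actually used portable music.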
For those interested in hearing from the people that *made* and *used* the Alto, consider joining the Computer History Museum's shindig tonight (April 26, 7pm PDT, see https://computerhistory.org/events/the-legendary-alto-and-research-at-the-edge) to hear from Butler Lampson, Alan Kay, Charles Simonyi, and others. I'd be there in person if I still lived in the valley.
The MacOS was not "ported" to the Apple II GS. The GS shipped with a windowing manager / basic GUI modeled on the Mac UI guidelines. It worked after a fashion but as almost no software was written for it and most GS's were just running Apple II CLI educational / business software it quickly died. By that stage all viable commercial GUI software was being developed for the Mac.
"no time travelers to say which one was right"
It didn't need time travel, it needed a bit of vision and a lot of application. Those are things senior management are paid to provide. Plenty of people had the vision about one innovation or another and succeeded. It was Xerox who had the best opportunity who didn't.
Xerox isn't the only company to fail to market a cool+useful thing their employees made.
I saw a photo of 3M employees holding a huge roll of e-ink "paper" they had made. Ten or twenty years after that, it started showing up in computer black-and-white displays (e-book readers, bus status displays).
Nobody has created what I hoped for from it: giant, completely-covers-the-wall black-and-white computer displays (even though they would have slow refresh), and, nearly-infinitely reusable electronic "paper" which has ultimate security: run it through the "erasing" machine, all the little balls inside twist to the "white" position, the "paper" is ready for re-use, and no data recovery from that sheet is possible.