Can we have at least one feature like this a day. Kthxbai.
Read a press release from Apple in the 1990s and it'll end with something along the lines of: “Apple ignited the personal computer revolution in the 1970s with the Apple II and reinvented the personal computer in the 1980s with the Macintosh.” All of which is true up to a point, but the statement does overlook the product …
"Can we have at least one feature like this a day."
An ebook of the best of these historical articles with edited comments (the ones where the protagonists come out of the woodwork and comment on the proceedings) would probably sell well. I'd drop at least a fiver on a Kindle edition...
“Apple ignited the personal computer revolution in the 1970s with the Apple II and reinvented the personal computer in the 1980s with the Macintosh.”
“Commodore ignited the personal computer revolution in the 1980s with the C64 and reinvented the personal computer in the 1980s with the Amiga.”
There, fixed it for you.
"“Commodore ignited the personal computer revolution in the 1980s with the C64 and reinvented the personal computer in the 1980s with the Amiga.”
There, fixed it for you."
You're fscking kidding, right? The C64 was good but it was in no way a pioneer. The Sinclair ZX80, ZX81 and BBC Micro beat it by a few years, even if you don't count the Apple machines as proper personal computers (and I've no idea why you wouldn't). As for the Amiga, it was a damn good machine but it didn't reinvent anything; it was simply an evolutionary progression. And you seem to be forgetting the Atari ST of the same era, which cleaned up on the amateur and professional music production side with its MIDI support.
"How many buttons for the mouse is a question that still rages today, but Apple’s testing on computer novices found that one button reduced confusion and eliminated the occasional glance to check the mouse buttons being used."
I wasn't aware of this, but I think an argument can rightly be made that the way a novice uses a computer isn't a good design guide for optimal use of a computer.
>Relying on people remembering or using experience isn't a good way to design a human computer interface.
They won't be able to learn quickly if they have to remember strange keyboard shortcuts, but if they do the task often enough then their 'muscle memory' will make it almost automatic. This is why I like menus as a training aid: the novice can select File > Save with the mouse, or to give their wrist a rest they can use Alt > F > S (or Alt > cursor keys), or when they are used to the system they can save time with Ctrl+S.
What I don't like about menus is when they get nested, and it becomes a test of mouse dexterity to select an item 3 levels deep... Oh well.
With each Mac purchase the first thing to be replaced was the mouse and keyboard. Even after Apple introduced the Mighty Mouse I still wasn't a fan. Sure, you could left- or right-click, but not both at once, because it was still effectively a single-button mouse. Having separate buttons is handy, especially for gaming: you could navigate by holding down the left button, then click the right button to do something else, such as shoot.
Yes and no.
Your first instincts are often correct. Relying on people remembering or using experience isn't a good way to design a human computer interface.
Take a swinging door, for example. If it only opens one way and there are handles on both sides, then your instinct is to pull it open, but on one side of the door this won't work. So really the side where you push should have a finger plate and the other side a handle.
You have to forget the simple applications we use like web browsers; their limited number of buttons and UI controls are pretty simple to work out. But look at professional tools for video editing, 3D modelling or sound engineering. A sub-optimal UI in such a tool is rather annoying when you'll be spending 7 hours a day in the software, with lots of mouse clicks and keystrokes.
Of course most tools tend to provide power users with keyboard shortcuts. This is where web interfaces suck big time, and why ChromeOS is dead in the water: it's not simple to provide keyboard shortcuts in a web application.
> Macs have ALWAYS had second and even third mouse buttons; it's just that they're (even more
> confusingly) located on the keyboard. CTRL-Click = right click, option-click = ...
They were actually on the mouse if you could be bothered to buy one of the three button mice that were on the market. I saved for a long time to buy my first Mac, and almost immediately purchased a three button mouse for it.
I spent a few years in the late 80s and early 90s working for one of the NASA labs; one project I had some peripheral involvement with was building a system for the US military using Macs running A/UX (and DecStation 3100s running Ultrix.) The biggest user complaint was the one button mouse. I edumacated them about three button mice, they bought them, problem solved.
"I wasn't aware of this, but I think an argument can rightly be made that the way a novice uses a computer isn't a good design guide for optimal use of a computer."
Quite. Having more than one button on a mouse isn't an indulgence; it makes some contextual operations a damn sight easier. Apple's single button not only makes some things more of a faff, it also makes it next to impossible to use some X Windows applications properly on OS X, because they require three buttons and Apple's two-button bodge really doesn't work very well, since you're never quite sure if you've pressed it correctly until something happens on screen. And we won't get into the lack of a scroll wheel. The Apple mouse could be used as the dictionary definition of style over practicality.
The truth of the matter is he liberated the Xerox engineers from their short sighted idiot upper management on the other side of the country by firstly recognising their genius (which the Xerox board never did) and then offering them a job at Apple. I wouldn't call that poaching them since their ideas were rotting in a lab and were never going to see the outside world via Xerox.
The fact that the Xerox board approved Jobs and Apple engineers visits in return for being given the privilege of purchasing Apple stock before their flotation tells you how short sighted they were.
Find an interview with Bob Metcalfe, Larry Tesler, Alan Kay or a multitude of others who worked at PARC and ask them what they feel about Apple and Steve Jobs, and you'll find a frank, not always flattering, but overwhelmingly grateful response to the man.
Excellent in depth story about the LISA and what she brought to the party in regards to "personal" computing
This was briefly covered in the documentary film titled "Welcome to Macintosh", where some of the original development team members were interviewed.
As for a LISA being worth almost US$25K, I wonder how many former owners rue junking their LISAs now?
"As for a LISA being worth almost US$25K, I wonder how many former owners rue junking their LISAs now?"
I junked mine about three years ago when it totally quit working. The motherboard had so much corrosion on the circuit traces that it would no longer even turn on.
I picked it up for $100 in about 1987 or 1988, from a used computer place that said it wasn't working. It turned out that a cable for the video tube had popped off; easy 5-minute fix. It was a Lisa 2/5, with the 5 MB hard drive, and served as my primary computer for about the next four years or so--I replaced the ROMs with Mac XL ROMs and found it ran Mac software quite nicely. (I was running System 6.0.8 at the time.)
Lovely machine. I was sorry when it finally failed for good.
One advantage of working in a science lab is that industrial quantities of pure alcohol for cleaning mouse rollers and balls were no problem. I still have a small vial of the stuff for loosening the ball on my current mouse, the one on top.
Meths will do just fine as well of course, though smellier.
You used to see Apple IIs with all sorts of cards hanging out (did they ever make a top cover for it?)
The Apple II was so successful because it was such an open system.
Then with the Apple III they closed the doors, so it wasn't any use to anyone who'd been using the II outside the office.
"As far as I recall the Lisa/Mac didn't offer an affordable upgrade or experimenter card slots."
True of the first Macs, not of the Lisa. The Lisa had a card cage next to the motherboard, with (if I recall correctly) three slots. On my machine, one of the slots was occupied, but I don't recall what was in there. (Parallel port, maybe?) I bought an aftermarket SCSI card for the second slot, and used it to connect SCSI devices when the parallel-port Profile hard drive--with its whopping 5 megabyte capacity--started to get a bit flakey. If I remember right (it's been quite a while), the computer couldn't boot from a SCSI drive but it could use them.
I'm not so sure Lisa is "the machine Apple would rather you didn't remember". From Jobs' biography, it seems to me it's more that Jobs deliberately spoiled it with the original Mac in a (somewhat typical) fit of pique when he was sidelined in the company.
Perhaps a better description would be the machine *Jobs* would rather you didn't remember.
To be fair to him though (flawed flaky genius and all that stuff), the Mac proved to be a better thought out, more commercial machine.
Certainly "troubled Cambridge micro-maker" Acorn tried in the early '80s. The first ARM-powered machine was supposed to be Lisa-like, but the researchers in the USA treated it as ongoing research rather than a product to be finished (obviously my bit was finished in time :-> ). That more or less caused the "troubled" epithet, and the hurried development of Arthur and RISC OS instead.
An invention that exists only on paper is of no good to anybody (except patent trolls). Look at how much tech has been invented in the UK, and then look at how successfully they have been turned into money to reward the inventors. That observation alone should tell you that people who aren't inventors are required to turn ideas into products and money. That was Jobs' role.
What's yours, AC?
Even obvious and clearly superior ideas need to be championed, sadly. If you live in the UK, look at the light switch on your wall: chances are it's an inch-long switch with sharp corners sitting in the middle of a 4" plate, and it requires a firm press. Nasty. Now look at the light switches commonly used on the continent: the switch is the same size as the plate, it has rounded corners, and it can be easily tapped to switch between on and off.
How exactly are rounded rectangles "an invention that exists only on paper", when the quote quite clearly says that they were "everywhere"? What do you think that line about Jobs taking the guy out for "an educational stroll" meant? He was pointing out actual rounded rectangles in everyday life, not just chatting to him. They were a commonplace, as Jobs himself admitted at the time, but years later he reversed his position and claimed to have invented them; that makes him a bare-faced liar.
People seem to think Jobs' biggest fault was the way he stole others' ideas.
That is the way technology evolves, and always has evolved. See a good idea and make it smaller / faster / cheaper / easier to use. That is what Jobs did, and I have no problem with that. In fact he should be lauded for it because he did it well.
The issue with Jobs is that he wanted to have his cake and eat it. He was happy to use other people's ideas, but threw his toys out of the pram when someone did the same with one of his products. There are recordings of him boasting about stealing Xerox' ideas, and then his famous "kill Google" rant.
I remember Bill Gates reacting to Jobs' accusation that he stole the Windows GUI from Apple. Gates used an analogy along the lines of 'Imagine you had a friend who stole a TV set from his neighbour... now you go to the same neighbour and steal his other TV set, but your friend says you stole it from him...'
"Spreadsheets would never be the same again and neither would the way we relate to computers. This world of icons, folders and office stationery remains to this day and likewise the impression that these places exist on the computer continues to shape our thinking as we engage daily in direct manipulation of computational tasks."
Until Microsoft decided to scrap that for a user interface that only works well on phones and tablets, and forced that user interface on everybody while ignoring people who legitimately do not like it. (The people who claim "if you don't like Windows 8 you must not have used it" are a side-effect; Microsoft is too square to have a blind following.)
Install a 3rd party replacement for the Start Menu. It ain't that difficult. But yeah, 'twas silly of MS to give people a reason to bash them, when it was so easily avoided. Still, they probably figured a lot of people are happy enough with Win7 and wouldn't upgrade anyway, so they thought they'd get a bit experimental with Win8.
Thinking positively, being able to choose from a few options for different parts of the Windows desktop environment might work out better for the end-user... you could choose from a selection of file browsers that are competing on quality, or are just better suited to the way you do things. Intermediate and advanced users already use 3rd party software to give shortcuts to deeply buried settings, and many OEMs impose their own interfaces for audio options and the like on their customers. Logitech's Windows software gives the user a clone of OS X's 'Mission Control: Show all Windows' feature, which I find handy...
[Now, on the other hand, the Ribbon interface was very poorly handled... there was no reason why it couldn't co-exist with normal menus for a version or two. And it ate up too many vertical pixels when people have too few to begin with... very silly, MS. What really took the piss was that rather than provide a plugin that reinstated menus, they directed you to an interactive "Where the bleedin' heck is that thing I'm looking for?" guide.]
All this talk of who copied whom is just silly. It is quite possible GUI systems existed even before Xerox PARC. The problem with these very early systems is that there simply wasn't enough computing power to make them successful. In those days DOS-like systems were viable and hence very successful. Graphical user interfaces must certainly have been discussed in university research papers, along with how such human-computer interaction would improve productivity. As processor technology improved over the years, much of what was dreamt about decades earlier was finally realized.
If somebody tomorrow manages to make a spaceship or a time-travel wormhole, would you say they copied Einstein? Any idiot can sit around and dream about goblins and futuristic technology. Getting this technology to work is what most people would call genius.
The patent system isn't designed to protect lamers who have nothing better to do than dream about technology they would like. E.g. I couldn't patent a wristwatch that projects an interactive computer desktop hologram into thin air. On the other hand, if I developed LED technology which allowed this, then I should get a patent for my LEDs. If somebody else manages to project a hologram with music speakers, then they should get a patent for that. The way certain governments are running their patent offices hampers technological advancement: intelligent people are not willing to develop the required technology because some lamer is waiting to take their cut. The system was devised to encourage technological advancement, but it seems to be working in reverse.
At the time, Cullinet (makers of IDMS, later absorbed into CA) was developing an integrated desktop application called Goldengate, which included Word/Excel/PPT equivalents (my memory is hazy on the latter).
One bit of this was the ability to upload/download mainframe database data from an IDMS facility known as the Information DataBase (some packaged IDMS facilities that provided a quasi-relational database).
Apple and Cullinet were in joint development, until Apple (as I hear it) pulled out of their side. Cullinet went on to develop the product for the PC side of things, where it pretty much underwhelmed, in spite of being arguably revolutionary.
I still have my square "Lisa/Cullinet: the Intelligent Link" button.
The team I worked with at Phillips Petroleum from '81-'85 used the Lisa as a standard office workstation. It was a revelation after my earlier experience with IBM AT technology and software. The integrated software was far ahead of its time. In '85 Apple came and demonstrated the Macintosh. When we found out that the software from the Lisa would never be migrated to the Mac, we laughed them out of the conference room. There was simply no way the initial Mac could ever compete with the Lisa. To this day, I have a hard time understanding how it ended up on the trash heap.
In the grand scheme of things, it was the prototype for the Mac. You develop all the features and work lots of the troubles out, then you go and make a "second system". While this "second system" really isn't related to the first one, it grows from and improves upon the original. Sure, it took a while to get the Mac up to the capability of the Lisa (around the Mac Plus), but things did work out.
My feeling is that it was a shame they dropped the 68k processor. For a given clock speed the 68k is far superior to the 80x86 chips. When they went to the PPC processor the development of the 68k kinda died. Had it continued, I suspect the 68k processors would have kept pace. Oh, well!
Yes, because only Motorola could discontinue their processor line, but the reason for them killing the line was that the customers for the high-powered desktop chips were moving away from Motorola. A product without buyers is of no use to anyone.
SGI had already gone RISC, Atari had moved back to videogame consoles (and never progressed past the 68030), Commodore's Amiga was hitting the end of the line (and also never shipped with anything beyond 68030 - the 040 and 060 chips came from the resurrected organisation).
Intel's phenomenal success in the late 1980s gave them so much more money to pour into deep pipelining and other tricks to make their architecture work for them, but Moto couldn't justify it. They had very few volume sales for the high-end CPUs, and at the same time, the embedded device customers were pulling them towards low-power operation, which is where the 68k ended up.
Motorola stopped developing high-performance 680x0s, killed their 88000 RISC line, and joined the AIM (Apple IBM Motorola) alliance in the early 1990s. The idea was to produce a modern, RISC-based desktop to mainframe processor architecture between them and reap the rewards of the larger economies of scale. Apple and IBM were to provide an OS (Taligent), Moto and IBM did the fabrication, and all three would make customer hardware.
It didn't work that way: Taligent died very slowly, IBM concentrated on Power servers and Motorola ended up making PowerPC chips for embedded devices, with a high-performance variant just for Apple, who were also dying on their feet. The G4 was the last Moto chip in a Mac, but while its embedded focus made Apple's laptops king of the heap for battery life, the desktop line was falling behind the competition, not just on the customer-facing marketing point of peak clock speed, but also on real performance. The IBM-sourced G5 was the last hurrah, but as a server part, it would never make it to a portable.
I learned assembly on a 68000 (Atari ST), and 68030 (Atari Falcon030), and always remember the chips fondly...
Not quite correct - you can still buy an m68000 from Motorola even now. The difference being that it's in CMOS and much faster. Motorola did stop development after the 68040, but only because better architectures and processes were being developed and it had run its course.
As for the graphics, Evans & Sutherland really wrote the book on early computer graphics, including a line-clipping algorithm in this US patent from 1972:
and a complete computer graphics system running on a dec system 10, from 1969:
It may be fashionable to think that Apple invented all this stuff, but they were merely building on much earlier work, done in the days when computers were barely powerful enough to run any kind of graphics system...
It may be fashionable (and justifiable) to think Apple are nothing but a hollow marketing operation, but that doesn't mean that the company never invented, or never pushed technology forward.
The QuickDraw region-clipping algorithm *was* new - other graphics systems allowed you to specify a single polygon if you were lucky, but usually only a rectangle. For irregular overlaps, this meant dividing your L-shaped exposed area into two rectangles, and then repeating your draw operation for each one. With an expensive draw operation, that meant wasted computation, or more complex code (preCompute(); for each rect in clip list: clip(); draw(); )
The QuickDraw system, unlike these, allowed arbitrary-shaped clipping regions. You could open a "region" handle, draw into that region with any of the QD primitives to define its shape, and then finalise it. Once you defined your clipping area using that region, the clipping was done at the bit level, not geometrically (it's actually bit masking, but with some optimisations to a. pack the mask bit structure efficiently, and b. never execute fully-obscured draw commands).
The famous Evans & Sutherland patent is for geometric clipping on vector displays. The same techniques can be adapted for bitmaps, but the bitmapped nature of the display allowed much more sophisticated clipping to be performed. This is what Apple, or rather Bill Atkinson, did, and it's why he was awarded the patent.
(more history on this: http://www.folklore.org/StoryView.py?project=Macintosh&story=I_Still_Remember_Regions.txt )
While Atkinson's work was important, it was hardly unprecedented. The Sutherland-Hodgman clipping algorithm from 1974 can clip to arbitrary polygons. Wolfgang Straßer and Edwin Catmull had independently described z-buffering in '74 and '75 respectively, and that's a perfectly good method for doing raster clipping (just treat the windows as planes parallel to the viewport). Weiler-Atherton, published in 1977, can clip to an arbitrary window.
The bitmap-mask method described in Atkinson's patent may well have been novel at the time - I don't know of any prior or independent invention of it - but it falls out pretty naturally from BitBLT with the appropriate raster op, and much of that came from PARC.
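The difference the posts above are arguing about can be sketched in a few lines. This is a toy illustration only, not Atkinson's actual region format (which, per the folklore.org account, packed the mask cleverly rather than storing a full boolean grid): with a per-pixel mask, an L-shaped clip area that geometric clipping would split into two rectangles is handled uniformly.

```python
# Toy sketch of bitmap-mask clipping in the spirit of QuickDraw regions.
# Hypothetical simplification: a region is stored as a full 2D boolean mask,
# whereas real QuickDraw regions used a compact packed representation.

def make_region(width, height, shape_fn):
    """Build a region as a 2D boolean mask from a predicate over (x, y)."""
    return [[shape_fn(x, y) for x in range(width)] for y in range(height)]

def draw_clipped(framebuffer, region, x, y, value):
    """Write a pixel only where the clipping region allows it."""
    if 0 <= y < len(region) and 0 <= x < len(region[0]) and region[y][x]:
        framebuffer[y][x] = value

# An L-shaped clip area: everything except the top-right 4x4 corner.
# Geometric clipping would need two rectangles; one mask covers it.
W, H = 8, 8
l_shape = make_region(W, H, lambda x, y: x < 4 or y >= 4)
fb = [[0] * W for _ in range(H)]
for yy in range(H):
    for xx in range(W):
        draw_clipped(fb, l_shape, xx, yy, 1)  # fill, clipped by the region
```

The pay-off is that the draw loop never cares what shape the clip area is; arbitrarily irregular overlapping-window shapes cost the same code path as a plain rectangle.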
My career started in April 1982 IIRC.
Sometime after that, my cycling mate and flatmate Philip got a job at Apple and started his career.
We had an Apple III at home with plans to write business software for it, but it was clear early that the III was not going mainstream, and we were busy trying to eke out an existence. And "existence" is the right word. A small 2 bedroom flat in the western suburbs, 2 mattresses, 2 chairs and a table and an old B&W TV we sponged from a mate when he moved on. We also had the ultimate in utility furniture - the floor. Life was minimal in those days. Our bikes were the only valuable things we owned - and many a hard weekend ride was had to Waterfall and through the National Park, where I was mercilessly hammered.
Then the Lisa got released in Australia and Phil was the demo guy. We had the Lisa in the flat to play with for a while as well. I didn't understand at the time how revolutionary it was, or the impact its successors would have on my life. It seemed like a toy (which it was), but its legacy is with us today, like it or not.
I bought my first car during this period, a not too shabby Mk. II E-Type Jaguar, about a year after the Lisa was available (since I had started getting paid and saved a few pennies). I got it from some guy, a structural engineer, who needed the money to buy a Lisa! I believe some engineering stress analysis software had become available that was useful and cool for his engineering work. I hope it helped him make money. I, on the other hand, put a shitload of miles on the Jag!
So, this article reminds me that I have been in the business for a long time. It reminds me that I was once clueless, and now I am not. It reminds me that I was once penniless and now I am not. It reminds me of old friends who were there at the beginning as well.
Cheers for the Lisa, it was there right at the beginning for me ...
Another victim of the Wikipedia vocabulary Nazis.
I've been following the emergence of this term (DE-9) on the web for years. Originally, it was only a few people with a sad desire to feel superior to everyone else. Gradually it has become more common, and now (if you search part suppliers) you see that even many manufacturers have adopted the new nomenclature.
However, since this was a historical article, it seemed anachronistic to see it used here.
Apple didn't invent the graphical user interface. The revolutionary computer scientist Douglas Engelbart developed many of the concepts we take for granted today. It was his research papers back in the 1960s that detailed many of the concepts on our desktops today. He even produced a prototype in 1968 demonstrating some of his ideas. Apple would like everybody to believe they developed the desktop, or that they paid Xerox for the technology. It wasn't even Xerox who invented the GUI; they merely implemented Engelbart's ideas in full, which is one reason why they did not apply for a patent. It's remarkable how the whole world thinks Apple invented the computer desktop. Any person who has attended a decent university and studied human-computer interaction would know that this is not the case.
Let's face it, the Apple Lisa was not a success, and Microsoft Windows wiped the floor with the Mac. Jobs stuck the signposts up, though; he just struggled with a market that only ever purchased IBM kit. The computer business was very conservative back in the day. When the market was ready, it was Bill Gates who had the genius to exploit it with Windows (and mug IBM to boot). A similar thing is happening with the iPhone now: Apple busts down the doors with the right product, one which their competitors can easily copy, then watches the competitors make off while they're arguing with the judge.
It was obvious which direction the smart-phone market was headed: Symbian Series 60, then UIQ, going on to Linux handsets. Mobile browsers went from WAP to Opera Mobile and then full-function browsers. Apple realized this and developed one of the most expensive phones on the market, selling it at a loss by giving large discounts. In addition, they heavily marketed their products, trying to convince the world they had invented the smart-phone.
Apple's competitors did not rip off Apple. The smart-phone is nothing more than a phone plus a computer. Apple's competitors waited for the right time, until the fast, expensive technology came down to a price the consumer was willing to pay. In fact Apple copied the user interfaces of others, then claimed they were the first to do it on a smart-phone and hence should be granted a monopoly. Apple seems to think that if they do something obvious first, even if it's uneconomical, then they should be granted a monopoly.
The idiots at the patent office help them in their lunacy by granting me-too obvious patents. E.g. phone text messages went from text to picture messaging; on that logic, a patent would be granted to somebody who saw this and added picture-messaging features to their blogging software, which was text-only before.
Probably Channel 4, about a kidnap, possibly in Ireland. The businessman having to come up with the ransom was clearly targeted because he could afford dozens of them...
Oh, and bonus question, when will Bill Atkinson receive the widespread fame he deserves?
I worked on developing some of the first accounting software for the UK market in conjunction with the UK Launch of the Lisa.
It is easy to dismiss the Lisa as a brief blip on the Apple calendar but at the time it was seen as completely revolutionary.
We worked on the Apple stand at the Which? Computer Show in Birmingham for three days demonstrating our software. For the entire time, there was a massive queue of people wanting to see the new system in action. After three days of delivering demos I literally couldn't talk (unbelievable, I know).
The Lisa was an important evolution in the personal computer market. Some features we have come to take for granted were first introduced by the Lisa project (with obvious credit going to work also carried out at Xerox).
The delays in the project did indeed allow IBM / Microsoft to gain a dominant position in the marketplace. Here we are 30 years later and the technology landscape has changed beyond even the wildest imaginings of all involved in the industry back then. I remember having conversations about the 3.5-inch floppy disk and asking "whatever next?".
Happy birthday Lisa
Biting the hand that feeds IT © 1998–2020