Gold
Pure gold, thanks!
One of the longest-lived GUI operating systems in the world began life as an emergency project – specifically, the means by which Acorn planned to rescue the original Archimedes operating system. This is according to the original Acorn Arthur project lead, Paul Fellows, who spoke about the creation of RISC OS at the RISC …
Yes, true. I played a lot of "Lander" myself.
It was, of course, only a cut-down demo of a full game called _Zarch_.
https://www.sockmonsters.com/TheMakingOfZarch.html
The head-to-head comparison of _Zarch_ (Archie) vs _Virus_ (Atari ST) here is excellent:
https://bytecellar.com/2010/12/28/virus-vs-zarch-a-look-at-two-braben-classics/
...of enabling the little-known text console by pressing - I think it was F12 - and then adjusting the overscan on the old CRT monitors, so that the single line of black-on-white text that had just appeared at the bottom of the desktop, under the icon bar, was off the bottom of the screen.
Watching frustrated Computing teachers trying to work out why the machine wasn't responding to keyboard input at all because it was all silently going into that console was hilarious... well, it was if you were 12 anyway.
Me again, after posting yesterday about 1st Word+ on the Archimedes.
ARX was a highly buzzword-compliant project from the Acorn Research Center (ARC) in Palo Alto – neighbor to the famed Xerox PARC, where the graphical user interface as we know it today was pioneered. The design was ambitiously Unix-like.
UNIX did come to the Archimedes
On the 8th March, 1988, still working for GST in Cambridge, I worked on a "UNIX Kernel Validation Suite" to test the port of BSD 4.2 (and shortly afterwards, 4.3) to the Archimedes.
Some history here: Chris's Acorns: RISC iX
It exercised all the section (2) system calls with expected arguments to check correct functionality, and with invalid arguments to check error returns.
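The pattern is simple to sketch, if anyone is curious. This is not the original suite (which was never published, as far as I know) - just a minimal modern illustration, where the check() helper and the choice of read(2) as the call under test are mine:

```c
/* Minimal sketch of a syscall validation test: call a section (2)
 * entry point with valid arguments and check the result, then with
 * deliberately invalid arguments and check that it fails with the
 * documented errno. Illustrative only - not the original GST suite. */
#include <errno.h>
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

static int failures = 0;

/* Hypothetical helper: record and report a failed expectation. */
static void check(int ok, const char *what)
{
    if (!ok) {
        fprintf(stderr, "FAIL: %s (errno=%d: %s)\n",
                what, errno, strerror(errno));
        failures++;
    }
}

int main(void)
{
    char buf[16];

    /* Expected arguments: read(2) from /dev/null returns 0 (EOF). */
    int fd = open("/dev/null", O_RDONLY);
    check(fd >= 0, "open(\"/dev/null\") succeeds");
    check(read(fd, buf, sizeof buf) == 0, "read at EOF returns 0");
    check(close(fd) == 0, "close succeeds");

    /* Invalid arguments: a bad descriptor must give -1 and EBADF. */
    errno = 0;
    check(read(-1, buf, sizeof buf) == -1 && errno == EBADF,
          "read(-1, ...) fails with EBADF");

    return failures ? 1 : 0;
}
```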
I've got lots of scribbles in my daybook on the design and implementation; paper is definitely more persistent than bits on Winchesters. I wonder whether a historian might be interested in some of my jottings?
> UNIX did come to the Archimedes
> On the 8th March, 1988, still working for GST in Cambridge, I worked on a "UNIX Kernel Validation Suite" to test the port of BSD 4.2 (and shortly afterwards, 4.3) to the Archimedes
And some time before that (because I left Acornsoft in 1987), I (and Laurence Albinson) ported SysV to the ARM (but not the Archimedes, I forget what actual model it was).
Hence my claim to (minor) fame of having written the first two C compilers for the ARM (the first was a derivative of the _A Book on C_ compiler, the second was a back end for pcc).
I was working the other side of a partition from Paul and team as they developed Arthur, and it was a very interesting experience to observe how it all came together.
The two things that RISC iX needed that the A310 did not have were a hard disk and more memory. I think it had all of the rest of the required facilities (I don't think that the Ethernet was essential).
These were things that most UNIX systems needed, although the amount of memory required would have been more on a RISC processor than on a CISC. At this time, MC680X0-based UNIX workstations would probably have had 1-2MB.
It would not surprise me if, using a SCSI podule and disk and after-market memory expansion boards (the A310 had space for two podules, I think, as long as the riser was fitted, so Ethernet could have been provided as well), it could have been made to run, but it's unlikely anybody would try in this day and age.
The memory on the A310 could be upgraded to 4MB by replacing the DRAM chips on the motherboard - a bit fiddly, since they were soldered in rather than socketed.
Mine had the memory upgrade, a SCSI podule and hard drive, and the I/O podule which replicated the BBC micro ports. I am fairly certain that it would have run RISC iX. If memory serves, the hard-drive-equipped A400s had ST506 controllers; these *may* not have been compatible.
I don't know if it's the same compiler but I was writing raw ARM code in the 1980s to drive an Archimedes (and before that came out, an ARM dev board that hung off a BBC micro). I pretty soon found that for the non-critical parts the compiled code was just as good as my handwritten assembler, sometimes better - so I started writing the system in C with just the critical parts hand done and linked in. At the time compiled code generally was nowhere near as efficient as hand tuned code, so between the nice ARM RISC instructions and the compiler I was using somebody did a great job. Well done if it was you!
I worked on that project, too (Hi, Alan).
I also got shipped off to work on the Acorn side of the port. GST somehow managed to convince Acorn that I was some kind of Unix guru and I ended up being given an Archimedes and the job of porting adb. Porting a debugger which I'd never used, for an OS with which I was unfamiliar, to a machine whose instruction set I'd never seen…was quite a challenge. But it worked!
But you could though.
Apple patented their menu bar at the top of the screen*, which is why nobody else could do that.
* locating objects on the side of the screen essentially gives them an infinite size in one dimension**, making them faster to access with the mouse. In the case of Apple's menu bar, no matter how fast you throw the pointer at the top of the screen, you'll always hit it.
** locating something in the corner of the screen makes it infinitely sized in two dimensions.
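The usual way to quantify this is Fitts's law. A minimal sketch in the Shannon formulation (a and b are empirical constants for the device and user - they're not from the post above):

```latex
% Fitts's law (Shannon formulation): time T to acquire a target of
% width W (measured along the axis of motion) at distance D.
T = a + b \log_2\!\left(\frac{D}{W} + 1\right)
% A screen edge stops the pointer, so an edge target is effectively
% unbounded along the throw axis:
\lim_{W \to \infty} \left[ a + b \log_2\!\left(\frac{D}{W} + 1\right) \right] = a
% Movement time collapses to the fixed constant a, which is the
% "infinite size" argument in the footnotes above.
```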
AIUI it was more detailed and specific than that.
Apple had *pull-down* menus at the top of the screen.
GEM used drop-down menus, so escaped prosecution. The difference: you move the mouse over the menu title, and it appears, _without clicking the button_. That is different enough.
AmigaOS used pull-down menus, *but* you had to *right-click* for them to appear. That again was different enough and did not get them sued.
IBM/MS CUA made them drop-down, but inside the window: again, different enough, but a small and thus difficult target to hit, which a billion Windows habitués fail to notice.
So it is not just the location, but also the appearance _and_ the activation method.
The real tragedy is that anyone was convinced that software patents were a good idea.
Being able to protect your ideas is a good thing.
However, software patents should be a lot shorter lived than traditional patents - five years maximum, if that. Software moves too fast for anything longer.
Additionally, the requirements for software patents should be a lot stricter. There are a lot of software patents that simply shouldn't have been allowed in the first place.
It wasn't created as a convenient place to drop minimised windows. The icons on the icon bar are the actual utilities and applications, which can be interacted with (open new windows, open menus, do stuff). On RISC OS there is no correlation at all between "minimised" windows and whatever is on the icon bar (generally windows cannot be minimised in the same manner, though there is an option to hide a window by pinning it to the backdrop), so in this respect the RISC OS behaviour is more akin to the system tray part of Windows 95.
The best bit of this arrangement is that it avoids the highly annoying behaviour found on "other systems" whereby an application that is started insists upon opening a big window on the screen, and when all of its windows are closed the application terminates.
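For the curious, this "the icon is the application" model shows up right at the start of a program's life. A rough sketch from memory of the RISC OS PRM, for the Norcroft C / SharedCLibrary environment - the SWI numbers, block layout, and the "application" sprite name are as I recall them, so check the PRM before relying on any of it:

```c
/* Sketch: a RISC OS task puts itself on the icon bar at startup,
 * before any window exists; the icon (and the task) then persist
 * whether or not windows are open. From memory of the PRM -
 * illustrative, not guaranteed. */
#include "kernel.h"   /* Norcroft / SharedCLibrary SWI interface */
#include <string.h>

#define Wimp_Initialise 0x400C0
#define Wimp_CreateIcon 0x400C2

int main(void)
{
    _kernel_swi_regs r;

    /* Register with the Wimp as a task (version 2.00 style;
       later Wimp versions also take a message list in R3). */
    r.r[0] = 200;
    r.r[1] = 0x4B534154;        /* the magic word "TASK" */
    r.r[2] = (int)"BarDemo";    /* hypothetical task name */
    _kernel_swi(Wimp_Initialise, &r, &r);

    /* Icon block: window handle -1 = right-hand end of icon bar. */
    int block[9];
    block[0] = -1;                    /* icon bar, right side    */
    block[1] = 0;  block[2] = 0;      /* bounding box min x, y   */
    block[3] = 68; block[4] = 68;     /* bounding box max x, y   */
    block[5] = 0x2;                   /* flags: sprite-only icon */
    memset(&block[6], 0, 12);         /* 12-byte icon data field */
    strncpy((char *)&block[6], "application", 12); /* sprite name */

    r.r[0] = 0;
    r.r[1] = (int)block;
    _kernel_swi(Wimp_CreateIcon, &r, &r);

    /* A real application would now enter its Wimp_Poll loop;
       windows come and go, but the icon stays until the task quits. */
    return 0;
}
```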
Well, to directly address the question:
> Doesn't Windows 1, from 1985, have a taskbar?
No. No, it doesn't.
In Windows 1/2, and in OS/2 1.x, when you minimise a program window, it becomes an icon on the desktop. Icons line up in horizontal rows, starting at bottom left.
(I *think* OS/2 1.x introduced a special "list of minimised apps" picker. OS/2 2.x, oddly, had a folder where minimised apps appeared. IIRC -- it's a long time ago!)
But they aren't in a special area. They are just on the desktop, and in those versions, they were the only things allowed on the desktop.
Why? Because Apple had patents on drive icons, folders and a wastebin on the desktop, and sued DR over PC GEM and won. So everyone else _in the PC space_ was too scared to put drive icons on the desktop, and they hadn't thought of what else to put there yet except minimised windows.
(I say "in the PC space" because Apple _didn't_ sue Atari over ST GEM, or Commodore over AmigaOS, both of which kept drive icons on the desktop.)
It's not a bar. It's not a special dedicated region. If you manage to open enough windows to minimise without crashing it, you can have multiple rows. In Windows 2, they reintroduced overlapping windows, because everyone worked out Apple didn't have primacy in that: there was clear prior art. Then, you could pick up and move the desktop minimised-window icons.
But Windows 1 tiled its windows because MS was afraid of upsetting Apple, and the tiling algorithm avoided covering the bottom of the desktop so you could switch back to minimised apps.
Right now, I have 2 overlapping Waterfox windows, but I've arranged them so that I can see and get at the icons on my desktop. I do this on all my OSes. Win1 just did it automatically.
An area the window manager *avoids* covering is not a bar. It wasn't an app switcher or launcher: non-minimised windows didn't appear there. It had no delimiters. It contained no menus or launchers, or any kind of status info, not even a clock.
No, I don't think this is any kind of taskbar, nor did MS or Apple or anyone else, AFAICR.
To make a real-world comparison, it's like you went to a hotel and the owners said "oh, we have a football pitch." When you asked where, they pointed at a large lawn. "There's room there for football. We've reserved it and we won't cover it with tables and things."
That is not the same as a football pitch. Even if it's the same size and shape, it's not a pitch unless it's marked out as a pitch, has goalposts, etc. And even if you could play a game on it, it will be tricky and won't work right with no touchline etc.
> But they aren't in a special area. They are just on the desktop, and in those versions, they were the only things allowed on the desktop.
I think that's a slender distinction: they're not in a special area, but they are in an area that only they are allowed to be in.
That said:
> It's not a bar.
There it is: I lose. You can't invent the taskbar with something that isn't a bar. End of story.
Well, you can't put anything else there on Windows 1 because the windows aren't resizable.
But the minimised icons were pretty much the same on Windows 2, and then you could rearrange them as you wished...
I used to put them down the right hand side, starting from the top, for a vaguely Mac-like effect.
I think I recall addons that let you choose the sort order and arrangement. Not much use on Windows 3.x because the resource heaps ran out so easily that you didn't want to run anything non-essential, but quite handy on NT 3.x.
Back then, I had just about persuaded my wife to let me replace the family Acorn Electron (mine, but the missus insisted I share it with her and the kids, else I'd get plenty of flak for wasting money)! My sights were set on the Archimedes. Businesses around me were getting into PCs and I got a DEC Rainbow in my new job. Initially, the Arc was going to have a hardware PC plug-in (my memory is vague at this point, but it was probably going to be an 8086 card). The Rainbow was fitted with an 8086 (or 8088, I don't recall which) and a Z80 - one used for MS-DOS, the other for CP/M - so the concept made sense at the time. However, the hardware support was dropped in favour of a software emulator, justifiable because the Arc was so much faster than the PC hardware.
What stopped me getting the Arc was the availability of MS-DOS software I found in my new job - I was even given a copy of Autoroute (though who would ever want to replace a decent map with software...). I realised I'd spend most of my time on an Arc running MS-DOS, so I ended up with an Amstrad 640 - and got embedded in the MS-DOS, then Windows, environment (at least, until 10 years ago when I switched to Mac - albeit with Windows and Ubuntu VMs ready to hand).
Almost full circle!
On the subject of patents, I've always felt patents should be limited to tangibles, not ideas. If you had a new idea, you had the ability to be the first to use it and, if you had something useful/saleable, you should have protection against copying (for a while). The application should be patentable, not the idea itself. Some software patents work like that, but there are far too many granted that haven't been properly tested. Imagine if Eugene Wesley Roddenberry Sr. had patented the idea of a computer you could hold in your hand...
I owned an Archimedes 310, until it got zapped in an attempt to expand its RAM.
I loved this fanless, quiet, powerful (for the time) machine.
A couple of years ago, I stumbled upon this lecture:
https://media.ccc.de/v/36c3-10703-the_ultimate_acorn_archimedes_talk
about (mostly) the hardware design of the machine and I enjoyed it very much.
I hope you will, too.
That is a really good talk. I knew some of the stuff, but the background behind some of the design decisions was very illuminating.
Makes me want to dig out my A3010, but it's under a lot of stuff, and preserving my BBC micro discs before they shed all their oxide is probably higher up the priority list.
Thanks.