Gold
Pure gold, thanks!
One of the longest-lived GUI operating systems in the world has its origins as an emergency project – specifically the means by which Acorn planned to rescue the original Archimedes operating system. This is according to the original Acorn Arthur project lead, Paul Fellows, who spoke about the creation of RISC OS at the RISC …
Yes, true. I played a lot of "Lander" myself.
It was, of course, only a cut-down demo of a full game called Zarch.
https://www.sockmonsters.com/TheMakingOfZarch.html
The head-to-head comparison of _Zarch_ (Archie) vs _Virus_ (Atari ST) here is excellent:
https://bytecellar.com/2010/12/28/virus-vs-zarch-a-look-at-two-braben-classics/
...of enabling the little-known text console by pressing - I think it was F12 - and then adjusting the overscan on the old CRT monitors so that the single line of black-on-white text that had just appeared at the bottom of the desktop, under the icon bar, was off the bottom of the screen.
Watching frustrated Computing teachers trying to work out why the machine wasn't responding to keyboard input at all because it was all silently going into that console was hilarious... well, it was if you were 12 anyway.
Me again, after posting yesterday about 1st Word+ on the Archimedes.
ARX was a highly buzzword-compliant project from the Acorn Research Center (ARC) in Palo Alto – neighbor to the famed Xerox PARC, where the graphical user interface as we know it today was pioneered. The design was ambitiously Unix-like.
UNIX did come to the Archimedes
On the 8th March, 1988, still working for GST in Cambridge, I worked on a "UNIX Kernel Validation Suite" to test the port of BSD 4.2 (and shortly afterwards, 4.3) to the Archimedes
Some history here: Chris's Acorns: RISC iX
It exercised all the section 2 system calls with valid arguments to check correct functionality, and with invalid arguments to check error returns.
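The technique described - drive each call with valid arguments to confirm behaviour, then with invalid ones to confirm the documented errno comes back - can be sketched in modern terms. This is purely illustrative (Python rather than whatever the original suite was written in, and `check_syscall` is an invented helper):

```python
import errno
import os

def check_syscall(fn, args, expect_errno=None):
    """Run a system-call wrapper; report whether it succeeded,
    or failed with exactly the expected errno."""
    try:
        fn(*args)
        return expect_errno is None
    except OSError as e:
        return e.errno == expect_errno

# Valid arguments: reading zero bytes from a real descriptor succeeds.
r, w = os.pipe()
assert check_syscall(os.read, (r, 0))
# Invalid arguments: a bogus descriptor must fail with EBADF.
assert check_syscall(os.read, (10**6, 1), errno.EBADF)
os.close(r)
os.close(w)
```

A real validation suite would table-drive this over every call in section 2, but the pass/fail criterion is the same pair of checks.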
I've got lots of scribbles in my daybook on the design and implementation; paper is definitely more persistent than bits on Winchesters. I wonder whether a historian might be interested in some of my jottings?
> UNIX did come to the Archimedes
> On the 8th March, 1988, still working for GST in Cambridge, I worked on a "UNIX Kernel Validation Suite" to test the port of BSD 4.2 (and shortly afterwards, 4.3) to the Archimedes
And some time before that (because I left Acornsoft in 1987), I (and Laurence Albinson) ported SysV to the ARM (but not the Archimedes, I forget what actual model it was).
Hence my claim to (minor) fame of having written the first two C compilers for the ARM (the first was a derivative of the A Book on C compiler, the second was a back end for pcc)
I was working on the other side of a partition from Paul and his team as they developed Arthur, and it was a very interesting experience to observe how it all came together.
The two things RISC iX needed that the A310 did not have were a hard disk and more memory. I think it had all of the rest of the required facilities (I don't think that the Ethernet was essential).
These were things that most UNIX systems needed, although the amount of memory required would have been more on a RISC processor than a CISC. At this time, MC680X0 based UNIX workstations would probably have had 1-2MB.
It would not surprise me if, with a SCSI podule and disk plus after-market memory expansion boards (the A310 had space for two podules, I think, as long as the riser was fitted, so Ethernet could have been provided as well), it could have been made to run - but it's unlikely anybody would try in this day and age.
The memory on the A310 could be upgraded to 4MB by replacing the DRAM chips on the motherboard - a bit fiddly since they were soldered in rather than fitted in sockets.
Mine had the memory upgrade, a SCSI podule and hard drive, and the I/O podule which replicated the BBC micro ports. I am fairly certain that it would have run RISCiX. If memory serves the hard drive equipped A400s had ST506 controllers, these *may* not have been compatible.
I don't know if it's the same compiler but I was writing raw ARM code in the 1980s to drive an Archimedes (and before that came out, an ARM dev board that hung off a BBC micro). I pretty soon found that for the non-critical parts the compiled code was just as good as my handwritten assembler, sometimes better - so I started writing the system in C with just the critical parts hand done and linked in. At the time compiled code generally was nowhere near as efficient as hand tuned code, so between the nice ARM RISC instructions and the compiler I was using somebody did a great job. Well done if it was you!
I worked on that project, too (Hi, Alan).
I also got shipped off to work on the Acorn side of the port. GST somehow managed to convince Acorn that I was some kind of Unix guru and I ended up being given an Archimedes and the job of porting adb. Porting a debugger which I'd never used, for an OS with which I was unfamiliar, to a machine whose instruction set I'd never seen…was quite a challenge. But it worked!
But you could though.
Apple patented their menu bar at the top of the screen*, which is why nobody else could do that.
* locating objects on the side of the screen essentially gives them an infinite size in one dimension** making them faster to access with the mouse. In the case of Apple's menu bar, no matter how fast you throw the pointer at the top of the screen, you'll always hit it.
** locating something in the corner of the screen makes it infinitely sized in two dimensions.
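What these footnotes describe is Fitts's law (not named in the thread): predicted pointing time grows with the log of distance over target width, so a target the pointer physically cannot overshoot behaves as if it were enormously deep. A rough sketch, with made-up constants `a` and `b`:

```python
import math

def fitts_time(distance, width, a=0.1, b=0.15):
    """Fitts's law: predicted pointing time, in arbitrary units.
    The a/b coefficients here are illustrative, not measured."""
    return a + b * math.log2(distance / width + 1)

# A 20 px menu bar floating mid-screen vs. one pinned to the screen edge,
# where the pointer stops dead and the effective target depth is huge.
floating = fitts_time(600, 20)
edge = fitts_time(600, 10_000)
assert edge < floating
```

As the effective width grows, the log term tends to zero - which is the "throw the pointer at the top of the screen and you'll always hit it" effect.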
AIUI it was more detailed and specific than that.
Apple had *pull-down* menus at the top of the screen.
GEM used drop-down menus, so escaped prosecution. The difference: you move the mouse over the menu title, and it appears, _without clicking the button_. That is different enough.
AmigaOS used pull-down menus, *but* you had to *right-click* for them to appear. That again was different enough and did not get them sued.
IBM/MS CUA made them drop-down, but inside the window: again, different enough, but a small and thus difficult target to hit, which a billion Windows habitués fail to notice.
So it is not just the location, but also the appearance _and_ the activation method.
The real tragedy is that anyone was convinced that software patents were a good idea.
Being able to protect your ideas is a good thing.
However, software patents should be a lot shorter lived than traditional patents - five years maximum, if that. Software moves too fast for anything longer.
Additionally, the requirements for software patents should be a lot stricter. There are a lot of software patents that simply shouldn't have been allowed in the first place.
It wasn't created as a convenient place to drop minimised windows. The icons on the icon bar are the actual utilities and applications, which can be interacted with (open new windows, open menus, do stuff). On RISC OS there is no correlation at all between "minimised" windows and whatever is on the icon bar (generally windows cannot be minimised in the same manner, though there is an option to hide a window by pinning it to the backdrop), so in this respect the RISC OS behaviour is more akin to the system tray part of Windows 95.
The best bit of this arrangement is that it avoids the highly annoying behaviour found on "other systems" whereby an application that is started insists upon opening a big window on the screen; and when all of the windows are closed the application terminates.
Well, to directly address the question:
> Doesn't Windows 1, from 1985, have a taskbar?
No. No, it doesn't.
In Windows 1/2, and in OS/2 1.x, when you minimise a program window, it becomes an icon on the desktop. Icons line up in horizontal rows, starting at bottom left.
(I *think* OS/2 1.x introduced a special "list of minimised apps" picker. OS/2 2.x, oddly, had a folder where minimised apps appeared. IIRC -- it's a long time ago!)
But they aren't in a special area. They are just on the desktop, and in those versions, they were the only things allowed on the desktop.
Why? Because Apple had patents on drive icons, folders and a wastebin on the desktop, and sued DR over PC GEM and won. So everyone else _in the PC space_ was too scared to put drive icons on the desktop, and they hadn't thought of what else to put there yet except minimised windows.
(I say "in the PC space" because Apple _didn't_ sue Atari over ST GEM, or Commodore over AmigaOS, both of which kept drive icons on the desktop.)
It's not a bar. It's not a special dedicated region. If you manage to open enough windows to minimise without crashing it, you can have multiple rows. In Windows 2, they reintroduced overlapping windows because everyone worked out Apple didn't have primacy in that: there was clear prior art. Then, you could pick up and move the desktop minimised-window icons.
But Windows 1 tiled its windows because MS was afraid of upsetting Apple, and the tiling algorithm avoided covering the bottom of the desktop so you could switch back to minimised apps.
Right now, I have 2 overlapping Waterfox windows, but I've arranged them so that I can see and get at the icons on my desktop. I do this on all my OSes. Win1 just did it automatically.
An area the window manager *avoids* covering is not a bar. It wasn't an app switcher or launcher: non-minimised windows didn't appear. It had no delimiters. It contained no menus or launchers, or any kind of status info, not even a clock.
No, I don't think this is any kind of taskbar, nor did MS or Apple or anyone else, AFAICR.
To make a real-world comparison, it's like you went to a hotel and the owners said "oh, we have a football pitch." When you asked where, they pointed at a large lawn. "There's room there for football. We've reserved it and we won't cover it with tables and things."
That is not the same as a football pitch. Even if it's the same size and shape, it's not a pitch unless it's marked out as a pitch, has goalposts, etc. And even if you could play a game on it, it will be tricky and won't work right with no touchline etc.
> But they aren't in a special area. They are just on the desktop, and in those versions, they were the only things allowed on the desktop.
I think that's a slender distinction: they're not in a special area, but they are in an area that only they are allowed to be in.
That said:
> It's not a bar.
There it is: I lose. You can't invent the taskbar with something that isn't a bar. End of story.
Well, you can't put anything else there on Windows 1 because the windows aren't resizable.
But the minimised icons were pretty much the same on Windows 2, and then you could rearrange them as you wished...
I used to put them down the right hand side, starting from the top, for a vaguely Mac-like effect.
I think I recall addons that let you choose the sort order and arrangement. Not much use on Windows 3.x because the resource heaps ran out so easily that you didn't want to run anything non-essential, but quite handy on NT 3.x.
Back then, I had just about persuaded my wife to let me replace the family Acorn Electron (mine, but the missus insisted I share it with her and the kids, else I'd get plenty of flak for wasting money)! My sights were set on the Archimedes. Businesses around me were getting into PCs and I got a DEC Rainbow in my new job. Initially, the Arc was going to have a hardware PC plug-in (my memory is vague at this point, but it was probably going to be an 8086 card). The Rainbow was fitted with an 8086 (or 8088, I don't recall which) and a Z80 - one used for MS-DOS, the other for CP/M - so the concept made sense at the time. However, the hardware support was dropped in favour of a software emulator, justifiable because the Arc was so much faster than the PC hardware.
What stopped me getting the Arc was the availability of MS-DOS software I found in my new job - I was even given a copy of Autoroute (though who would ever want to replace a decent map with software...). I realised I'd spend most of my time on an Arc running MS-DOS, so I ended up with an Amstrad 640 - and getting embedded into the MS-DOS, then Windows, environment (at least, until 10 years ago when I switched to Mac - albeit with Windows and Ubuntu VMs ready to hand).
Almost full circle!
On the subject of patents, I've always felt patents should be limited to tangibles, not ideas. If you had a new idea, you had the ability to be the first to use it and, if you had something useful/saleable, you should have protection against copying (for a while). The application should be patentable, not the idea itself. Some software patents work like that, but there are far too many granted that haven't been properly tested. Imagine if Eugene Wesley Roddenberry Sr. had patented the idea of a computer you could hold in your hand...
I owned an Archimedes 310, until it got zapped in an attempt to expand its RAM.
I loved this fanless, quiet, powerful (for the time) machine.
A couple of years ago, I stumbled upon this lecture:
https://media.ccc.de/v/36c3-10703-the_ultimate_acorn_archimedes_talk
about (mostly) the hardware design of the machine and I enjoyed it very much.
I hope you will, too.
RISC OS, the operating system of the original Arm computer, the Acorn Archimedes, is still very much alive – and doing relatively well for its age.
In June 1987, Acorn launched the Archimedes A305 and A310, starting at £800 ($982) and running a new operating system called Arthur. At the time, it was a radical and very fast computer. In his review [PDF] for Personal Computer World, Dick Pountain memorably said: "It loads huge programs with a faint burping noise, in the time it takes to blink an eye."
Arthur was loosely related to Acorn's earlier MOS, the BBC Micro operating system, but looked very different thanks to a prototype graphical desktop, implemented in BBC BASIC, that could charitably be called "technicolor."