Compute time was billed at around £750-800 per hour
"Compute" is a verb. The word you're looking for is "computer", which is a noun.
You're welcome.
Think today's computing industry moves fast? Try that of decades past. Moore's Law predicts that processor power will double every couple of years or so, but on December 7, 1962, the scientific computing power of the entire UK is said to have doubled in a single day. That was the day they switched on the original Ferranti …
Compute time != computer time. ATLAS was a multitasking system, so in modern parlance the CPU time spent solving your pet problem (i.e. the "compute" time) would not be the same as the amount of elapsed time the computer spent running your program. In other words, even if it took an hour to run a program to completion, it may only have applied five minutes of actual compute time to it during that period.
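To make the distinction concrete in modern terms, here's a minimal sketch in Python (nothing Atlas-specific - just the standard time module) showing elapsed time and compute time diverging:

    import time

    def busy_work(n):
        # Burn some CPU so that process time actually advances
        total = 0
        for i in range(n):
            total += i * i
        return total

    wall_start = time.perf_counter()   # elapsed ("wall clock") time
    cpu_start = time.process_time()    # CPU time charged to this process

    busy_work(2_000_000)
    time.sleep(1.0)                    # sleeping uses elapsed time but almost no CPU time

    wall = time.perf_counter() - wall_start
    cpu = time.process_time() - cpu_start
    print(f"elapsed: {wall:.2f}s, compute: {cpu:.2f}s")   # compute well short of elapsed

Run that and the "compute" figure comes out well under the elapsed figure, which is exactly the ATLAS billing point.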
Use of a verb form (a participle) as an adjective requires the present or past participle of the verb. So, this would require 'computing time' or 'computed time'. (cf. 'baking apple' and 'baked apple', but _not_ 'bake apple')
If you want to use the noun as an adjective (cf. 'apple pie', 'car park') then you'd need to say 'computer time'.
Using the bare verb form, 'compute time', is the equivalent of saying "Don't talk now, it's not talk time, it's eat time." It looks and sounds clumsy and ugly. However, language changes fastly in this modern world and I look forward to more of these disruptive read experiences.
> In pre-electronic times a "computer" was a lady with a quill pen. At Harvard College Observatory they were paid $0.25 to $0.35 per hour in the 1920s.
Since we all seem to be trying to out-pedant one another, may I just add that quill pens were replaced with steel nibs long before the 1920s? They may still have been dipped in ink, but no geese were plucked in the making of them.
We routinely refer to "compute time", which is how many CPU-seconds are used across all processors, vs clock time or the computation time of one CPU.
Pedantry would result in the language police renaming disk drives, without thinking about the actual history of the word.
Computers compute, no matter if human or machine.
Sounds like a case of US English vs Proper English.
In the Queen's English, compound nouns are formed from participles of verbs and genitive cases of nouns. Also (not directly relevant here), collective nouns are generally treated as plural. In the strange dialect picked up by the descendants of those who forgot to pack a dictionary when the Mayflower sailed from Plymouth, compound nouns are formed from infinitives of verbs and accusative cases of nouns -- and collective nouns are always treated as singular.
So whilst we would say "computing time", Americans might well say "compute time". Compare also "The girls' swimming team have chosen their new mascot" vs. "The girl swim team has chosen its new mascot".
I speak as a native Brit. I understand that collective nouns can make sense as plurals, but I prefer to treat them as singular when I can, because a group of something is a singular thing and singular things take singular verbs.
And no, the Americans don't get everything right, but they did not forget to pack a dictionary because the Mayflower sailed before Johnson wrote it (and the people who sailed on the Mayflower were all subjects of the British Monarch and largely expected to remain so). What actually happened was that the language changed in different ways on either side of the water. US English is closer to British English than some English dialects, particularly Scots (and Doric along with Ulster Scots/Ullans). Patrick Stewart could talk Yorkshire to you and you wouldn't have a clue what he was saying unless you were from near the village he grew up in.
And at one time, the English who had emigrated to plantation-era Ireland believed the English in England were letting English go to the dogs and that they were speaking a truer form.
"Think today's computing industry moves fast? Try that of decades past. Moore's Law predicts that processor power will double every couple of years or so, but on December 7, 1962, the scientific computing power of the entire UK is said to have doubled in a single day..That was the day they switched on the original Ferranti Atlas."
C'mon! All together now! Let me hear you!
"But could it run Crysis?"
"December 7, 1962 .. was the day they switched on the original Ferranti Atlas, the UK's first supercomputer .. The Atlas delivered nearly a hundred-fold advance in processing power over previous computers and it brought many innovations, including virtual memory and a multitasking operating system" ...
What are the specs in modern day parlance?
Actually quite substantial, according to the Wikipedia page. It had 48-bit words, with 16k words of RAM and 96k words of drum storage --- that's 96kB and 576kB --- mapped into a 24-bit virtual address space, with what looked like paging between drum and core so that applications could use the full address space. It had an MMU, a full interrupt system, and it was asynchronously clocked (something that's still not done much today). Performance seems to have been about 500,000 FLOPS.
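If anyone wants to check those conversions, it's just 48-bit words at 6 bytes apiece - a quick sanity check in Python, using the figures from the Wikipedia page:

    BITS_PER_WORD = 48
    BYTES_PER_WORD = BITS_PER_WORD // 8          # 6 bytes per 48-bit word

    ram_words = 16 * 1024                        # 16k words of core
    drum_words = 96 * 1024                       # 96k words of drum

    print(ram_words * BYTES_PER_WORD // 1024, "kB of core")       # 96 kB
    print(drum_words * BYTES_PER_WORD // 1024, "kB of drum")      # 576 kB
    print((2 ** 24) // 1024, "k words of virtual address space")  # 16384k words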
The washing machine comment is a bit harsh: this is way more powerful than the kind of embedded PIC you find in that sort of thing, and it even beats a lot of today's 8-bit microcontrollers. I think it's roughly equivalent to the 16-bit microcontroller class, although you'll need to find one with an FPU.
Does anyone have a reference to the instruction set?
"When we had finished commissioning, we eventually got to a point where the machine would run for ten minutes without fail, and at that point we all cheered and went to the pub to celebrate surviving ten minutes,"
Wind the clock forward 20-ish years and I could have said the same, only it was the software not falling over that we celebrated.
RIP Ferranti.
Surely I am not the only person here who actually used that machine? It rocked, until the day its console caught fire and we had to send our jobs to the Harwell Atlas while they rebuilt it.
For performance numbers see http://en.wikipedia.org/wiki/Atlas_Computer_%28Manchester%29#Hardware
No you're not the only one - I used London University's Atlas around 1967 when doing post-grad biochemistry, to process readings from a scintillation counter. A lab assistant in a brown coat would take my printouts away, and 3 or 4 days later bring back another pile of fan-fold with the results. Then I'd sit down with a clockwork desk calculator to finish the job.
Early masks of the Motorola 68000 CPU had this bug as I recall. There was an opcode that caused one set of data bus buffers to turn on all ones, and another to turn on all zeroes. The result was a deeply satisfying but expensive crack and a smell of frying epoxy in the commercial version. The ceramic-clad version just sat there and got hot till the wires fell off the leadframe.
"Now I understand the origin of the HCF assembler command (Halt and Catch Fire)."
The ICL 2903 had a built-in card reader that was totally controlled by software. It was possible to get the microcode timing sequence wrong - and effectively issue a command to the picker solenoid of "select, hold, and catch fire".
This also might be of interest:
http://ict1301.co.uk/1301ccsx.htm
http://www.kentonline.co.uk/kentonline/news/2012/october/23/flossie_the_computer.aspx
The owner of the farm where Flossie is housed has, unfortunately, had to sell the property (the original Darling Buds Of May series location) and a new owner is in the offing, so the future of the machine is uncertain. Flossie is normally on demonstration on the same day as an annual charity classic car show at the farm that I am involved with (slightly off-topic, I know!), so the whole event may be in jeopardy.
Britain is still very good at this sort of thing. Both building precision stuff and crap middle management.
Can't think of any washing machines off the top of my head that sport a multitasking OS, unless they happened to be Internet connected or something. Even then a proper MMU and an FPU would be pretty redundant.
IIRC, back in the 90s, Philips washing machines used out-of-spec SPARC motherboards.
I have no idea what OS they ran, but Solaris would make sense.
Multitasking may be overkill, but I'd certainly like a washing machine that I could program myself. E.g. to overcome bulletproof Zanussi's inability to rinse satisfactorily.
The manufacturing site mentioned in the video for Atlas, during its later days through ICL & Fujitsu Services, had an informative mural on a corridor showing the evolution of the UK computer industry, including Atlas. I think the tower at West Gorton might be gone now, as it was shedding lumps from on high when I was last there.
Years ago (mid-70s) we had to write programs - FORTRAN, I think - only 5 lines or so - to teach us computing. (Or was it Basic?) We only had one mechanical tty to use.
This was at Coventry Technical College, and I seem to remember the instructor telling us a) the machine was at Manchester, and b) it was frighteningly expensive to use - we only had a couple of minutes or so per student. He'd vet our paper-written programs first, and made us practise keyboard skills from a sheet of copied paper (the copier, IIRC, was based on methylated spirits, and used a drum) so we didn't waste time on "hunt 'n' peck" typing.
In the days when my then-girlfriend was using punched cards, she'd sometimes bring a stack home so I could help her debug - and her engineer fell about laughing when I told him "Half a byte is called a nibble" - which it is!
Yoof of today don't know they've been born. Luxury (etc.)
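(For the record: a nibble really is the standard name for half a byte - the high or low four bits. A toy illustration in Python:

    def nibbles(byte):
        # Split a byte into its high and low four-bit halves
        assert 0 <= byte <= 0xFF
        return (byte >> 4) & 0xF, byte & 0xF

    hi, lo = nibbles(0xA7)
    print(f"high nibble: {hi:#x}, low nibble: {lo:#x}")   # 0xa and 0x7

So the engineer was laughing at perfectly settled terminology.)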
In an effort to learn about these Banda duplicators of which you speak, I tripped over this gem:
"In grade five at Tyler Street primary school, Preston, my first newspaper, The Weekly Trumpet, was hand lettered on quarto paper and pinned to the class notice board. About this time my mother bought her first washing machine, a Hoover with a fold-away hand-wringer. "What a perfect way to run off a few copies of the Trumpet and sell them to the kids" I thought! But something went terribly wrong. Instead of the violet hectograph-inked paper-master soaked in methylated spirits transferring onto white paper, it ended up on the wringers of mother's new pride and joy! The Weekly Trumpet appeared on whites and coloureds for weeks until the image disappeared."
- http://www.metaltype.co.uk/stories/story38.shtml
The author continues to describe his career in printing, from 1949 to present... a parallel story of business and information technology.
I managed to get a CSE in Computer Studies in 1973, so I must have been using an interesting computer in 1971/2/3 (any ideas?) which was based in Chelmsford. Initially we programmed in 'City and Guilds', which was assembler-like - we used to 'pencil in' cards (punched-card sized) and send them off to get the output a week later - we used to get massive stacks of expensive printouts with one character on each sheet! We then upgraded to Fortran and punched cards - the punching device was manual, but far faster to 'write' with than having to 'colour in' (with a 2B pencil) the fields on a card! Those were the days ....
The valve DEUCE at English Electric Computers Kidsgrove, with its mercury delay-line storage, was still doing the company payroll in 1966/7. The bureau department had a KDF8 (RCA 501) for customer payrolls and share registration - as well as an EE-designed KDF9 for Rolls-Royce turbine blade design.
To which were added the state-of-the-art RCA Spectra 70/25 and the big Spectra 70/45 (System 4-50) with 128KB of memory. The prototype in-house EE- and Leo-designed System 4/70 had a massive 1MB of memory. The Marconi-designed System 4-30 was different hardware again, with pretty neon indicators. There were also the industrial computers like the KDF7, which could keep running off an enormous Ni-Cd battery - and the KDN2 (analogue?). The Elliott industrial "March" series computers were also being developed for system control - as were the Marconi "Myriad" ones for air traffic control.
These were all overlapping products on the same site - Leo 3s were still being produced elsewhere. The merger with ICT to form ICL in 1968 added yet more computers from their acquisitions and mergers forming their 1900 range. The company name seemed to change monthly as new acquisitions and mergers happened - life was fun.
"We could get pretty fast at hand bootstrapping the system via the console buttons, "
The Spectra switches were just round chrome push buttons in illuminated squares. Those machines quickly lost their silver Spectra logos to souvenir hunters, as the rainbow diffraction effect was quite novel.
The System 4-7x engineer's panel had 64 address/data switches which were protruding toggles. Used to get calluses on the top of my index finger from doing a fast ripple reset of them all between each write or read. Only the KDF8 in a darkened room had the very large console of flashing coloured lights that visitors expected of a computer in the 1960s.
Google's core idea seems to be to get rich by indexing and making available all the world's information. Their preservation work is surely just part of it.
Whether it is a good idea or not will probably be discussed by future historians, but I have to say that Google frightens me a lot less than Facebook.
I joined the University of London Computer Centre (ULCC) in about 1972, which had moved on from Atlas to Control Data 6400, 6600 and 7600. Most of the people I worked with had worked on Atlas.
I understand that many of the patents from Atlas passed eventually to ICL, who let the Atlas patent on virtual memory lapse - after all, with machine memory then getting as big as 32K, why would you need it?
In 1978 the VAX 11/780, with a massive 128K of memory supporting 12 interactive users, re-introduced us to virtual paged memory, and I suspect it never paid a dollar in royalties for virtual memory usage.
When I was working in industrial automation big tin, our shopfloor cabinets all had rounded corners for safety and forklift damage reduction. It hurts a lot less if a machine operator bangs his head on a rounded corner than on a square one.
Really, the USPTO should have been paying attention that day. A rounded corner is not a design feature, it is a functional feature and as such cannot be patented due to prior art.
The ICL System 4/75 had virtual memory with paging in 1968 - on a real memory of 1MB. The bureau subsidiary BARIC had one at Winsford. It ran the Interact-75 TP service on the custom SGOS system overlaid onto the 7J O/S. It had a March 2140 comms frontend handling customer teletype connections - including some new fast Termiprinters at 300bps. The replaceable disk drives were 8MB - and there was a fixed CDC disk with the massive capacity of 600MB. The latter weighed one and a half tonnes and had water cooled bearings.
No, I don't think we let the patents lapse. What we did have though, for some reason which has always escaped me, was a patent pool with other computer firms. Including a foreign one with Big Blue boxes.
A few years later I recall conversations with an excellent product manager called Roger, in Intergalactic Headquarters: 'you mean we just have to lie down and get f***ked?'.
Everyone was scared of anti-trust suits...
That was an experience which would make your day. Junior programmers in the 1960s were not encouraged to just drop into this paradise where dozens of nubile young ladies were typing away on their IBM 029s.
But sometimes you would be invited down by the woman of formidable aspect who guarded access to the room to clarify something you had written on one of your coding forms. Oh joy!
She was no fool, however. She noticed that my handwriting was steadily getting worse and after the third visit told me that I was disrupting their workflow and that in future the coding forms would be sent back and I would lose my place in the queue.
Another thing from the video: whatever happened to Stentophones? We used to have them everywhere in 1969 but I don't think I've seen one in 20 years.
The EELM Kidsgrove punchroom was like running the gauntlet for any young lad. The object of the exercise seemed to be to cause as much blushing as possible. When a new girl joined the punch room she would be brought to the viewing window of the bureau computer room. She would then be told the marital status etc. of the male operators so she could take her pick of one that took her fancy.
When I was in the 5th form my father wangled me a summer job with his company's computer department. There was a woman programmer. Who was rather attractive. And she talked to me in a friendly fashion. I am afraid I got completely the wrong idea about the computer industry. But at least I knew at an early age who Rear Admiral Grace Hopper was, even if I made a completely incorrect correlation between a knowledge of Boolean algebra and getting on with the opposite sex.
" I am afraid I got completely the wrong idea about the computer industry. "
Hmm - in the 1960s there were probably as many women as men in programming departments. As a junior programmer I shared an office with four older women who delighted in causing me to blush. That was the era of mini, or rather micro, skirts. The question crosses my mind as to why they had to store their card boxes on top of the cupboard opposite my desk - they could only reach them by standing on tip-toe.
In the 1960s women computer operators were only allowed to work the day and evening shifts. However women programmers were allowed to supervise their work being run even in the middle of the night. During the night shift most of the computer room lights were switched off - giving a more restful atmosphere. The stories about what went on behind the tape decks on the night shift are best left to the imagination.
That is interesting. In fact my career path took a different turn after A levels and I only got back to computers in my late 20s when the research I was doing needed a lot of data collection and one thing led to another. By then, at least in industry, it was an almost all-male occupation.
This, though, seems to be usually the way. When a job becomes seen as attractive and highly paid, whoever was doing it tends to get muscled out by middle- and upper-class men. When the Navy became glamorous, all of a sudden the aristocracy wanted their kids to have naval rather than army careers and the "tarpaulin" officers were pushed out. When it was realised how much could be made from sheep, younger sons were pushed off to the former prison colony of Australia. And, seemingly, when computation (a job largely done by women) turned into computing and programmers and analysts started to earn reasonable amounts of money, suddenly it was men doing it.
[sigh]
The EELM punchroom managed to upset the new compiler testing one day. A batch of faulty program sources had been submitted for punching; the object was to test the compiler's handling of syntax errors. Next day the run arrived back with clean compilations - the punchroom had corrected all the obvious errors as they punched the cards.
When I studied Computation at UMIST in the mid-70s (the University of Manchester Institute of Science and Technology, now sadly subsumed into the University of Manchester), I had the privilege of being taught by Jeff Rohl, one of the creators of Atlas Autocode - a kind of souped-up Algol created specifically for the Atlas.
I don't think that I ever worked directly on the Atlas - by that time UMRCC was primarily using ICL 1900s and its CDC 7600 and 6000 for university support services. I remember using teletype access to the CDC 6000 to develop a Pascal program to give you optimal strategies when playing blackjack, although needless to say this wasn't part of my course or even my final year project.
Anyway: Jeff Rohl was one of the finest teachers I ever encountered in my life. Soon after I left UMIST he returned to teach in his native Australia (Adelaide, I believe), and I hope he achieved his ambition of conducting a Beethoven chorale performed by a top choir and orchestra. The one thing he taught me above all else, despite his being a firm advocate of formally well-designed and structured programming techniques, was: if your program doesn't do what is set out in its specification then it's worthless, regardless of how well structured it may be. This was a most enlightened view from an academic, but it served me well throughout my career.
Don't know if you're still with us Jeff, but thank you anyway.
In 1996 Hamish Carmichael published "An ICL Anthology". It contained anecdotes from the ICL workforce that spanned right back to the early days of tabulators. The stories covered many of the companies that eventually formed ICL from English Electric Computers and ICT. There were eventually two volumes and they used to be online - but apparently no longer. Surprisingly several secondhand copies are available - at a price.
1st rate creative engineering skills.
3rd rate production management and engineering.
A detailed description of using computers to run the building of computers was given in 1956 in
"A progress report on computer applications in computer design"
authored by an R. Kisch and one Seymour Cray.
Trouble always seemed to be with the production side of things.
Very true. But that's us Brits. The same is true of car production unless it's managed by foreign firms. Why is this? Because production's so damned boring compared to the interest of innovation. It's why we are good at designing and building Formula One cars - no need for a production line!
"The same is true of car production unless it's managed by foreign firms. Why is this? Because production's so damned boring compared to the interest of innovation."
Unfortunately it's the products that sell in the 100s of 1000s that make the big revenues.
It's a common misconception that production engineering is simple. If you can afford to throw away 9 out of every 10 items made it is. A 1% failure rate in 100 000 units is 1000 p***ed off customers. Getting a rate of 1 in 100 000 items is rather tougher.
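The arithmetic behind that point, sketched in Python (the volumes and rates are the ones from the comment above, not real production data):

    UNITS = 100_000

    for failure_rate in (0.01, 1 / 100_000):
        expected_failures = UNITS * failure_rate
        print(f"rate {failure_rate:.3%}: ~{expected_failures:.0f} unhappy customers")

    # 1% of 100,000 units -> ~1,000 failures; 1 in 100,000 -> just 1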
Exactly. Production engineering is interesting - if you are allowed to do it properly.
One company I worked for, the MD "did not believe in statistical process control". We had to hide it in an industrial PC which we said was "chaining the machines together for efficiency". At the time, one of our key suppliers was pressing components on a machine which measured the plunger force on every plunger in the jig, and used this to calculate wear and report back on any defects in the feedstock. The other supplier was using 1920s tech which frequently resulted in broken parts. The production manager couldn't understand why it was a bad idea to (a) keep the second supplier and (b) let the parts get mixed up.
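For anyone who hasn't met it: the statistical process control being hidden in that industrial PC is, at its simplest, just tracking each measurement against control limits derived from the process's own history. A minimal sketch in Python, assuming a stream of plunger-force readings (the numbers and the classic three-sigma rule here are illustrative, not the actual system):

    import statistics

    def control_limits(samples):
        # Shewhart-style limits: mean +/- 3 standard deviations
        mean = statistics.mean(samples)
        sigma = statistics.stdev(samples)
        return mean - 3 * sigma, mean + 3 * sigma

    # Baseline force readings (newtons) from a healthy jig - made-up numbers
    baseline = [102.1, 99.8, 101.4, 100.2, 98.9, 100.7, 101.0, 99.5]
    lo, hi = control_limits(baseline)

    for reading in [100.3, 99.1, 108.6]:   # the last suggests wear or bad feedstock
        if not (lo <= reading <= hi):
            print(f"out of control: {reading} N (limits {lo:.1f}-{hi:.1f} N)")

A reading drifting outside the limits is exactly the early warning of tool wear or bad feedstock that the good supplier's press was producing automatically.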
Our great Public Schools have a lot to answer for.
Why can't we resist the urge to bash old technology, making it out to be worse than it ever was? I'd be really shocked if my washing machine needed 96kB of code and half a meg of disk space to manage a few switches and read a temperature comparator. I'd fire the software and hardware engineers for incompetence :-) The only thing my washing machine has in common with an Atlas is the drum storage.
It's the same with all 60s technology. "Did you know the Apollo computers had less power than your ZX81/pocket calculator/analog watch?" Rubbish: they had roughly the same power as a BBC Master from the mid-80s, and you could write novels with that, browse multimedia video disks, run businesses and crash the stock market too!
We're better off respecting both the old technology for what it was and the brilliance of the people who designed it. The Atlas was a landmark machine that had real power, power they didn't waste by filling it with schoolboy-quality Flash™ and JavaScript coding. We could all learn a thing or two from their era :-)