All Purpose Electronic X-Ray Computer
Best computer name of all time. Go hang your heads in shame, all you marketing men with your dull names like MacBook and ThinkPad.
Professor Kathleen Booth, one of the last of the early British computing pioneers, has died. She was 100. Kathleen Hylda Valerie Britten was born in Worcestershire, England, on July 9, 1922. During the Second World War, she studied at Royal Holloway, University of London, where she got a BSc in mathematics in 1944. After …
In this age of incels and fragile snowflake feelings, we're sometimes encouraged to forget (or ignore) the simple fact that in this business of computing we're all standing on the shoulders of giants - and a lot of the time those giants are giantesses.
RIP Kathleen Booth - and thank you for everything that you’ve done for us. I’ve been writing in 68k assembler recently - I’ll raise a glass to you and think of you while I finish my program.
Had her work failed and she'd instead just donated a few thousand pounds to a political party, she'd have been "Lady" Kathleen Booth. We have so many great technologists, scientists, and other wonderful workers achieving magnificent things, and they just have to die before the country gets round to appreciating them. We need to start appreciating people like Kathleen much earlier, not just making stupid donors "Lords" ...
I have always raised a glass to departed wonderful people, taking a sip before pouring the rest on the turf for them. Tonight it will happen again.
A sad opportunity lost.
It's all very well putting Turing on banknotes now, but the political establishment has been allowed to hijack the honours system and the Lords and turn them into a cesspit of cronyism.
The scientific/biotech and medical industries are things we do well, and need to do better, but as Dame Bingham's book illustrates, the people with the power are scientifically illiterate and still believe in fairies.
"learn some assembly, it is good for the mental muscles"
It is indeed. I cut my teeth writing assembly for the 6502 processor (Commodore 64): sophisticated code that took full control of the computer, handled the interrupts, and banked out the operating system ROM, giving me access to the full 64K of RAM. I wrote software in assembly that rewrote other people's BASIC programs, doing things like renumbering all the lines to go up in tens, and compressed the code to run faster and load faster from cassette tape. Very unforgiving code, though - I wired up a button to ground the reset pin on the microprocessor so I could do a soft restart, without having to do a cold power-off and on again, when the machine code crashed during testing. That kept the contents of RAM without wiping them. Exciting times.
As I didn't have any formal training in computing, I started with raw Hitachi 6303 machine code, making my Psion Organiser II do things that you could not do directly from its BASIC/Pascal-alike programming language, OPL - all of it done merely from Psion's function call list and the Hitachi processor manual I had managed to obtain. Discovering assembler made life a tad easier, but I pretty much started from scratch.
Given my initial lack of talent in that area, I fed it via a mains adaptor, as taking out the 9V battery every time it hung got a bit tedious :).
Fun times.
"I cut my teeth writing assembly"....with CESIL at school in A level Computer Studies :-)
"Computer Education in Schools Instruction Language", a very simple and very restricted pseudo assembler which, at the time, I felt almost like a waste of time. But it gave me the basics for when I got my first computer and started playing with real Z80 assembly language :-)
Loved Z80 assembly. Mostly done on a ZX Spectrum and an Amstrad CPC464. Moved on to 8086 for the original PC (well, a PC compatible - also an Amstrad, this time the PC1512).
I know x86 is not considered particularly elegant now, and it didn't always feel it back then with its segment registers, but it was amazing what you could accomplish with surprisingly little code (and the BIOS). Happy days.
Took IBM 360 assembly in college - a short one-credit course where we used punch cards - it was the 1970s. It really taught me how computers work. Later (almost two decades on) I used my understanding to rewrite some CFD codes, more than doubling their speed. And there were plenty of other "hacks" I was able to do, especially back when computers were 16-bit and memory was precious.
One of the most "cost effective" courses I ever took.
The UK and British people used to have vision for the future.
Nowadays - but really since the 1970s - that has been lost. Who can forget John Harvey-Jones and his mantra of, in summary, "don't build it yourself, buy it in"? We as a people are more useless and more stupid as a result. Computing is just another thing thrown on the scrapheap, like the steam engine, the television, and the jet engine. OK, so we still have some presence with the jet engine. But it's pitiful.
J. D. Bernal, a student of W. H. Bragg, is one of those people who enabled a huge range of research. Not a household name. He worked on hard problems (X-ray crystallography applied to organic chemicals). My favourite Bernal idea was exploring the structure of amorphous solids (e.g. glass): get a football bladder, stuff it full of ball bearings, and pour glue or bitumen in. Wait for it to solidify. Then cut the mass out of the bladder, put it on a spectrometer table, and measure the positions of each of the ball bearings. Apply the appropriate maths to work out the density function, the spectral splitting, and so on.
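If anyone fancies redoing that maths on modern kit: the "appropriate maths" is essentially the pair-correlation function g(r) of the measured centres. A rough Python sketch (my own toy version - the names are invented and edge effects are ignored, so treat it as qualitative):

    import numpy as np

    def pair_correlation(centres, volume, r_max, n_bins=50):
        # centres: an (n, 3) array of measured ball-bearing positions
        n = len(centres)
        rho = n / volume                           # overall number density
        # every centre-to-centre distance, each pair counted once
        diffs = centres[:, None, :] - centres[None, :, :]
        d = np.sqrt((diffs ** 2).sum(-1))[np.triu_indices(n, 1)]
        counts, edges = np.histogram(d, bins=n_bins, range=(0.0, r_max))
        # volume of each spherical shell between consecutive bin edges
        shells = (4.0 / 3.0) * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
        expected = 0.5 * n * rho * shells          # pairs a uniform gas would give
        return 0.5 * (edges[:-1] + edges[1:]), counts / expected

    # sanity check: 500 uniform random points should give g(r) close to 1
    pts = np.random.rand(500, 3)
    r, g = pair_correlation(pts, volume=1.0, r_max=0.3)

Peaks in g(r) at around one and two bearing diameters are exactly the short-range order Bernal was after in his random close packing work.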
I knew that Bernal had an interest in computing but this article adds an extra thread to my knowledge, and tip of the hat to Mrs Booth.
Now, how about an article on George Spencer-Brown and his railway signal logic?
Not a household name
Actually, I've known of him for a long time, for his "Bernal sphere", as I'm a big space buff. I didn't know of the rest of his work for a couple of decades, though.
As for women in programming, my mother taught me how to flowchart (using her green IBM template) in the '70s before I got my TRS-80.
See [1], which includes the text of a poster from 1977. It makes a grand prediction:
"It is thought that habitats of this type will be technically feasible towards the end of this century, possibly by the early 1990s. One calculation has indicated that with the level of industrial activity which is contemplated for space by that time, and with the means of transportation by then available, construction of such a habitat could proceed in about two months. Accumulation of the shield would take place over the ensuing two years."
[1] https://space.nss.org/bernal-sphere-space-settlement-detail/
My oldest uncle was born in Hungary, the next oldest in Romania, and Dad and the next two younger ones were born in Hungary.*
They were all homebirths in exactly the same bed, in the same room, in the same house.
*In 1946 they were made Romanian again after the 2nd Vienna Award was rescinded by the Allies.
Old Soviet-era Russian joke:
Man goes to central-administration building to fill out government forms. Unsmiling clerk sits down.
"Where were your born?" - "St. Petersburg"
"Where did you go to school?" - "Petrograd"
"Where do you work?" - "Leningrad"
"Where do you want to retire?" - "St. Petersburg"
The history of computers and programming is murky when it comes to firsts.
The contributions of Professor Booth are indisputable. Let's leave it at that and celebrate them.
I still have my half-century-old, well-used IBM 360 green card with its assembler language instructions. Although the only one I remember is Branch and Link - 69 in decimal - which stuck in my teenage mind.
Yes, Turing himself had what he called "popular form" notation in his original (1946) ACE design. It's beyond hopeless to argue about which of the great pioneers invented exactly what. I had no idea Kathleen Booth was still alive. What questions I would have had for her! My copy of "Automatic Digital Calculators" by Andrew & Kathleen Booth, published 1953, just went up in value.
Indeed. I have just finished reading the original programming guides and internal documentation for the Apollo Guidance Computer. The low-level hardware instruction set was what we would today call a RISC machine, with higher-level operations handled by functions. It had a multithreaded operating system. To make life easier for the application programmers (i.e. the ones who designed the guidance and navigation, and how to land on the Moon), they had a virtual machine.
We still have a hell of a lot to learn from these Luminaries (and let's see who gets THAT reference!)
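For anyone who hasn't met the idea: a virtual machine in this sense is just a dispatch loop running on the bare hardware, fetching compact pseudo-instructions and handing each one to a routine. A toy Python sketch of the concept - nothing like the real AGC Interpreter, with opcodes invented purely for illustration:

    # Toy pseudo-code interpreter: each "instruction" is (opcode, operand).
    def run(program, stack=None):
        stack = stack if stack is not None else []
        ops = {
            "PUSH": lambda s, x: s.append(x),
            "ADD":  lambda s, _: s.append(s.pop() + s.pop()),
            "MUL":  lambda s, _: s.append(s.pop() * s.pop()),
        }
        for opcode, operand in program:
            ops[opcode](stack, operand)        # dispatch to the handler
        return stack

    # (2 + 3) * 4 expressed as compact pseudo-instructions
    print(run([("PUSH", 2), ("PUSH", 3), ("ADD", None),
               ("PUSH", 4), ("MUL", None)]))   # -> [20]

As I understand it, the real Interpreter made the same trade every bytecode VM makes today: pseudo-instructions for vector and trig work were slower than native code, but far denser, and memory on the AGC was desperately scarce.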
Oh no, this is sad news. I first heard about the Booths when I worked at Birkbeck College around the turn of the millennium; they created a small but mighty computer science department there in the 1950s (to go with the small but mighty College).
I was the only female sysadmin working there at the time, and one of very few at the University of London in general, so she was an inspirational figure to me. She built those early machines (with her colleagues), on top of all those other achievements in programming.
Basically, anyone who's invented what is in effect a computer, and expects to do any serious amount of work with it, would logically think "What a PITA - why not make it read something more human-friendly instead?"
I can certainly believe she is one of the few to have come up with the idea though.
And interestingly my OP was rejected on this article, which hasn't happened to me on this site for a very long time.
Still, at least I can still see it, which is more than is allowed on some US-hosted sites. You have to go to "The Land of the Free" to appreciate the real "Ministry of Truth" vibe.
BTW, I didn't realise her husband was the Booth of the "Booth multiplier" used in many a processor, including IIRC the ARM series, as it's quite "resource light".
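For anyone who hasn't come across it: the trick in Booth's recoding is that a run of 1s in the multiplier collapses into one add and one subtract, rather than an add per bit. A quick Python sketch of the radix-2 version (my own toy illustration - the names and register widths are mine, not any real datapath):

    def booth_multiply(m, r, bits=8):
        # Signed multiply via Booth's radix-2 recoding (two's complement).
        # The multiplicand field gets one extra bit so -2**(bits-1) works.
        xbits = bits + 1
        width = xbits + bits + 1                        # full register width
        A = (m & ((1 << xbits) - 1)) << (bits + 1)      # multiplicand, top field
        S = ((-m) & ((1 << xbits) - 1)) << (bits + 1)   # its negation
        P = (r & ((1 << bits) - 1)) << 1                # multiplier + appended 0
        for _ in range(bits):
            pair = P & 0b11                     # examine the lowest two bits
            if pair == 0b01:                    # end of a run of 1s: add m
                P += A
            elif pair == 0b10:                  # start of a run of 1s: subtract m
                P += S
            P &= (1 << width) - 1               # drop any carry out
            P = (P >> 1) | (P & (1 << (width - 1)))   # arithmetic shift right
        P >>= 1                                 # discard the appended bit
        if P & (1 << (2 * bits)):               # reinterpret as signed
            P -= 1 << (2 * bits + 1)
        return P

    # quick check against native multiply, including the nasty -128 cases
    assert all(booth_multiply(a, b) == a * b
               for a in range(-128, 128) for b in range(-128, 128))

The hardware win is that a run of k consecutive 1s costs one add and one subtract instead of k adds - hence "resource light".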
So some guy with what sounds like zero relevant domain knowledge on the subject in question writes a piece a few years ago on a website I'd never heard of before, it gets picked up by wiki, and it's now the Official History That Cannot Be Questioned.
Go look at the original 1947 paper and tell me how that is Ur-"Assembly Language".
https://albert.ias.edu/handle/20.500.12111/7941
It ain't. It's just a very well-thought-out symbolic notation. That's all. If you know predicate calculus you can see the roots.
I've been writing the code in question commercially since, I suspect, decades before the author of this El Reg piece was even born. Yet my opinion - that the technology was invented by the guy generally accepted for the last five-plus decades as its inventor, and not by someone else who kinda sorta wrote something a bit like it the year before - has been removed as "unacceptable". Even though the work in question really is nothing like what the generally accepted guy actually created the following year. Or like what I and everyone else have been writing for the last 70-plus years.
So is this the level of total PC bollocks El Reg has descended to? Rewriting history to fit this week's political fashion?
Kathleen Booth did some fantastic first-rate work in the early days of computing. Why denigrate it by repeating a completely spurious claim of invention (basically from some guy and wiki) for which there is no documentary support? The book "Alan Turing and His Contemporaries" by Lavington (ed.) does a great job of explaining just how important and vital the Booth team, husband and wife, were in the early days of computing. The book is a fantastic read.
That's what you should be remembering. That truly great pioneering work. Not some recently made-up, basically politically motivated claim with little actual merit.
What's next? All the great breakthroughs have to have been made by Scousers or Geordies? Or gimp-legged, bald-headed men from Arbroath? That's how bloody stupid this rewriting of history is.
In 1974, in college, I learned to program in IBM BAL - Basic Assembler Language. For those who don't know, assembler is a step above pure machine code, with short alphanumeric mnemonics to do things. If you understand assembler, you understand what goes on behind the scenes after a compiler changes your high-level language code into something a computer can actually use. And you understand so much about how bad compilers and operating systems create vulnerabilities.

Real hackers (not the ones who buy prewritten tools for hacking) understand what it takes to break systems, and competent compiler writers and OS writers understand how to protect computers. Sadly, companies like MicroSloth have for decades pumped out piles of new, often useless features while putting out OS versions and compilers that are sloppy and vulnerable.

My favorite instruction in BAL is MVC - Move Characters - written MVC DESTINATION,SOURCE. Essentially it says: move the contents of one location to another, given a start position and a length. The fun part: you can put data almost anywhere, even write over your own code - in other words, you can change the program as it runs. This is one of the biggest tools in hacking - putting code or data of your choice where it doesn't belong.
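For the curious, MVC copies strictly one byte at a time, left to right (the hardware encodes length-1, so a single MVC tops out at 256 bytes) - which is exactly why overlapping operands do something useful. A little Python sketch of the semantics (a toy flat-memory model of mine, not real 360 storage):

    def mvc(storage, dest, src, length):
        # Emulate MVC's strict byte-by-byte, left-to-right copy order.
        for i in range(length):
            storage[dest + i] = storage[src + i]

    mem = bytearray(b"* hello world")
    mvc(mem, 1, 0, 12)     # overlapping operands: the '*' propagates
    print(mem)             # bytearray(b'*************')

That propagation - each byte copied is the one you just wrote - is the classic idiom for filling a buffer in one instruction, and the same left-to-right rule is what makes careless overlaps such good hacking material.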
Early computers were programmed in this manner, but high-level languages like COBOL and FORTRAN came along pretty soon afterwards. While not glamorous, they are still heavily used today. They made coding much easier. However, say your COBOL program crashed and you got what was called a core dump: you needed to know assembler to read the dump. It still has value today.