Wire-Wrap Gun?
Gun? Bloody luxury! When I were a lad in my first proper job all I had was a hand-tool and RSI.
Start your Monday with a cautionary tale from the files of Who, Me? and a warning (if one were needed) that hiring a teenager to write your operating system might not go so well. Today's tale comes from "Matt" and takes us back to the mid 1970s and a university computer lab full of Interdata machinery. Matt was looking for a …
> RSI? We shopped at Farnell.
+1
The RS rep at Uni was a snobbish prat who didn't think students should get copies of their catalogue, or be allowed to order directly. Luckily by that point I already had a Farnell trade account (set up by a friendly rep when they noticed how much I was spending with them), and regular free copies of their catalogue (back when the catalogue could double as building material). They were rewarded with plenty of business sent their way after graduation. At one point I even had a better credit line on my personal account than the company I was working for, due to some issue over payment terms, so had to put a large order through on expenses!
Pity they've now been taken over by a corporate conglomerate and don't seem to be as efficient as they used to be. Nothing lasts, nothing lasts...
"Sandwich" placement from university working at a company specialising in surface-mount PCBs (still quite a novelty even in the late 1980s) one of the first jobs I was given was wiring up a very complex test rig by wire-wrap - never having even heard of the stuff before this time. Some 800 or 900 connections IIRC, I had two or three colours of wire, a hand stripper and a hand wrapper. At least the thing was properly drawn up for me, and I actually found it quite relaxing.
The production manager was astounded to find no errors at all in the final result and said it was a very neat job. I had spotted two or three myself as I went along, but even I was impressed.
Same company, first week there I was "inducted" through the various departments, including a stint at reception where the two secretaries gave me some typing to do (something in DOS - WordPerfect, I think, which was something else I'd never met before - and a daisywheel printer). I think they expected me to peck away at the keyboard for the rest of the day, rather than get the letter done in ten minutes, including working out how to fit the correct wheel to the printer and how to get WP to print.
Since then, I try never to assume that just because I know a lot about "something", the person I'm talking to doesn't.
I don't always succeed...
M.
Had a similar setup when I was in college in the 80s.
We had printed templates (basically grids) for writing down the hex machine code. Then, depending on the board we were working on, we would sit there manually entering the hex directly into the board, cross our fingers and hit the run button! Or burn it into an EPROM, put that onto the board, and run.
UV lamp in a box at the ready for when we inevitably needed to wipe and try again.
EEPROMs made life a little easier later on, once we got some.
EPROM burner with a hex keyboard, you 'ad it lucky.
I had to build my own EPROM burner from Veroboard and stick it into the back of a Commodore 64, AND write the burner program, in BASIC, AND create the EPROM contents as DATA statements in the burner program.
<Beer, because after all that, I needed one, or 0x02>
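For anyone who never met the pattern: the burner program is basically a loop over a byte table, plus a verify pass. A minimal C sketch of the same idea follows - the memory-mapped interface and its addresses are entirely invented for illustration, not the real C64/Veroboard design described above:

    #include <stdint.h>

    /* Hypothetical memory-mapped burner interface - these addresses are
       invented for illustration, not the real Veroboard/C64 design. */
    #define BURNER_ADDR_LO  (*(volatile uint8_t *)0xDE00)
    #define BURNER_ADDR_HI  (*(volatile uint8_t *)0xDE01)
    #define BURNER_DATA     (*(volatile uint8_t *)0xDE02)
    #define BURNER_CTRL     (*(volatile uint8_t *)0xDE03)

    /* EPROM contents held in the program itself - the C equivalent of
       the BASIC DATA statements mentioned above. */
    static const uint8_t image[] = { 0xC3, 0x00, 0x10, /* ... */ };

    static void program_byte(uint16_t addr, uint8_t value)
    {
        BURNER_ADDR_LO = addr & 0xFF;
        BURNER_ADDR_HI = addr >> 8;
        BURNER_DATA    = value;
        BURNER_CTRL    = 1;   /* raise the programming pulse... */
        /* ...hold for the part's specified programming time, then drop it */
        BURNER_CTRL    = 0;
    }

    int main(void)
    {
        for (uint16_t i = 0; i < sizeof image; i++)
            program_byte(i, image[i]);
        /* verify pass: read back and compare; reach for the UV lamp on mismatch */
        return 0;
    }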
The entire time sharing system at Dartmouth was written by undergraduates around 1964. There were two machines: a GE-225 and a GE DN-30 - one programmer for each machine.
I wrote the major part of the executive for the Dartmouth phase II time sharing system on a GE-635 around 1970.
You do not need a horde of programmers to write an operating system.
Reads to me that "Matt" actually did a heroic job working around the problems caused by broken hardware:
"The reason that the reported fault address had been incremented to the next instruction was that the machine had actually executed the instruction, using whatever noise was on the memory bus at the time the instruction executed."
I disagree, although I can understand his approach. He should have reported the hardware as unreliable and got that resolved instead. A computer doing random things with your data means it'll never be reliable or trustworthy.
Noise pickup warrants better shielding/grounding.
Unless what was meant was that it was executing from memory that hadn't been (intentionally) written, in which case a software fix is appropriate, but not by looking for specific instructions that you know would cause issues.
The term 'noise' is a bit misleading -- the main issue was that the data returned by an instruction that took a fault was *undefined*. The hardware wasn't unreliable, the OS designers just assumed it provided functionality (restartable 'segment missing' faults) that Interdata didn't actually provide until 3 years later.
They did eventually fix the issue, so that by 1978 the next model, the 8/32, worked as desired. But that was three years later, and we wanted a solution to share these machines much faster.
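To make the dilemma concrete, here's a rough C sketch of what a handler has to attempt when the hardware reports the *next* instruction's address - all the names and the length decoding are invented for illustration (the real fix was Interdata assembler):

    #include <stdint.h>

    /* Hypothetical fault frame: the hardware reports the address of the
       *next* instruction, because the faulting one already "executed"
       on whatever was floating on the memory bus. */
    struct fault_frame {
        uint32_t next_pc;   /* address after the offending instruction */
    };

    /* Illustrative instruction-length decoder - Interdata 7/32
       instructions were 2, 4 or 6 bytes depending on format, but the
       opcode tests here are made up. */
    static unsigned instr_len(const uint8_t *op) {
        if ((op[0] & 0xF0) == 0x00) return 2;  /* register-register (example) */
        if ((op[0] & 0xF0) == 0x40) return 4;  /* register-storage  (example) */
        return 6;
    }

    uint32_t find_faulting_pc(const struct fault_frame *f, const uint8_t *mem)
    {
        /* Walk back trying each plausible length until one decodes to an
           instruction that could have touched the missing segment. The
           snag, as described above: that instruction already ran with
           garbage data, so simply re-executing it is not always safe. */
        for (unsigned len = 2; len <= 6; len += 2) {
            uint32_t pc = f->next_pc - len;
            if (instr_len(&mem[pc]) == len)
                return pc;            /* candidate faulting instruction */
        }
        return f->next_pc;            /* give up: no clean restart point */
    }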
Aye, fer them soft southern jessies 'oo can't tell difference between technical jargon an' 'umerous technical slang.
Ah once 'ad a trainee co-worker Ah tuld ter go an' play wi' editor an' she complained ter boss that she didn't appreciate bein' tuld ter play wi' computer.
Mind you, there were mitigatin' circumstances - she were thick as a bacon doorstep.
Well, you'd fix the compiler and linker so that the first instruction in any segment is a NOP… Wait, that doesn't work, because it doesn't execute that instruction but random bits… So you'd fix the hardware so that reading from non-existent memory gives a NOP instruction.
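That hardware fix, as a toy C model - the opcode value is a placeholder, not the machine's actual NOP encoding, and the helper functions are assumed:

    #include <stdint.h>
    #include <stdbool.h>

    #define NOP_OPCODE 0x0700u   /* placeholder - use the machine's real NOP */

    extern bool segment_present(uint32_t addr);   /* memory-map lookup (assumed) */
    extern uint16_t ram_fetch(uint32_t addr);     /* actual memory read (assumed) */

    /* What the fixed bus would do: never hand the CPU random noise. */
    uint16_t bus_fetch(uint32_t addr)
    {
        if (!segment_present(addr))
            return NOP_OPCODE;    /* harmless filler instead of bus noise */
        return ram_fetch(addr);
    }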
Early 1980s: we had built an RM 380Z from the kit that RM supplied (as it was slightly cheaper) and it was primarily going to be used by us pupils for the Computer Science O Levels.
The 380Z was loaded with CP/M and it was apparent the hardware was capable of more than the then standard CP/M could do, so we created some CP/M extensions to allow us to do some basic graphics.
I then wrote a program for my O Level (as did my friends doing the O Level).
So after bolting the PC together, adding extensions to the OS and then writing our programs, we all managed to get a C grade, which looking back was a bit harsh from the examiners.
Basic graphics? In 1980 we ran the basicg (basicsg?) interpreter which would drive the video with "hi-res" graphics - about 320x200 or something - in blinding monochrome which looked great on a froopy green computer screen.
I remember writing a programme to show the motion of a charged particle under the influence of magnetic and/or electric fields on a 2D screen... As a teenager that made me a real physics geek!
I loved that 380Z ...
I hated repairing those. Or adding the wire so the parallel port IC's 8th bit got to the connector, so you could print graphics and extended ASCII on an Epson MX-80.
Terrible design. A metal "shoe box" with the bus along the top of the cards as a ribbon cable. It can't have saved much money compared with a proper but simple motherboard bus in the bottom of the case. Also, why, when the port I/O used an octal chip anyway, did they only wire 7 bits to the parallel port?
> So after bolting the PC together, adding extensions to the OS and then writing our programs, we all managed to get a C grade, which looking back was a bit harsh from the examiners.
Thing is, the task for the course was probably "write a program", and the rest of the stuff you did was not relevant and indeed couldn't have been taken into account even if the examiner had wanted to. He (and it probably would have been "he") would have looked at your planning (flowcharts, I imagine?) and logic, taken a cursory glance over the source code (MS BASIC?) and checked that your program produced the required output with the test inputs. Tick, tick, tick, cross - maybe you used a rectangle instead of a rhombus for a step in your flowchart, or you hadn't used a ruler for the lines.
And don't forget the written exam which included logic, maths, history and suchlike. The actual programming was a very minor part of an O-level in those days I seem to remember.
I suppose it shows up the problem with academic achievement assessed on a simple one-off metric. You'd have been better with some kind of coursework element, and a resourceful teacher might have been able to bend the guidelines to include "knowledge of hardware" and "extending the operating system" towards that element.
By the mid 1980s, when I took O-level Computer Studies, it was a popular subject. So much so that there were some 60 children in two classes doing it at my school, a grand total of twelve BBC Micros attached to cassette recorders and sharing two printers, and a teacher very much out of his depth.
I can't remember if it was two of us or three who passed (i.e. got a C grade or higher), but only my friend Rhys and I went on to do A-level Computer Science, and we found ourselves helping said teacher with his lessons during our "free" periods when we should have been studying...
M.
> I can't remember if it was two of us or three who passed (i.e. got a C grade or higher), but only my friend Rhys and I went on to do A-level Computer Science, and we found ourselves helping said teacher with his lessons during our "free" periods when we should have been studying...
I was in a similar position at A-level for Physics. Our old-school Physics teacher had retired, and in his place, a young teacher with experience in the experimentally based Nuffield Physics course had been recruited. Of course, we were already partway through the conventional, mathematically based Physics course, and couldn't switch mid-course. Myself and another guy who was the school mathematical genius frequently ended up either correcting his working or being asked to explain to the class!
"So after bolting the PC together, adding extension to the OS and then writing our programs we all managed to get a C grade, which looking back was a bit harsh from the examiners."
I bet you got a high mark for the coursework component (depending on the program you devised and the assessment criteria for the project). Did your certificate split the components or just give a grade?
When we had coursework on Maths GCSE courses, the coursework mark was always higher than the exam component mark. The students were not getting Aunty Carol to do it for them, as they had to explain their thinking to us, and at the college I was teaching in then we took that seriously.
Of course, it does not really matter - in a 35-year teaching career no one has ever asked me what grades I got or what degree classification. I've been judged on what I actually do, which suits me fine.
I wouldn't go so far as to call it an OS, but I wrote a data logging program for a Z80-based S100 bus single card computer that had no software at all; just the bare iron. I used an Osborne 1 with WordStar, link and asm as my development environment (the system had to be debuggable in the Arctic!). The program was logging data in real time from a prototype ice-sounding radar! The basic structure was sufficiently flexible that I was able to keep using it through several iterations of ancillary hardware, and I think (with hindsight) that it could have been developed into a real-time OS without too much trouble.
COBOL is a great language as long as you use it for what it was designed for: business. Fortran also has its advantages and disadvantages, so should be restricted to what it was designed for as well. The real power of PL/I is writing compilers for some languages (not C and its derivatives).
> Probably one of the kinder descriptions of PL/I, a language "designed" by taking Fortran and COBOL, banging a six inch nail through them and spraying the result with Algol-ish syntax paint.
Rather harsh, if somewhat true. PL/I was notable for routines which handled the historic British currency of pounds, shillings and pence rather well. IBM must have put a lot of effort into the compiler for just this feature, and undoubtedly were mightily miffed when the British currency went decimal in Feb 1971...!
Lyons, the tea-shop chain, built the first commercial computers, initially to handle supply chain logistics. Even before they built the LEO machines they were using decimal pounds for internal accounting, and putting lobby money into the decimalisation campaign.
The IBM 1401 went so far as to offer LSD math as a (hardware) option. At extra cost, of course, just like the console sense switches that _rented_ for $6/month apiece, IIRC.
One of the working 1401s at the Computer History Museum in Mountain View CA is alleged to have that option.
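For the youngsters: £sd arithmetic is mixed-radix - 12 pence to the shilling, 20 shillings to the pound - which is exactly why it was worth dedicated compiler (or 1401 hardware) support. A quick C illustration of the bookkeeping:

    #include <stdio.h>

    /* Pre-decimal British currency: 12 pence (d) = 1 shilling (s),
       20 shillings = 1 pound. Easiest handled as total pence. */
    struct lsd { long pounds; int shillings; int pence; };

    static long to_pence(struct lsd m) {
        return (m.pounds * 20 + m.shillings) * 12 + m.pence;
    }

    static struct lsd from_pence(long d) {
        struct lsd m;
        m.pence     = (int)(d % 12);  d /= 12;
        m.shillings = (int)(d % 20);
        m.pounds    = d / 20;
        return m;
    }

    int main(void) {
        /* £3 17s 9d + £1 5s 6d = £5 3s 3d */
        struct lsd a = { 3, 17, 9 }, b = { 1, 5, 6 };
        struct lsd sum = from_pence(to_pence(a) + to_pence(b));
        printf("£%ld %ds %dd\n", sum.pounds, sum.shillings, sum.pence);
        return 0;
    }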
It was still around and occasionally used in the 1980s; I think I've got a programming manual for it on my shelf somewhere (the IBM variant). Thankfully I never actually used it - got the manual because I sometimes had to resurrect other people's programs, though I drew the line at things written in really strange languages (one was in SNOBOL, and I've seen RATFOR as well).
> the IBM variant
The unfortunate thing about IBM's PL/I was that the checkout and optimising compilers actually compiled subtly different languages, thus defeating the idea that you developed and debugged your program using the checkout compiler and then used the optimising compiler for the production version. You'd get a program that worked perfectly using checkout and the optimising compiler would vomit all over it.
> I drew the line at things written in really strange languages (one was in SNOBOL, and I've seen RATFOR as well).
I wouldn't regard either of those as really strange. RATFOR was simply a preprocessor that let you write Fortran in Algol-ish free format. In my academic days the Prof who headed our research group used it a lot, at least until I got Algol 68C running on our system.
SNOBOL was interesting, if a little primitive in its flow control. At one point in my career I had to process vast amounts of climate data using programs all of which had different Fortran fixed format input and output specifications. SNOBOL (specifically the SPITBOL variant) was brilliant for rewriting the data as necessary. I also wrote a primitive text formatter in it.
If you want a really strange language try INTERCAL (or Befunge).
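The sort of job SPITBOL was doing there - lifting fields out of one fixed-column layout and emitting them in another - looks roughly like this in C. The column positions are invented; every real dataset had its own:

    #include <stdio.h>
    #include <string.h>

    /* Rewrite fixed-format records: here, hypothetically, a station id in
       columns 1-5 and a temperature in columns 21-26 of the input become
       columns 1-5 and 7-12 of the output. SNOBOL/SPITBOL patterns did the
       same job with far less ceremony. */
    int main(void)
    {
        char line[256], id[6], temp[7];

        while (fgets(line, sizeof line, stdin)) {
            if (strlen(line) < 26)
                continue;                  /* short record: skip (or report) */
            memcpy(id,   line +  0, 5);  id[5]   = '\0';
            memcpy(temp, line + 20, 6);  temp[6] = '\0';
            printf("%-5s %6s\n", id, temp);
        }
        return 0;
    }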
How about Brainfuck?
Just a Turing machine. :-)
Try Whitespace, the language that's just as easy for the visually impaired as it is for the normally sighted.
It at least had a block structure, though it did seem designed mostly to accommodate some of the S/360's more obscure arithmetic options, whether it be packed decimal or LSD. You spent a lot of time writing declarations....
The idea that instead of getting the address of the *start* of the instruction that generated a missing segment fault you get the address of the *next* one is reminiscent of the S/360 "imprecise interrupt". Some S/360 models incorporated early versions of out-of-order execution - and if a fault occurred it wasn't possible to identify exactly which instruction had caused it. But at least that was acknowledged.
For some reason, PL/M was derived from it for system programming. I still don't see why...
I have to take my hat off to you, Matt. To be able to do that at such a young age is quite remarkable. I could barely tie my shoelaces at that age. It's fair to say that many devs with 20 years' experience wouldn't even know where to start on that task. The fact that you actually got it working is incredible. Well done, sir!
When I was in college, around 1975, I got a part-time programming job in the Nuclear Physics lab. Based on the qualification that I had had a class in assembly language and knew in theory what an interrupt was.
They were collecting data from instruments in their particle accelerator lab on a DEC PDP-15 computer. They did data analysis on the same computer, so of course there was a lot of competition for computer time. Time sharing was not an option, as the OS did not support it and the response time required for data collection would not allow it.
DEC did have the option of a PDP-15/PDP-11 dual processing system, that connected via 8K of shared dual-ported memory. But nobody wanted to rewrite all their analysis code for a PDP-11, and being a bunch of PhD physicists they decided they could use the same dual-port configuration to build their own Frankenstein dual-processor PDP-15.
That's not a supported configuration, said DEC. We'll figure it out, said PhDs. Just sell us another processor with hard disk and the dual-port memory.
You need another DECtape on the new system to load an operating system, said DEC. We're too cheap to do that, and we've got a CS student who knows what an interrupt is, said PhDs.
So my first task was to write a disk-copying routine, loadable on the new bare machine by paper tape, that copied the entire hard disk through the shared memory, OS and all. For some reason this did not violate any licensing.
They put this thing together and split the peripherals between the primary and secondary CPUs. Data collection was on the primary; the line printer was on the secondary. Their biggest challenge: if someone was printing something from the primary computer and you typed Ctrl-C on the secondary, it would reload the OS and forget the print job. Having the complete source code for the OS helped, but they still needed a way to know if a Ctrl-C had been typed before letting it go through all the interrupt handling. But if you read the character in the keyboard buffer, that also cleared the interrupt flag, and the normal interrupt handling would be broken.
Given a wirewrap tool, complete schematics for the computer, and too much time on their hands, they found an unused opcode and wired in a new read-the-keyboard-buffer-without-clearing-the-interrupt-flag instruction. Problem solved.
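The shape of the problem, sketched in C - everything here is invented for illustration, since the real thing was a PDP-15 IOT instruction wired into the backplane, not memory-mapped registers:

    #include <stdint.h>

    /* Hypothetical device registers, for illustration only. */
    extern volatile uint8_t kbd_buffer;   /* last character typed */
    extern volatile uint8_t kbd_flag;     /* set by hardware on keypress */

    /* The stock read: fetching the character ALSO clears the interrupt
       flag, so the normal interrupt-driven input path breaks if the OS
       peeks this way. */
    uint8_t read_kbd(void)
    {
        uint8_t c = kbd_buffer;
        kbd_flag = 0;            /* side effect you can't avoid */
        return c;
    }

    /* What the new wired-in opcode provided: look at the character
       without touching the flag, so the OS could check for Ctrl-C first
       and then let ordinary interrupt handling proceed untouched. */
    uint8_t peek_kbd(void)
    {
        return kbd_buffer;       /* no side effects */
    }

    int ctrl_c_pending(void)
    {
        return kbd_flag && peek_kbd() == 0x03;   /* ASCII ETX = Ctrl-C */
    }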
The Interdata machines (later Perkin Elmer Data Systems, then Concurrent Computer Corp) came with a quite capable real-time operating system called OS/32 (there was an OS/16, but I don't think the 7/32s could run it). Various people used it for its "real time" attributes.
The assembler was very similar to IBM's BAL (it was called CAL!)
Much of the operating system was actually written in C...
Edition VII Unix got ported to the 8/32 range (from what I remember!) then System 5 to the later models.
Wire wrapping was routing the interrupt lines round the expansion boards you had installed in the right order (Racu-Tacu - not sure of the spelling but it's how it sounded!)
At one time I ended up working on a "peephole optimiser" to optimise the output of the C compiler (which was very much "template" so had lots of repeated/wasted instructions!) - it made a BIG difference and ensured I was proficient in CAL (the optimiser was written in CAL too!)
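For the curious: a peephole optimiser slides a small window over the emitted instructions and deletes or rewrites wasteful patterns - the classic template-compiler sin being a store immediately followed by a reload of the same location. A toy C sketch over textual assembler (not the CAL original, obviously):

    #include <stdio.h>
    #include <string.h>

    /* Toy peephole pass: drop a "load Rn, X" that immediately follows
       "store Rn, X" - the value is already in the register. Instructions
       are modelled as strings; a real pass works on a decoded form. */
    static int redundant_load(const char *prev, const char *cur)
    {
        char r1[16], a1[32], r2[16], a2[32];
        return sscanf(prev, "store %15[^,], %31s", r1, a1) == 2 &&
               sscanf(cur,  "load %15[^,], %31s",  r2, a2) == 2 &&
               strcmp(r1, r2) == 0 && strcmp(a1, a2) == 0;
    }

    int main(void)
    {
        const char *code[] = {
            "store R1, total",
            "load R1, total",     /* redundant: R1 already holds total */
            "add R1, delta",
            "store R1, total",
        };
        size_t n = sizeof code / sizeof code[0];

        for (size_t i = 0; i < n; i++) {
            if (i > 0 && redundant_load(code[i - 1], code[i]))
                continue;                 /* peephole hit: drop it */
            puts(code[i]);
        }
        return 0;
    }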
A salesman sold a comms processor to a valued customer who was looking likely to move mainframe suppliers.
The specified functionality would require a customised enhancement of the OS. That was a major challenge in the required time scales.
Then a quick calculation showed that the software wouldn't be able to address the amount of data space needed for the number of terminals. A 16-bit data addressing mode allowed twos-complement negative offsets - and generated an exception if a positive offset went over 32K.
I suggested that we inhibit the exception logic, as a negative offset was never used in the OS. A quick test in the factory showed it worked, with a simple track cut that could be bridged by a switch.
It was agreed the customer would get modified cpu boards. Two of us were drafted in from our normal roles - and we spent several months on site enhancing the software for their custom requirements. It all went live first time - ahead of the customer's deadline dictated by a building move.
Our company gave me £50 for "a good suggestion".
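The arithmetic behind that track cut: a 16-bit twos-complement offset tops out at +32767, but the same bits read as unsigned reach 65535 - and since the OS never used negative offsets, the top bit was going spare. In C terms (an illustration, not the actual hardware logic):

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        uint16_t raw = 0x9000;   /* an offset with the top bit set */

        /* With the exception logic active: offsets are signed, so
           anything past 0x7FFF (32K) either traps or goes negative. */
        int16_t as_signed = (int16_t)raw;

        /* With the exception inhibited (the track cut + switch): the OS
           never used negative offsets, so the full 64K is usable. */
        uint16_t as_unsigned = raw;

        printf("signed: %d, unsigned: %u\n", as_signed, as_unsigned);
        /* prints: signed: -28672, unsigned: 36864 */
        return 0;
    }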
Prototype EEC System 4-70 ran its variant of the standard disk OS ok - but not time-sharing. Came the day when the system programmers switched on time-sharing - and user programs started crashing.
Time sharing implemented the store protection settings. The OS had one tag - and up to fourteen running user programs each had a unique tag. The final tag was for "unprotected".
The problem was that the 4-70 was a microcoded cpu - and some of its decimal operations had a few words of scratch space in the bottom block of memory. For very good reasons this block of memory was tagged as "system". Potential Catch-22 resolved by setting that block of memory to "unprotected" to allow user state access.
It was declared a safe thing to do - as user programs theoretically couldn't affect anything by transiently writing anywhere in that memory block.
Ah, yes. Honeywell Multics.
Combine an OS quirk that identically named batch files had priority over system commands with terminals that allowed primitive "programming" via ASCII codes. Result = hacking that was almost too easy.
Make a batch file called DIR. Whenever anyone (hopefully with higher access) tried to list the files in your home directory, the batch file would go to the user's own directory, give you full access, send you an email saying you now had access to that directory, then display a text file that looked like a boring directory listing. Normally each of these commands would show on the screen, but via the terminal hacks you could temporarily disable the display, run the commands, then turn it back to normal.
in my PC running Windows 95...
But I only had IDE cards which sat in a specific range of memory, and you only got a single choice of location within that using a dip switch on the card... so a maximum of two cards per PC, with a master and slave* on each - so 4 hard drives only.
Quick bit of jiggery-pokery with wrapping wire, a spot-face cutter, some through pins and a soldering iron, and I was able to fool the card into thinking it was on an address it expected when in reality it wasn't. Unfortunately Windows 95 only expected those kinds of devices on specific address boundaries, so I had to make a few registry changes as well in order to get the driver working half-way through the next address block instead of at the start...
But I did it! And it worked. And it only took a day of reading/planning and a few hours of messing around.
* Let's get HISTORICAL not HYSTERICAL - this is what they were properly called in those days - it said so on the cables and the devices!
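What the wire and spot-face cutter were defeating, in essence: the card compares a few high address lines against a hard-wired value, so re-routing a line makes a different bus address look like the one it expects. A C model of the decode - the 0x1F0/0x170 bases are the conventional ISA ones, but the choice of swapped lines and the new base are purely illustrative:

    #include <stdio.h>
    #include <stdint.h>
    #include <stdbool.h>

    /* Conventional ISA IDE decode: primary at 0x1F0-0x1F7, secondary at
       0x170-0x177 - one dip switch, so two cards per machine at most. */
    static bool card_selected(uint16_t io_addr, uint16_t base)
    {
        return (io_addr & 0xFFF8) == base;   /* compare high bits only */
    }

    /* The rewiring, modelled: swap two address lines (A5 and A7 here,
       chosen purely for illustration) on their way into the comparator. */
    static uint16_t rewired(uint16_t io_addr)
    {
        uint16_t b5 = (io_addr >> 5) & 1, b7 = (io_addr >> 7) & 1;
        io_addr &= (uint16_t)~0x00A0;              /* clear bits 5 and 7 */
        return io_addr | (uint16_t)(b5 << 7) | (uint16_t)(b7 << 5);
    }

    int main(void)
    {
        /* The card's comparator is set for 0x170, but thanks to the
           swapped lines it now answers at 0x1D0 - a hypothetical base
           no other card claims. */
        printf("%d\n", card_selected(rewired(0x1D0), 0x170));  /* prints 1 */
        return 0;
    }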
Later than most of these stories: I needed to acquire some image data in real time (photon by photon, recording the positions) from some lab equipment in the early 80s. This needed a big flat memory space on an 80386 PC, and at the time there were few ways of doing it. I ended up using OS/2 1.0. That was pretty much "half an operating system" at the time, as important stuff like the GUI was not yet supplied, but that didn't matter for me - I just wrote my own simple GUI. The problem was that I had to swap bits around in the coordinate information provided over incoming ribbon cables, and I needed to do this at about 200k events per second. There wasn't enough oomph in a 386/25, so I just used a box of wire-wrap.
That did the job fine, and I got on with the research which was the point of the exercise. The problem was that OS/2 1.0 had been designed to be compatible with an 80286 processor (ask your grandparents), and the way some of the memory handling worked changed when they moved to OS/2 1.1 - at least that's how I remember it. Anyway, the wire-wrap was not compatible with OS/2 1.1. Didn't matter to me - I just stuck with 1.0 and carried on taking readings. But when I left I occasionally got calls from new students proposing upgrading the operating system to something modern. I'd point them at the big box of grey spaghetti that would need to be rewired - and wait for the next generation of students to make the same call.
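For reference, the software alternative that couldn't keep up: permuting coordinate bits per event. Even a straightforward loop is a dozen-plus operations per 16-bit word, and at 200k events/second on a 386/25 that budget evaporates - the wire-wrap did the same permutation for free. A sketch in C; the actual bit mapping is long lost, so this one is invented:

    #include <stdint.h>

    /* Invented permutation: which input bit drives each output bit. The
       real mapping depended on how the coordinate lines arrived on the
       ribbon cables. */
    static const uint8_t perm[16] = { 3, 1, 4, 0, 9, 2, 6, 5,
                                      8, 7, 12, 10, 15, 11, 14, 13 };

    /* Per-event software swap: ~16 shift/mask/or steps per 16-bit word.
       At 200,000 events/second that is millions of operations per second
       before anything is stored - too much for a 386/25, and exactly
       what the box of wire-wrap did in zero CPU cycles. */
    uint16_t swap_bits(uint16_t in)
    {
        uint16_t out = 0;
        for (int i = 0; i < 16; i++)
            out |= (uint16_t)(((in >> perm[i]) & 1u) << i);
        return out;
    }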