Completely and utterly bonkers
But I hope he manages to complete it and find a home for it because it'll be a wonderful achievement.
I'm assuming it'll go abroad because that's where most great British technology ends up...
A bloke in Cambridge, UK, is building a computer processor using 14,000 individual transistors and 3,500 LEDs – all by hand, piece by piece. James Newman said his Mega Processor relies almost entirely on the hand-soldered components, and will ultimately demonstrate how data travels through and is processed in a simple CPU core …
Doesn't have to go abroad, bring it to the Museum of Computing in Swindon if they have space (it's almost abroad, I guess, from a Cambridge perspective). It may not be truly a museum piece yet, but it's undeniably a brilliant educational tool.
We have good beer, too.
If they'll take it in at the Cambridge museum he'll be able to throw a couple of spools of solder into his bike basket and cycle down to finish (or, more likely, mend) it.
If you haven't, go and look at the web site for the project. It's fascinating: board construction, component layout, testing, managing connections. Amazing breadth of skills the man has.
From his site:
"I spent a bit of time trying to work out how to do the 7-segment display using discrete transistors but the answer is vast. Really, really big. It would have near doubled the size of the thing and the circuitry for the display would have obscured the circuitry for the processor which would have undermined what I was trying to do. As its only for debug and not proper function I went for chips. This is definitely NOT cheating, it is just for debug. It is irritating though."
And
"The RAM's turning out to be quite sizable. A square inch per bit ! I'm hoping to do 64 bytes, but that translates to the best part of two square metres."
Really, I had to laugh. Sizeable? Not half it isn't.
>There are several examples of those monstrosities around.
None of which are actually functioning today. The Computer History Museum in California has a number of historic tube computers which would be a nightmare to restore to working order. Most of them used magnetic drum memories, guaranteed to be nonfunctional and almost impossible to repair. The only tube computer that is functional today is the Colossus replica at Bletchley Park, and it's not even a "general purpose" computing device.
"None of which are actually functioning today"
AFAIK the replica of the Manchester 'Baby' is around and ran in 1998. I was taught physics by a chap who worked on the original and had a photo of himself, stripped to the waist, working in a basement surrounded by racking.
The RAM's turning out to be quite sizable. A square inch per bit !
Sounds like he's using static RAM, maybe he should have tried a dynamic RAM design? With decent capacitor sizes he wouldn't have too fast a refresh cycle...
I suppose core memory would be better still, if he's into knitting!
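Back-of-envelope for the dynamic idea, in Python (all the numbers here are invented for illustration, nothing from the project):

# A DRAM cell's stored charge leaks away with time constant tau = R * C,
# so refresh has to come round comfortably before tau.
C = 1e-6     # suppose a 1 uF capacitor per bit - huge by IC standards
R = 10e6     # assume ~10 Mohm of effective leakage resistance
tau = R * C  # seconds for the stored voltage to fall to ~37%
print(f"time constant: {tau:.0f} s")  # 10 s - a once-a-second refresh is leisurely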
Those wires act as huge capacitors which need to charge and discharge on each cycle to allow the signal to stabilise.
Not huge.
The general rule for a signal to settle on a plain old wire is something like six times the time light would take to traverse the wire. (Or two to-and-fro bounces at 0.7c.)
I've often wondered what is the optimum design for a discrete-transistor computer. Minimise the transistor count, build as small as possible, and clock as fast as possible, or go for wider buses and more transistors clocking more slowly? (Of course in the early days they went for small component counts, because transistors - germanium alloy junction ones - were significantly expensive, and suffered thermal runaway at fairly low temperatures so cooling really mattered.)
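Putting rough numbers on that rule for this machine (a quick Python sketch; the 14m length is from the article, the 0.7c and six-transit settling rule are from the posts above):

# Signal timing for a 14 m wire run, assuming propagation at 0.7c and
# "settled after ~6 end-to-end transits".
C_LIGHT = 299_792_458        # speed of light, m/s
length_m = 14.0              # worst-case wire run
transit_ns = length_m / (0.7 * C_LIGHT) * 1e9
settle_ns = 6 * transit_ns
print(f"one-way transit: {transit_ns:.1f} ns")   # ~66.7 ns
print(f"settling time:   {settle_ns:.1f} ns")    # ~400 ns
print(f"max clock: {1e3 / settle_ns:.1f} MHz")   # ~2.5 MHz

Which squares nicely with the "under 15MHz, 1MHz do-able" estimate below.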
"It's only 14m long. Assuming 0.7c because of the dielectric of the wires that would be 66.6ns propagation delay end-end. So you could run it under 15MHz, say 1MHz should be do-able"
Yeaaahhhh... I only know a tiny little bit about RF, so I might be talking complete rubbish here, but wouldn't there be radiation issues? I seem to remember that one of the constraints on the original IBM PC (4.77 MHz) was that pushing the clock any higher led to disproportionately high energy losses to radiation (and of course interference with your transistor radio!), and that on a printed circuit board of much less than 2 sq ft. I imagine that a 14m long assembly with lots of interconnecting cable and hand-soldered assemblies might have a slightly worse problem with that.
Take a look at a photo of an old enough computer that the CPU consisted of a large number of logic modules connected with a wire-wrapped backplane (for example, Google "PDP-8 backplane" images). You'll soon deduce that the interference problem is not insurmountable. It was not negligible, though!
The routing of wires within the backplane was a black art. Some were artificially lengthened so as to introduce deliberate signal delays. Others took non-parallel routes from A to B to reduce crosstalk - interference is by far the greatest between closely parallel wires. The general term was "random-wired". It was most definitely not a good idea for structure in the circuit schematics to be explicit in the physical arrangement of wires in the backplane.
To some degree it's not just the length of the wires, but the differences in length wrt the frequency being used.
I sure hope he's keeping the length of the wires (as appropriate) the same to each (and within) similar functional banks of transistors, otherwise the differing propagation delays will be madness to try to debug. This is normally done at chip layout and PCB layout. Clocking in incorrect bits (on some lines) and not others would surely lead to a long stay at a mental institution.
Slowing down the frequency until it worked might be practical, but with a little attention to the lengths, he might find that he could run at a much higher frequency. Overclock - baby....
kHz or KHz - both get used. After all, little "m" means "milli" and big "M" means "mega"; kilo = 1000 is a multiplier too, so some write it as K. But from Wikipedia:
* The engineer's society, IEEE, and most other sources prefer "kHz" to "KHz." This apparently makes it less likely that users will confuse "kilo" (decimal 1,000) with the computer "K" (1,024).
Absolutely,
I'm no fan of "coding" in schools.
Which seems to be pretty pointless for most kids.
But if we're showing the kids what is actually under the bonnet and then letting them try to make it do something there's a chance that some ( the right ones ) will be inspired to really get involved.
I seem to remember a Ladybird book (I think) with a computer made from wood and OC71 germanium transistors, and some dairy/milk co. series* of how-to booklets that did the same sort of thing.
Where are these sort of things now...?
*can't find them on Goog - they were thin, square and white, with blue titles etc.
Project Books published by the Dairy Industry Council.
Awesome! I cannot approve more of this project - what a geek, utterly awesome!
Cambridge Uni: he mentions space is a problem for the final CPU, so please promise this guy a room for a few months to demo the finished CPU - knowing this will surely aid his motivation.
Funding: How much is this all costing? Where do I donate some transistors?
A technology that existed in Babbage's time, but of which Babbage was unaware, is hydraulic logic. It's possible to create a bistable out of fluid (air) being pumped through an appropriately shaped cavity, and to switch it between its two stable states using pipework connected to the output of others. Logic gates are also feasible.
Anyone fancy building the world's first (?) hydraulic programmable computer?
Or even a simulation thereof, just to hear what it might sound like while it is computing.
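Whatever the medium - water, air or electrons - a bistable is just two cross-coupled inverting elements. The bare logic in Python (purely illustrative; nothing fluid-specific here):

# An SR latch as two cross-coupled NOR "gates" - the same topology works
# whether the gates are transistors or shaped cavities.
def nor(a, b):
    return int(not (a or b))

def sr_latch(s, r, q):
    # Iterate the cross-coupled NORs until the state settles.
    for _ in range(4):
        q_bar = nor(s, q)
        q = nor(r, q_bar)
    return q

q = 0
q = sr_latch(1, 0, q); print(q)  # set   -> 1
q = sr_latch(0, 0, q); print(q)  # hold  -> 1 (this is the "memory")
q = sr_latch(0, 1, q); print(q)  # reset -> 0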
re: hydraulic computers, there was MONIAC but I'm not sure if it counts as hydraulic (involving water pressure in some useful way) or a computer in the usual sense.
What prompted me to reply, though, was that I just recently came across the idea of a hydraulic ram pump. Sounds like it would make an excellent component in this speculative machine.
Now you've got me thinking about powering stuff with water in Minecraft :(
Great suggestion. With a bit of care, he may even be able to put more than one transistor on the same piece of silicon, and save on all that tedious wiring interconnect. Perhaps he could use some sort of photographic system so that the repeating units don't have to be drawn by hand.
Gosh. Well done that man! Although if I'd had the knowhow and wherewithal, I'd have gone for replicating a 6502 - always preferred the instruction set for them over the 8080. Fewer transistors to connect up, too, apparently (I'm astonished the 6502 used so few - I honestly thought it was more like an order of magnitude more than that!).
To all the downvoters of my comment above, do you really need to be told that a transistor is a digital switch? on an El Reg Forum?
That's entirely possible. There's lots of people in tech who don't understand the basic tech. For most, the operational theory behind the processor is the FM* Theory of Operation.
* aka: Freakin' Magic
To all the downvoters of my comment above, do you really need to be told that a transistor is a digital switch?
No, I don't need to be told that. You think that a transistor is a digital switch. I know that it isn't. You can wire them into an arrangement where their gain is high enough that it makes no difference and the resulting system behaviour is very digital, but that is in circuit. In isolation there are no ifs or buts, a transistor is an analog device.
I did look into doing something similar to this a few years ago and it got much further than a thumbnail sketch. The goals were slightly different - this was a 38.4kHz 12-bitter with a few niceties (e.g. hardware multiply and divide) and a few oddities (hardware-assisted garbage collection). It was a lot simpler than this, estimated at 3,500 transistors and perhaps 3ft × 2ft × 18in, but no integrated circuits anywhere - not even memory. Most of that reduction in complexity was down to the use of threshold logic gates, which are a slightly quirky semi-analog system - digital inputs, digital outputs, but internally the processing is very analog in nature, which allows for a much richer set of functions than pure Boolean logic. This approach was common for research systems in the 60s to reduce the complexity of the systems by exploiting that very analog nature of transistors.
Utmost respect for the guy though because I know precisely what is involved. My project didn't get further than design, a few test assemblies, and a software emulator and assembler before the transistor I had based it around (BF199) went out of production. They were less than 3p each in quantity and when I saw the cheapest through hole alternative was £1.50 the entire thing went on the back burner.
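In case threshold logic is unfamiliar, the gate simply fires when a weighted sum of its inputs reaches a threshold. A rough Python sketch (my own illustration with made-up weights, not the poster's actual circuits):

# A threshold gate: output 1 when the weighted input sum reaches the
# threshold. One such gate can do work that needs several Boolean gates.
def threshold_gate(inputs, weights, threshold):
    return int(sum(i * w for i, w in zip(inputs, weights)) >= threshold)

# A 3-input majority gate (weights 1,1,1, threshold 2) is exactly the
# carry-out of a full adder - which in pure Boolean logic takes three
# ANDs and two ORs.
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            print(a, b, cin, "-> carry", threshold_gate((a, b, cin), (1, 1, 1), 2))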
"To all the downvoters of my comment above, do you really need to be told that a transistor is a digital switch? on an El Reg Forum?
I didn't downvote you (I don't use the up/down vote thingies, because I am over 12 years old), but my memory of college fiddling with transistors, and doing calculations about how to bias the base so that it operated in the "knee" part of the accompanying datasheet graph, is that they worked in a very analogue way as they were transitioning. Hence the analogue output of a transistor amplifier. They can work as a switch, but there is more to a transistor than the behaviour of a relay.
Of course, this makes no difference to the way that transistors are being used for this project.
After I left college and fell into the world of hardware development, I found that using transistors in projects was done using shortcuts and I didn't have to think about all that NPN PNP shit ever again.
Off the top of my head, did silicon bias at 0.7V, germanium at 0.5V? It's been 25 years since I did hardware; little has been retained.
Individual transistors and LEDs - sure - but hand made wiring looms? I'd at least have gone for some PCBs for the backplane (with hand made looms linking the PCBs together). Density of components looks low enough that even one sided PCBs would work with small hand soldered bridges to jump wires.
Fair play indeed - more patience than I have ...
"How computers work, as opposed to how to use them."
OK, so are we taking an historical/hardware approach or building up from logic (and/or/not and Boolean algebra, leading to shift registers, half adders &c) or from the conceptual side (Von Neumann machine/Turing computability which rests on the 'diagonal proof' and its generalisation) or through programming (variables, assignment, loops, subroutines/functions then into more abstract areas) or all of those?
Could take a bit of time (and need some serious skills). Best of luck. One tiny activity I use sometimes: take an 8 by 8 grid of squares on squared paper. Draw a reasonably complex shape (each square is either black or white).
Now devise a way of sending the shape to someone else using an SMS message. Document the method for reconstructing the shape.
Now find a method that will work for a shape drawn on a 16 by 16 grid, and then a 32 by 32 grid &c.
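One possible answer, sketched in Python (the byte-per-row hex scheme is just my choice, not the only way):

# Encode an 8x8 black/white grid as 16 hex characters - easily fits an
# SMS. Each row becomes one byte, leftmost square = most significant bit.
grid = [
    "........",
    ".XX..XX.",
    ".XX..XX.",
    "........",
    "X......X",
    ".X....X.",
    "..XXXX..",
    "........",
]

def encode(g):
    return "".join(f"{int(r.replace('.', '0').replace('X', '1'), 2):02X}" for r in g)

def decode(msg, width=8):
    return [f"{int(msg[i:i + 2], 16):0{width}b}".replace("0", ".").replace("1", "X")
            for i in range(0, len(msg), 2)]

msg = encode(grid)
print(msg)                  # "0066660081423C00"
assert decode(msg) == grid  # round-trips

A 16 by 16 grid just means two bytes per row, 32 by 32 four, so the method scales.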
"OK TonyJ and anyone, what is the THIS that you want teaching?"
The underlying architecture of what drives a computer.
How the core components from a transistor upwards come together to make a gate. How gates come together to build logic. How... and so on.
In other words, it's all well and good showing kids how to use PowerPoint, but let's start by showing them how the actual computer works.
In other words, it's all well and good showing kids how to use PowerPoint, but let's start by showing them how the actual computer works.
Using animated powerpoint naturally
Mine's the one with a wire-wrap tool in the pocket (yes, I have one from the days of actually wiring up computer backplanes)
Actually yes, if you give it access to a large enough persistent data store and spend the time writing a PC emulator for it.
The speed wouldn't be much to write home about - we'd be talking about one frame every few thousand years I reckon - and there'd be no visual display on a monitor. But yes, in principle, it could.
A very cool project.
Reminds me a bit of "The Elements of Computing Systems", although the book used software emulators so you don't need so much physical space. Enough to get started on the principles of CPU design though.
(not associated in any way with this, just an interested reader that bought the book)
"Reminds me at Uni ...
I programmed a bit-slice CPU ...
And hand soldered my final year project."
Mine were much less fun.
Had to design and build a bar code reader. It did the utterly pointless thing of reading the bar code, and recreating it on a plotter - also built by me.
Had to build a memory expansion for the "D5E evaluation kit" to hold the necessary lookup tables and "driver" for the plotter.
I learned far more about bar codes than anyone would ever really wish to know. :)
Totally agree with the commenters who suggest housing this in a public institution as an educational exhibit. NMoC was my first thought but Cambridge makes more sense geographically (damn you both - put it in my local science museum so that I can go and ogle it!).
El Reg is being a tad lazy by comparing its performance to integrated circuits of yore. It would be nice to see how it stacks up on the whole continuum of computing, at least as far back as the first valve-based systems.
All hail the man in a shed (or spare bedroom) with too much time on his hands!
(And before anyone points it out I know it's von Neumann)
because I think it's a ridiculous project :-)
Just because you can, doesn't mean you should. This is the opposite of progress - deliberately doing thousands of small repetitive tasks that a machine can do much better (for almost every definition of better: smaller, faster, cheaper, more reliable, using fewer resources)...
"This is the opposite of progress [...] a machine can do much better "
Yes, it is, and a step back in time. It goes back to the roots and *wonderfully* demonstrates how computers work. I can see that being a fantastic educational tool for those who want to learn about it, before they go off and build machines which produce the next generation of Raspberry Pi.
You didn't think that this 14x2m project was going to go into mass production for you to buy and use, did you?
I haven't seen Soylent Green on any supermarket's shelves, but ramen cups are pretty efficient: just add hot water. If you want to further optimize for speed, get ramen blocks and eat them as-is. They are crunchy, reasonably flavorful, though sometimes they pierce your gums with sharp bits. There also are "SOYJOY White Chocolate & Lemon 25g (0.9oz) 6pcs Gluten-Free High Protein Bar Otsuka Pharmaceutical" available on Amazon.
I shudder to think he'd be running it at 5V… that's a 100A busbar that'd be needed to power the thing!
I've seen what power leads carrying 400A look like, feeding a 20kW 3-phase BLDC motor. We'd lock the rotor to measure torque, in doing so, we'd watch two wires get attracted to each other, and two repel each other according to magnetism (F = (µ₀I₁I₂l)/(2πr)).
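Putting numbers on both (Python; the 500W is simply what 100A at 5V implies, and the wire spacing is my assumption):

import math

# Supply current from power and voltage.
power_w, volts = 500.0, 5.0
print(f"bus current: {power_w / volts:.0f} A")   # 100 A

# Force per metre between parallel wires: F/l = mu0*I1*I2 / (2*pi*r).
mu0 = 4 * math.pi * 1e-7   # permeability of free space, H/m
i1 = i2 = 400.0            # the locked-rotor motor-lead example
r = 0.01                   # assume 1 cm between the leads
print(f"force between leads: {mu0 * i1 * i2 / (2 * math.pi * r):.1f} N/m")  # ~3.2 N/m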
Doing an 8-bit design would have been less than half the work.
That is something of a mixed bag. It reduces complexity in terms of components and wiring but significantly increases design effort. Large parts of a 16-bitter are simply the equivalent 8-bit circuit replicated, but going in the other direction introduces some significant extra issues. Presumably you would want to address more than 256 bytes of memory, so that makes an address (at least) two words long. Similarly, it's somewhere between difficult and impossible to encode a complete, useful instruction set in eight bits, so you have multi-word instructions too. That gives you a large amount of hassle co-ordinating those half-width quantities, and you need multiple cycles to send those values around. That in turn adds complications as you co-ordinate timing in multiple-cycle instructions.
I mentioned somewhere above that 12-bitter I set about designing a few years ago. 12 bits was chosen very deliberately as the simplest option - it's the narrowest width where you can sensibly have arithmetic, addresses and instructions all the same length. All instructions were single cycle so keeping everything in sync was also made a lot simpler, even if multiple cycles would have allowed you to crank up the clock rate a little. It did limit you to a 4096 word address space but I considered that adequate for a demonstration system.
The 6502 does everything in 8-bit widths, except addresses. An opcode byte is optionally followed by a 1-byte immediate value, or a 1- or 2-byte address. I'm guessing that it has a 2-bit field in the instruction byte that causes it to load 0, 1 or 2 following bytes to a register determined by the rest of the opcode (and increment the PC). If performance is no big concern, shouldn't this operand-loading sequence be implementable with a simple state machine? Though I must admit my knowledge of CPU design comes from one mostly-forgotten university course a quarter-century ago... (the final exercise was creating a paper design, which was not even simulated, never mind built).
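That guess is easy to sketch as a state machine; purely illustrative Python below (I've invented a 2-bit length field - the real 6502's decode is nothing so tidy, as the link in the next reply explains):

# Hypothetical operand fetch: pretend bits 0-1 of each opcode say how
# many operand bytes follow (0, 1 or 2), then step through that many
# fetch states, bumping the PC each time.
def fetch_instruction(memory, pc):
    opcode = memory[pc]
    pc += 1
    operands = []
    for _ in range(opcode & 0b11):   # imagined 2-bit length field
        operands.append(memory[pc])
        pc += 1
    return opcode, operands, pc

mem = [0xA1, 0x42, 0xB2, 0x00, 0x80, 0xC0]   # made-up byte stream
pc = 0
while pc < len(mem):
    op, args, pc = fetch_instruction(mem, pc)
    print(f"opcode {op:02X}, operands {['%02X' % a for a in args]}")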
Here is an interesting piece on how the 6502 decodes instructions:
http://www.pagetable.com/?p=39
It explains how the instruction decode causes undefined opcodes to do the strange things that they do.
We managed to make one single register, but we used all of the post-grad air-piston kit in our pneumatics lab. Just for the heck of it. Ours could retain the ram memory even without power, obviously. It would have become even bigger if we had tried to do anything larger.
But hey, kudos, that was mighty impressive. I guess that's what would have happened if we didn't have single-die processors, or in a post-apocalyptic steampunk future.
You might want to read the creator's web pages, particularly this one...
http://megaprocessor.com/progress.html
where he explains the limitation.
That's nothing, when he can do it in Minecraft, I'll be impressed.
(And before I get spammed, yes I know there are Minecraft computers, but the limitations of the 'platform' do quickly make large-scale computers exponentially more difficult than actual computer construction, so they tend to be rubbish.)
I don't live nearby and don't have the time, but would love to help out if I could. I have joked for several years about buying replacement gates/pixels/CCD cells etc. for stuff at work [TV engineering] (no one laughs for some reason... I know, I know)
I wonder if you can step-by-step the clock for debugging?
"I wonder if you can step-by-step the clock for debugging?"
Apparently, yes.
As a German engineer, and therefore a rather lazy person, I have to point out that there's a way to make such a computer with _far_ fewer parts at the expense of speed.
The idea is that you build a bit-serial computer. This means that lots of parts suddenly become a lot simpler. You can still have 16-bit words, but your ALU, for example, will just process one bit at a time. Your registers become shift registers with a one-bit input and a one-bit output. All your buses are also one bit wide and clock in their values serially.
There's a book describing such a system. I think it's called "Elektronische Rechenmaschinen". I think it describes a 20-bit machine working bit-serially. Back in the early days of building computers, reducing the complexity was essential for many teams building a computer. Trading a factor of n in speed for a factor of n in complexity seemed a _really_ good idea back then. Particularly since, back then as now, computers rarely were fully utilized.
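To make it concrete, here's a rough Python model of the heart of such a machine - one full adder reused every clock tick, with a single flip-flop carrying the carry between ticks (LSB first):

# Bit-serial addition: a 16-bit add takes 16 clocks but needs only one
# adder's worth of transistors plus one carry flip-flop.
def serial_add(a_bits, b_bits):
    carry, out = 0, []
    for a, b in zip(a_bits, b_bits):                  # one bit per clock
        out.append(a ^ b ^ carry)                     # full-adder sum
        carry = (a & b) | (a & carry) | (b & carry)   # full-adder carry
    return out

def to_bits(n, width=16):    # LSB-first, like a shift register's output
    return [(n >> i) & 1 for i in range(width)]

def from_bits(bits):
    return sum(b << i for i, b in enumerate(bits))

print(from_bits(serial_add(to_bits(1234), to_bits(4321))))   # 5555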
>The idea is that you build a bit serial computer.
That is exactly how most tube/valve computers were designed. It saved enormous amounts of circuitry, although a magnetic drum memory was mandatory to hold all the data and registers. And damn were they slow.
https://en.wikipedia.org/wiki/LGP-30
There's also bit-serial parallel computing ... SIMD, with one instruction at a time broadcast to an array of one-bit processors. The ICL DAP, if there's anyone else out there who can remember that ill-fated project. I had great fun one summer learning to program it in assembly language.
Even though I am a geek, I officially don't get why anyone would spend so much time/money/space doing this when you can just go out and buy a functionally much better CPU for pennies that would fit easily into a matchbox.
Heck if he just wanted to see his processor design implemented he could/should have just programmed it into a PLD or something.
"Even though I am a geek I oficially don't get why anyone would spend so much time/money/space doing this as you can just go out and buy a functionally much better CPU for pennies that would fit easily into a matchbox."
It's quite tricky to teach visually about CPU logic design using a CPU in a matchbox. It would be like training a chef to prepare food by giving him a fiver and telling him to go and buy a big mac.
And you matchbox wouldn't get you publicity, a shining CV, unlimited admiration.
Or have I been whooshed?
I can think of no nightmare worse than hand-wiring 14000 of the TO-92 transistors pictured. They wiggle around unless you bend the leads, and bending the leads causes solder bridges. After all that, you'll find that one is in backwards and the slightly bent leads have anchored it down with such incredible strength that molten solder spatters everywhere when you pull it out.
The 6502 is not a stack-based machine. It does support a stack for push, pop and subroutine call instructions, but so does almost every other microprocessor. I think the 6502 can be best described as an accumulator machine, where all arithmetic and logical instructions require one of the operands to be in the accumulator (A) register. (The first CPU I ever tried programming in machine language, on the Oric 1, which is why I go on about it...).
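To unpack "accumulator machine" for anyone who hasn't met the term: every ALU operation implicitly reads and writes the one accumulator. A toy Python sketch (an invented three-op machine, not real 6502 semantics - no flags, no carry-in):

# Everything funnels through the accumulator, like the 6502's LDA/ADC/STA.
def run(program, memory):
    acc = 0
    for op, addr in program:
        if op == "LDA":
            acc = memory[addr]                  # load accumulator
        elif op == "ADC":
            acc = (acc + memory[addr]) & 0xFF   # add memory to accumulator
        elif op == "STA":
            memory[addr] = acc                  # store accumulator
    return memory

mem = {0x10: 7, 0x11: 35, 0x12: 0}
run([("LDA", 0x10), ("ADC", 0x11), ("STA", 0x12)], mem)
print(mem[0x12])   # 42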
If anyone is tempted to do something similar, a nice compromise might be to start off looking at the AMD 2900 family of chips: https://en.wikipedia.org/wiki/AMD_Am2900 . This stuff starts at the ALU level. But being interested in this stuff, does anyone know if he went for a RISC setup with microcode?
--That is an amazing undertaking, and a smart guy to build a processor from scratch.
--He obviously has a lot more free time and ambition than I have.
--Troubleshooting shouldn't be that bad as long as he has a diagram. He obviously understands the principles of operation, so he would just need to put a scope on the correct test point and make sure signals are present. The behavior will tip you off as to what's wrong. Many years ago when I took electronics, we had a donated DEC PDP 11/70. Our instructor would sabotage it, and armed with blueprints and a scope, we'd have to find the faulty component or failed wiring.
--Looking at the MIPS on the processor comparison, it's obvious the MOS 6502 was the greatest bang for the buck in its day. Half the transistors and clock speed and still as fast as its competitors.
That is all :)
All the logic that does computation is done transistor by transistor.
The only place that LSI has been used is on the 7-segment display boards - apparently it was too complex decoding hex (octal?) digits to a 7-segment display in transistors for something that is there 'for debugging purposes'. To quote Newman...
"I spent a bit of time trying to work out how to do the 7-segment display using discrete transistors but the answer is vast. Really, really big. It would have near doubled the size of the thing and the circuitry for the display would have obscured the circuitry for the processor which would have undermined what I was trying to do. As its only for debug and not proper function I went for chips. This is definitely NOT cheating, it is just for debug. It is irritating though."
To me it's an amazing achievement of education over rationality. Far as I'm concerned, if "James" doesn't at LEAST get awarded a Masters (even honorary) over this, I'll be annoyed with the universities. Not sure if it's worthy of a doctorate, although perhaps he deserves one in education. That said, many people have received doctorates for a lot less. Even in computer science.
Assuming the poor fucker doesn't already have one, which is why he's doing this in the first place.