Happy 40th birthday, Intel 4004!

On November 15, 1971, 40 years ago this Tuesday, an advertisement appeared in Electronic News for a new kind of chip – one that could perform different operations by obeying instructions given to it. That first microprocessor was the Intel 4004, a 4-bit chip developed in 1970 by Intel engineers Federico Faggin, Ted Hoff, and …


This topic is closed for new posts.
  1. Anonymous Coward
    Anonymous Coward

    18-Wheelers? Eldorados?

    What's that in brontosauri?

    1. Mike Richards Silver badge

      Shocking failure to abide by Reg guidelines

      But that's the Americans for you, never using the same standard sauropod-derived measurements as the rest of us.

  2. GrahamT


    Re: "The 8080 wasn't alone, though – there was plenty of competition in the earlier days, such as the Zilog Z80, Motorola 6800, and MOS Technology 6501, which Pawlowski told us were all essentially equal competitors at the time."

    The Z80 took the 8080 architecture and expanded it with more 16-bit registers, such as the IX and IY index registers; it came after those other processors, so wasn't really a competitor "at the time", and soon took Intel's market for general-purpose microprocessors. For us Brits, its most obvious manifestation was in the Sinclair ZX80 and ZX81, but it was also used in many embedded systems. I loved programming those things.

    The first microprocessor I worked with was the 6800, which I thought had a better architecture than the 8080, but CP/M ran on 8080 (and Z80) and was too dominant by the time the 6800 came along.

    I never wrote software in assembler for the 6502, as I didn't like the architecture at all - but that didn't stop me loving my BBC micro.

    1. This post has been deleted by its author

      1. Andus McCoatover


        I didn't realise it ran _THAT_ slow! Jeez.

    2. Giles Jones Gold badge

      More pedantry

      The 6502 (originally 6501) was designed by the same people who did the 6800 and has a lot in common with it. In fact, the original version (6501) was pin-compatible with the 6800 until Motorola sued and made them redesign it.

      So I find it hard to see why you would hate two very similar CPUs.

    3. ThomH Silver badge

      The 6502's not so bad

      You've just got to think of it as a load/store architecture, with the zero page acting like the register bank in other machines and accesses everywhere else being expensive. You end up doing most of your business logic with the two-or-three cycle instructions acting on the zero page and occasionally wander into the elaborate four-upward cycle instructions to fetch tabular data. Oh, and I guess you have to get used to the slightly weird one's complement subtraction but it ends up just being a carry inversion since all arithmetic is with carry.

      I prefer the Z80 but I think that's just because I know them only through the home computers and the popular 6502s always had to confuse the issue on video circuitry, the 6502's relatively poor random memory access speeds seemingly making people want to back away from just giving it a framebuffer in a sensible order.

      1. A J Stiles

        Ones complement? Not.

        The 6502 uses twos complement arithmetic. You have to set the carry with SEC before you begin a subtraction, is all, because there is no SCS (set carry and subtract) instruction. Afterwards, the carry will be set unless we had to borrow one from the next byte.
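        A minimal sketch of that borrow-as-inverted-carry behaviour, in hypothetical Python rather than 6502 code (the helper names `sbc` and `sub16` are my own, not anything from the thread):

```python
# Model of the 6502's SBC semantics as described above: subtraction is
# A - M - (1 - C), so "borrow" is simply the inverted carry flag.

def sbc(a, m, carry):
    """8-bit SBC: computes A - M - (1 - C).
    Returns (result, carry_out); carry_out stays set unless a borrow occurred."""
    r = a - m - (1 - carry)
    return r & 0xFF, 0 if r < 0 else 1

def sub16(x, y):
    """16-bit subtraction, units (low byte) first, as the 6502 stores numbers."""
    carry = 1  # SEC before the first SBC
    lo, carry = sbc(x & 0xFF, y & 0xFF, carry)
    hi, carry = sbc(x >> 8, y >> 8, carry)
    return (hi << 8) | lo, carry
```

        Because the carry chains through each byte, the only setup needed for an arbitrarily long subtraction is the single SEC at the start.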

        Also, the 6502 writes multiple-byte numbers units-first; but the 6800 is units-last.

        As to the "out of order framebuffer" thing, I always thought this was just a consequence of using the display generation process (which obviously must read every byte of the framebuffer memory) to perform DRAM refreshing. (The Z80 can do its own refreshing, but only up to 16K bytes as the R register is only 7 bits.)

        1. david 63

          I loved the 6502 instruction set

          It was much cleaner than the macro-ised multiple cycle 8080. I used to write in it for the bbc micro, oric, atmos and dragon assembling by hand (writing assembler and then converting to machine code by looking up the instructions in the reference manuals.

          Aye, ya tell the yoong kids today...and they woont believe ya ;)

  3. Herby

    The original clock speed target on the 4004...

    Was supposed to be twice as fast. It would then be comparable to an IBM 1620 which Ted Hoff was VERY familiar with. It turns out that the yield on the chips if they were to be specified at the higher speed would be very poor. Yes, the 4004 was compared to an IBM 1620, Business Week in the day had a picture of one that was compared. I owned that one (and still do if I can find it).

    Interesting days of microprocessors in the "beginning". Much has changed in 40 years. For example, before they had quartz windows on EPROMS, you needed to have them X-rayed to erase them. Now you just ask and they (flash) forget.

    1. This post has been deleted by its author

      1. Andus McCoatover

        "..before they had quartz windows on EPROMS, you needed to have them X-rayed to erase them "

        That was before the 2508, which needed (iirc) +12, -5 and +5 to function.

        I reckon I should stick core-rope on my machine. Slow me down a bit.

  4. Dominic Connor, Quant Headhunter

    Was it made of wood ?

    I had forgotten how primitive we were in the 1970s and that picture reminds me of the vital role of stripped pine in the fabrications of chips.

    1. Francis Boyle Silver badge

      Everything was wood veneer – why should processors be any different?

    2. heyrick Silver badge

      Wood veneer

      These days we call it "steampunk". Back then, it was normal.

      Ahhh... a deep varnished wood-finish processor with delicate brass corners. That would be something worth showing off.

    3. Michael Wojcik Silver badge

      The hard part was shoveling the coal into that little tiny oven.

  5. Eponymous Cowherd

    Pure quality

    Oak veneered processors.....

    They don't make 'em like they used to, do they?

  6. Anonymous Coward
    Anonymous Coward

    Memory I/O

    I like the way the DDR3 drivers take up almost as much area as one of the cores in the latest part. They are practically an FPGA in themselves.

    1. Turtle_Fan

      While I don't claim to be an expert, I think the uniform vastness you refer to is cache, and has not much to do with "driving the DDR3" memory. It's true, though, that these days cache is the most transistor-intensive area of the CPU and tends to be a major determinant of final price.

      (If I'm wrong can someone point out the correct interpretation?)

  7. ChrisC

    "...the chip itself wasn't all that impressive. It ran at 740KHz, had around 2,300 transistors that communicated with their surroundings through a grand total of 16 pins..."

    Sounds not entirely unlike some of the embedded processors some of us still use these days...

  8. Mondo the Magnificent
    Thumb Up


    Thanks El reg for what could either be a trip down memory lane for us old timers, or an educational article for the younger generation!

    A lot of history covered in this article. I was fortunate [or unfortunate] enough to have worked at an Intel distributor in the '90s. We had massive posters of the 286, 386 and 486[DX] cores in our tech department and it was amazing how much interest these would attract.

    Once again, thanks for an informative, interesting and somewhat nostalgic article.. Two Thumbs Up!

  9. Prag Fest
    Thumb Up

    Nice one

    Rik, been really enjoying your recent articles, another good one, cheers.

  10. Cazzo Enorme

    A lot of Intel bias going on there. For starters, the Pentium was not the first "superscalar" processor, as the technique had been implemented as far back as the 1960s by Seymour Cray. As for the poxy 8088 and its offspring, they're still hindering advances in programming by making pretty much everyone cater for the brain-damaged x86 instruction set. If only IBM had chosen a chip with a decent instruction set (the Motorola 68k, for instance), then we may have seen advances in instruction set design going hand in hand with advances in manufacturing processes. Even Intel have acknowledged the problems of the x86 architecture – by creating alternative processors like the i960, and then putting a RISC core behind a complex decoder for more recent x86 implementations (we'll just forget about Itanium, as that just proves Intel can still fuck things up on a major scale).

    1. E_Nigma

      Now thank God that that was not at all biased.

    2. Michael Wojcik Silver badge

      "first superscalar machine"

      If you actually read the article before you post your rant, you'll find that when Pawlowski describes the Pentium as "the first superscalar machine", he clearly means it was the first superscalar x86. On the very next page of the story he says they were "playing catch-up" and knew that they could have started on a superscalar version earlier.

      I expect Pawlowski is well aware that the Pentium wasn't even Intel's first superscalar chip - that was the i960, from the late 1980s.

  11. Yag

    "That first microprocessor was the Intel 4004"

    Nope, it was the first commercially available microprocessor...

    The very first one was the MP944, but it was a bit... restricted :)

  12. Anonymous Coward
    Anonymous Coward


    Brontosauri? Surely everyone knows full well that the SI unit of comparative size is double-decker buses.

    1. Anonymous Coward
      Anonymous Coward

      Or "football pitch" and "Wales" for area, depending on scale.

      Heard a Rhod Gilbert rant that basically went along the lines of:

      "I know what newsreaders really mean when they state the scale of disasters in terms of "Wales". They wish it was Wales, and each time are muttering under their breath "but NOT Wales". I know your intentions. Ah, but the joke's on you! Now Wales is used as a scale, it cannot be obliterated or you wouldn't have anything to measure disasters by!"

  13. Anonymous Coward
    Anonymous Coward

    Succeeded despit etechnology not because of it

    The story of teh scusess of intel microprocesors is that commercial and not technical factors dominate.

    The 8086 was very much inferior to the 68K and the 16032 it was probably on a par with the Z8000. I rember Intel trying to sell to me at that time and they always emphasised price, the agreement with AMD that gave guarantee of supply and assurance on pricing, and support. They never tried to sell on performance or technical aspects because it was well behind Motorola.

    The PC then came out and things changed very rapidly. Intel broke the AMD arrangement and the price of the first non-agreement part the 80287 sky rocketed. Technically intel parts were still very much second best but they sold fantastic numbers o fparts. The 80286 retained the awkward segmented architecture extended withprotected mode performance was still very poor. The 386 finally had a sensible memory architecture but still had the nasty special purpose registers and complicate dinstruction set and performacnce was still very poor compard to other micros. It was probably not until the pentium that Intel gained parity with other microprocessors.

    None of these technical things mattered, one design decision by IBM made Intel the dominant microprocessor company with massive reources despite not because of their technical design.

    1. Turtle_Fan

      Pedantry at its best....

      You really don't know when to press space, do you?

  14. David Mery

    Ada + GC + 32-bit ...

    Remembering the iAPX432.

  15. Kevin Fairhurst

    I have very vague memories from uni where we were told that the design of the 4004 was actually carried out by an intern / gap year student, and design flaws were carried through a number of iterations of follow-up chipsets to retain backwards compatibility. This story was probably apocryphal - it's nearly 20 years since I heard it so I cannot remember any more details! Anyone else heard anything about this?

  16. Number6

    Eight Bits

    I miss the simple days of the 8080/Z80/6502/6800/6809 where the layouts weren't critical and the instruction sets were easy to use at the machine level. Every clock cycle and every byte counted back then, with memory limited by cost and the address bus.

    Seeing a Z80 emulator run on a 686-class machine or better and claiming to be the equivalent of a Z80 with some fantastic clock speed does bring home the huge performance gains.

    Meanwhile, I'll stick to programming my embedded 8051, another processor that dates back to simpler times.

    1. Medium Dave

      Ahh, those were the days...

      ...when bytes were real bytes, motherboards could be fixed with a soldering iron, "intellectual property" meant you'd paid off your Encyclopaedia Britannica, and 'programming' meant hand-coding raw MC. Maybe assembler if hung over.

      And yes, counting every damn clock cycle.

      God, I feel old.... <sniff.>

      1. Anonymous Coward 15

        And when I were a lad...

        <insert Four Yorkshiremen sketch here>

  17. introiboad

    Correction needed?

    On page 8, I read: "Nehalem was a 45nm part, and a follow-on to the first 45nm parts – code-named Penryn – which introduced the second of Bohr's process improvements"

    But I in fact believe that Penryn preceded Nehalem, since Penryn was based on the Core architecture, the one immediately before Nehalem.

    1. Lennart Sorensen

      Nope, you got it right.

      Nehalem was a follow-on to Penryn. That is what it said. That means Penryn came first. Where is the problem?

      All it says is that Penryn was the first 45nm part and Nehalem was a follow-up 45nm part.

      So your objection seems to be agreeing with what it said.

      1. introiboad

        Of course, I misread the original text even after copying it. Thanks for the clarification.

  18. Anonymous Coward
    Anonymous Coward


    And now the crippled x86 design looks doomed by those sitting up and taking notice of ARM, especially in the mobile/laptop/low-power-server markets.

    They are like MS, they got their foot in the door because of a lazy decision Big Blue made one day, and have been laughing all the way to the bank ever since.

    I really didn't rate them at all until the 80386. Fond memories, I must admit; I felt that was the milestone when PCs became truly useful (or 'fun', in terms of gaming!)

    1. Andrew 59

      Correct me if I'm wrong, Sir Wiggum (or anyone else for that matter), but x86 is also known as CISC, or complex instruction set computing, whereas ARM, PowerPC etc. are RISC, or reduced instruction set computing. And Apple not so long ago jumped from PPC to x86 for their processors... Why do you describe x86 as crippled?

      We really need a ? icon!

      1. Michael Wojcik Silver badge


        The RISC/CISC debate is essentially irrelevant for general-purpose CPUs these days. The major CISC architectures long ago went to decoding CISC instructions into micro-instructions that are processed by (superscalar) RISCy cores. Meanwhile, supposedly RISC architectures got steadily more complex, starting with IBM's RIOS (the first POWER implementation).

        And CISC/RISC was never a matter of being "capable" or "crippled". The CISC/RISC distinction was invented when "RISC" was coined to describe CPUs that deliberately restricted their instruction sets to those instructions that could be done in a single clock cycle. That followed from the observation that compilers rarely used fancy multi-cycle instructions in their generated code anyway. CPUs like VAX provided all sorts of nifty operations for the benefit of assembly programmers, but when most software was being written in HLLs anyway, it made more sense to optimize the simple opcodes. And having the same one-cycle timing for all instructions makes that easier.

        Some people feel the x86 architecture is "crippled" because its early members had various failings (segmented memory architecture, few general-purpose registers), and later members have carried some of those along for the sake of compatibility, while maintaining an arguably excessive instruction set. Of course, modern x86 CPUs are rarely used in anything other than flat-memory mode and have more registers than their ancestors; but I don't know that you'd find many folks who'd describe the x86 architecture as elegant.

  19. f1rest0rm

    Further Reading

    Good article. As someone who cut his teeth on Z80s and 6502s, I love stuff like this. Anyone got any suggestions for further reading on microprocessor history? Books, PDFs, URLs, whatever ....

  20. Andus McCoatover

    YE Gods and little fishes!!!!

    Friend of mine's father worked for Courtaulds in Coventry. They had some of the first 4004s in the UK, and he kindly gave me a copy of the 4004 handbook.

    I was a teenager at the time, trying to diligently figure out (analogue) electronics.

    When I started to read it, I was transported to a totally different world.

    It took me ages to figure out timing diagrams, truth tables, concept of registers, the instruction set...not being versed in digital electronics at that time. The concept was obviously completely foreign to a spotty schoolboy.

    However, I persevered, and finally understood it. (No Google to help you in those days).

    That book - I don't know where it is now - was the rocket under my ass to the path I was to take.

  21. Turtle_Fan
    Paris Hilton

    Some of the old(er) hands please enlighten....

    If I read this right the 4004 was purpose-built to be embedded in a calculator.

    Reading the calculator's capabilities, I get the feeling that the 4004 is a bit of overkill for something that does the 4 basic calculations plus percentages with storage for one number. I understand that back in the day even this would have been avant-garde but was such a processor really the minimum necessary to deliver the performance?

    Paris, cause I'm just as clueless

    1. Lennart Sorensen

      It was. But if you are building a calculator, and Intel can design and build you one chip for $5 or so to do the job, while the alternative is twelve existing chips costing way more than $5 (and making the calculator bigger), which would you go with?

      Who cares if the 4004 is overkill for a desk calculator? It's cheap and small.

  22. Colin Eby

    A few corrections more...


    1. IT was not built on the Intel 4004 or its successors. The information technology industry started in the 1950s with pioneering data processing applications leveraging emerging computing technology. Remember LEO, and the IBM 1401? They were certainly information technology systems. You'd have to use a pretty narrow and tortured definition of IT to claim the 4004 was its first building brick.

    2. You use the phrase 'first processor' to describe the 4004. Here comes more pedantry... This is not true either. It was the first commodity, commercially available microprocessor -- which is to say, an IC with all the traditional components of a CPU. Computer processors in the modern sense date back to at least 1949 and EDSAC. The Digital PDP-11, a direct contemporary of the 4004, certainly had a processor, as did all its ancestors. What it didn't have was a single-chip 'microprocessor.'


  23. Anonymous Coward
    Anonymous Coward

    Nitpick: "the first"

    That's debatable. For one, the 8008 was worked on in parallel. But it wasn't Intel's own idea; it was a commission to a specified instruction set. Intel didn't really like it; the 8008 team got poached for the 4004. So to consistently push the 4004 as the first uP is, well, a bit of a case of NIH on Intel's part, starting back then. And then there's the Four-Phase Systems AL1 that was available earlier, but only as part of a product, not as parts. Of course Intel will claim high and low it was their 4004; nobody's around to contest the claim, and hey, marketing makes for great history rewriting.

  24. N_Wanzer

    40 Years....

    Ok, now I really feel old...

  25. Steve Knox

    More Pedantry

    "After the 8086/8 came the 80286..."

    No, it didn't. After the 8086/8 came the 80186/8, which was then followed by the 80286.

    I remember coding in 80186 assembly on my dad's Tandy 2000...

    1. Anonymous Coward
      Anonymous Coward

      My BBC Master

      My Beeb Master has a 186 board, AKA the 512 board. I also used a 186-based RM Nimbus at school.

      1. Anonymous Coward 15

        Nimbus? Do you remember the game where this came from...

        That was not a good idea.

        (space bar)

  26. Asphy
    Thumb Up

    definitely a trip down memory lane..

    I had the pleasure of enjoying this ride from the 8088 (with 10MHz Turbo button!!) all the way up to now. Fantastic article, more like this please :)

  27. Will Godfrey Silver badge

    A 6502 starter here. Then ARM1 - oh what a difference!

  28. Mike 16 Silver badge

    8008, not the child of 4004

    The 8008 was based on the Datapoint processor and had little to do with the 4004. I had wondered why the interrupt was not really usable until I read that the original had two register sets. Ah!

    Those crazy segments? The 286 finally did them almost right, if by "right" you meant "well tuned for a Multics-like system". But by then all the new kids were running DOS and Unix, so...

    Interesting how AMD is touted as just a minor goad, forgetting both Sledgehammer (which morphed into x86_64) and HyperTransport (aka "father of QPI").

    And don't get me started on 6502s vs Z80s for graphics-intense systems. Been there (CPU eval "shoot outs") Done that (chose 6502).

  29. PT


    740kHz was the oscillator frequency, not the processor speed - it took 8 clocks to execute a single instruction. The actual instruction cycle was 10.8 microseconds, for an effective speed a little under 93kHz.
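    A quick sanity check of those figures, assuming only the values stated in the post (740kHz oscillator, 8 clocks per instruction):

```python
# Verify the 4004 timing arithmetic: 8 clocks per instruction at a
# 740 kHz oscillator gives the cycle time and effective rate quoted above.
clock_hz = 740_000
clocks_per_instruction = 8

instruction_time_us = clocks_per_instruction / clock_hz * 1e6  # ~10.8 microseconds
effective_khz = clock_hz / clocks_per_instruction / 1000       # 92.5, "a little under 93kHz"
```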

  30. Yet Another Anonymous coward Silver badge

    I feel old

    I'm the same age as the 8008 !

  31. Old Hand

    I see the picture is of the one in Teak; does anyone know if they did it in Mahogany too?

  32. Dom 3

    And this year's clumsiest El Reg analogy award goes to...

    Pencils? Cadillacs? WTF?

    How about just telling us how big a Core i7 die would be if implemented in 10 micron.

    That'd do nicely.

    1. Dom 3

      Oh, I'll do it then.

      Apparently the Core i7-920 has 730 million transistors in a 0.045 micron process in 263 mm^2. Making a guess at the dimensions - 18mm x 14.6mm. Scaled up to ten micron that's...

      4 meters by 3.24 meters.
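      For anyone who wants to redo the estimate, here's the arithmetic as a Python sketch. The inputs are the post's own figures, and the 18mm x 14.6mm aspect ratio is the post's guess, not a published die size:

```python
import math

# Scale a 263 mm^2 die at a 0.045 micron process up to a 10 micron process.
# Linear dimensions scale with feature size, so the scale factor is ~222x.
area_mm2 = 263.0
aspect = 18.0 / 14.6                     # guessed width/height ratio

width_mm = math.sqrt(area_mm2 * aspect)  # ~18 mm
height_mm = area_mm2 / width_mm          # ~14.6 mm

scale = 10.0 / 0.045                     # ~222x linear scale-up
width_m = width_mm * scale / 1000        # ~4.0 m
height_m = height_mm * scale / 1000      # ~3.2 m
```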

  33. Ian Ferguson
    Thumb Up

    Lovely pictures

    Anyone got a link to even higher resolutions? They'd make a great art sequence in the lab :)

  34. Rick Brasche

    A month of new desktop wallpapers!

    thanks, Reg! I now have a nice collection of (work safe!) desktop background wallpapers to use!

  35. Anonymous Coward
    Anonymous Coward


    I was in my final year of secondary modern school when this product was first launched. And I chose electronics as a career???

    Subsequently, Intel x86 CPU bus widths have doubled every few years – 4, 8, 16, 32, and in the early '90s 64 bits. While there have been some "specialist" CPUs with a 128+ bit CPU bus, it seems the x86 architecture has got stuck at 64 bits.

    Even DEC had a dinosaur event – the PDP-8 computer had a 12-bit bus.

  36. Stevie Silver badge


    And when, pray, will we see the transtator, the technology that will free mankind to explore the heavens, transcend their tawdry beginnings and stuff like that?

  37. Anonymous Coward
    Anonymous Coward

    "At the time, only the most far-thinking futurists could have imagined the 4004's impact."

    Bob Noyce did not see it. Andy Grove certainly did not. I think Gordon Moore may have had a clue.

  38. Anonymous Coward

    Overclocking for the senile

    Being an old fart, my memory is unreliable, but I vaguely remember overclocking 80286-vintage cpus by pulling some sort of crystal/tranny thingummy off the motherboard and soldering in a new one. But there's no one around here ancient enough to tell me if I am remembering this correctly. Anyone remember doing this?

    (If the tranny had been called Crystal, I might have remembered.)

    1. Michael Wojcik Silver badge


      Yes, on the typical 80s-era x86 system, the clock crystal was a discrete component on the motherboard, and you could easily remove the existing one and solder in a faster one.

      Whether your system would still work reliably was another question. And its real-time clock would run fast, IIRC, since it used the same time source - though that could easily be fixed in software, by catching the RTC interrupt and adjusting the timekeeping based on your new crystal's frequency.

      There were a lot of easy hardware hacks for 80s PCs (well, easy if you were handy with a low-temp soldering iron). I had an original 5150 PC with the CGA and a green-phosphor composite-connection monochrome monitor, and my dad and I soldered a couple of resistors onto the CGA to turn the NTSC color output into different intensity levels of monochrome output, for example.

      These days hardware hacks tend to be a lot more ambitious and impressive, but also generally require more skill - not so much for the casual trip-to-Radio-Shack types.

  39. Arthur Dent

    Not the first

    So Intel's public relations people are still perpetuating the myth.

    The AL1 was the best part of a year before Intel's 4004 (it was in use by customers months before the 4004 was first announced) and needed no more support chips than the 4004 did, so it's hard to see how the 4004 could be considered the first.

  40. heyrick Silver badge
    Thumb Up

    Oh wow...

    Those pictures... nerd porn overdrive...

  41. This post has been deleted by a moderator

  42. Neil 7

    Funny how Intel are given the credit for inventing tri-gate technology

    despite AMD being the first to announce triple-gate transistors at the International Conference on Solid State Devices and Materials in 2003... in the following article, they sound awfully similar to how Intel describe "their" creation:

    Oh, and yes, the 80186 superseded the 8086... a pretty basic error on the author's part.

This topic is closed for new posts.

Biting the hand that feeds IT © 1998–2021