Slideshow: A History of Intel x86 in 20 CPUs

Would there have been a PC revolution had Intel decided in the late 1960s to stick to making memory chips and turn its back on microprocessors? Almost certainly, but the company did get into CPUs and IBM chose its 8088 chip to build into its first Personal Computer, the 5150. The 8088 and its sibling, the 8086, evolved from the …


This topic is closed for new posts.
  1. Anonymous Coward

    Shouldn't that be the office PC revolution?

    The most important CPU for the home computer and video games market was the MOS 6502.

    Not to mention that the 16-bit machines people owned at home were largely Amigas and STs.

    Nobody I knew owned a PC until the early 1990s and only when Windows 95 came along was it actually nice to use. Even then the responsiveness was abysmal compared to the much lower specced Amigas and STs.

    1. Anonymous Coward

      Not just MOS 6502... I would also include Zilog Z80 for the home computer for the ZX81, ZX Spectrum and Amstrad CPC.

      And I agree that the Amiga was better at multitasking than Windows 95.

      1. Christine Hedley

        > "Not just MOS 6502... I would also include Zilog Z80 for the home computer for the ZX81, ZX Spectrum and Amstrad CPC."

        I'd also like to throw in my tuppence as a former 6809 owner! Er ... quietly slinks away ...

    2. ThomH Silver badge

      The 6502 was important thanks to Commodore, Apple and Atari but it was an Intel 8080 that powered the Altair 8800, the genre-defining home computer, and also the 8080 that CP/M was originally defined around. And in the UK it was the Z80 — an improved 8080 from the same team, albeit as a different company — that ran the ZX80 and the ZX81, which started home computing there. There are also a raft of other notable Z80 machines, not least the Spectrum, the Colecovision, the Master System and, approximately, the GameBoy.

      I guess there are various alternative strands, like the 6502 inspiring (in at least a couple of senses) the creation of the ARM, or the 68000 and its PowerPC progeny (which, though gone from the desktop, powers the major consoles), but I disagree that you can write Intel out of the computer and video game market.

      1. Anonymous Coward

        Motorola 6800 inspired the MOS 6502

        The Moto 68K was the evolution of the most excellent Moto 6809E (16-bit internal / 8-bit external). The 6502 was an enhanced clone of the 6800 (one that MOS got sued for).

        1. ThomH Silver badge

          Re: Motorola 6800 inspired the MOS 6502 (@AC)

          My understanding is that the 6500 was pin-compatible with the 6800, since MOS Technology hoped to be able to walk up to Motorola's customers and sell the 6500 as not requiring any wider system changes. Motorola obviously had something to say about that, especially as Chuck Peddle - chief designer of the 6500 - had previously been a Motorola employee on the 6800 team, suggesting a trade secrets angle (spurious, but beginning the action was enough in itself to do the desired damage). MOS backed down and pushed the 6502, identical to the 6500 except for the pinout.

          I don't think the two processors shared any internal design features; they're not semantically equivalent (different numbers and sizes of registers, different addressing modes, different ways of handling decimal arithmetic - just a different instruction set overall) and certainly aren't machine-code compatible.
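          The decimal-arithmetic point can be sketched roughly in Python (a simplified illustration, flag handling mostly omitted): the 6502 folds the BCD correction into ADC itself when the D flag is set, while the 6800 does a plain binary add and repairs the digits afterwards with a separate DAA instruction.

```python
def adc_6502_decimal(a, b, carry=0):
    """6502 ADC with the D (decimal) flag set: the BCD digit
    correction happens inside the add instruction itself."""
    lo = (a & 0x0F) + (b & 0x0F) + carry
    if lo > 9:
        lo += 6
    hi = (a >> 4) + (b >> 4) + (1 if lo > 0x0F else 0)
    if hi > 9:
        hi += 6
    return ((hi << 4) | (lo & 0x0F)) & 0xFF

def add_then_daa_6800(a, b, carry=0):
    """6800 style: an ordinary binary ADD, then the separate DAA
    instruction repairs the packed-BCD digits using the carry flags."""
    half_carry = ((a & 0x0F) + (b & 0x0F) + carry) > 0x0F
    r = a + b + carry
    full_carry = r > 0xFF
    r &= 0xFF
    if (r & 0x0F) > 9 or half_carry:                 # fix the low digit
        r += 0x06
    if (r & 0xF0) > 0x90 or full_carry or r > 0xFF:  # fix the high digit
        r += 0x60
    return r & 0xFF

# Both routes agree on the BCD sum, e.g. 0x19 + 0x28 -> 0x47
```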

    3. LDS Silver badge

      Of course different people will have different experiences, but mine tells me that from the mid-80s people really interested in computing, and not just video games, started to move from Commodore/Sinclair and the like to IBM PCs and clones. People used to command-line interfaces had no issues with DOS, and didn't wait for Windows. I got my first Intel PC (an IBM clone) in 1987 when I started studying Physics at university, and it was clear my Commodore couldn't help me much - I saved up to get a maths coprocessor so I could run Matlab too.

      I made good use of Borland Quattro to quickly perform calculations and graphs for lab experiments (raising some eyebrows from professors who thought I was "cheating"!) instead of using handheld calculators, and later wrote custom applications in Turbo Pascal. So no, it wasn't just the "office" PC revolution: these were the first "affordable" personal computers capable of real "professional" tasks, not only simple "home/game" ones.

    4. Anonymous Coward

      6502 Assembly

      I remember learning assembly language on the 6502 in college in 1987/8, then using AutoCAD on a 286/386 + maths co-pro. Got my first PC in 1990 at uni, a 486DX 33MHz, which trumped my mates who all had 386SX 16MHz machines - all apart from one poor sod who had a 286 (a bit out of date by 1990).

  2. Chris 3

    What's up with the Celeron and Xeon?

    Can anyone explain to me why the Celeron and Xeon (the Celeron especially) look so different from the others? Instead of the standard 'functional blocks' look, they appear to have a fairly regular grid pattern. If I didn't know better I might have thought that someone had taken photographs of the wrong side and those were connectors emerging or something.

    Enlightenment gratefully received.

    1. Stoneshop Silver badge

      Re: What's up with the Celeron and Xeon?

      It looks like those pics have the package pin layout superimposed, or something like that.

  3. Steve Mason

    I used to "game" on 8088/8086 machines that a mate's dad had at home.

    From Top Gun wireframe b/w to Joust and Mean18 Golf on CGA screens. I also used Windows 1.0 before 3.0/3.1/3.11, and DOS 3.0-6.22.

    I never owned an ST or Amiga, and owned my first PC back in 386 days.

    Great to see all this old tech, and what could be achieved on so few resources!

    1. fiatlux

      Having had a very early clone of the PC/XT (an Olivetti something, running an 8088 CPU) in the late 80s, after having had fun with a ZX81 and then an MSX, I also have great memories of 4-colour graphics and silent games (no sound card included).

      I actually had an EGA graphics card (640*350 in 16 colors!), but no software I had could use its potential.

      I also had Windows 1.0 disks but cannot really say I managed to use it. The first really usable version of Windows was 2.1, and even then only a very limited amount of software made use of it, Aldus PageMaker being one of the few.

      My next PC was a 486 33MHz with 8MB of RAM! It was actually more powerful than the diskless Sun workstation I was using at University. With a DOS Extender, I could already use up to 4GB of virtual memory for my image processing courses... Those were the days...

      1. wt29

        Olivetti M16 - it had an 8086 with a 16-bit data bus, as opposed to the 8088's 8-bit data bus. A bit more expensive to implement, but a memory access took one fetch vs two for the 8088.

        The "holy grail" in those days was IBM video compatibility, usually measured by the ability to run MS Flight Simulator. The M16 didn't, but it did run dBase II, which I used to write simple accounting programs.
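        That fetch arithmetic is easy to sketch (a rough Python illustration, not any real tool - and note the 8086 still needed two cycles for a word at an odd address):

```python
def bus_cycles(addr, nbytes, bus_width):
    """Bus cycles needed to transfer nbytes starting at addr on a bus
    bus_width bytes wide, assuming aligned transfers only."""
    first_chunk = addr - (addr % bus_width)
    last_byte = addr + nbytes - 1
    last_chunk = last_byte - (last_byte % bus_width)
    return (last_chunk - first_chunk) // bus_width + 1

# 16-bit word, aligned:    8086 (2-byte bus) -> 1 cycle, 8088 (1-byte bus) -> 2
# 16-bit word at odd addr: even the 8086 needs 2 cycles
```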

  4. bazza Silver badge

    28 years

    Had an 8088-powered, Ajwad-manufactured XT clone. Despite sitting in the loft for many, many years it was still working perfectly in 1999 when I finally got rid of it - fantastic machine. Had a Viglen 80286 (great machine) after that, and then AMD ever since. Recently experimented with an Intel i3 on socket 1156 - disaster! It proved to be hugely unstable (run times of minutes before randomly resetting itself), so I went back to AMD, where I've stayed.

    Despite their unarguably amazing progress in the past decades (and AMD too, hanging on to their coat tails) Intel are remarkably conservative. Perhaps that's why they've done so well. I can't help thinking that it's killing them. X86 as an instruction set is woefully long in the tooth, and it's preventing them from getting power down as low as ARM has managed.

    ARM have shown that there's a load of money in a simple RISC ISA implemented in a core containing less than 50,000 transistors coupled with moderate caches and specialised co-processors for video codecs, etc. It is making Intel's "The CPU Shall Do Everything" approach look even more antiquated than ever before. For example, remember when they were trying to turn x86 into a GPU? That didn't exactly work, did it.

    Another lesson that ARM is teaching the world is that the software industry *can* change ISA. X86 binary compatibility is clearly very unimportant to the mobile world, and even the server world is beginning to contemplate a change.

    It's something that I think Intel have been overly afraid of. In the eyes of some, they botched Itanium by not making it fast enough when emulating x86. I think they should never have made it x86-compatible in the first place, and instead done a deal with AMD to co-develop the architecture. That might have averted the rise of x64 - no doubt hugely successful, but somehow one can't help thinking of it as an opportunity missed.

  5. Alan Sharkey
    Thumb Up

    Anyone remember the NEC 8088 lookalike which was 1.3 times as powerful? Made a massive improvement to Lotus 1-2-3 speeds. Also, do you remember cracking Lotus 1-2-3 so it didn't need the floppy disk in the drive before it would start up (for those with 10MB hard drives - a massive amount in those days)?


    1. elhvb

      NEC 8088 clone

      The V20. Yes, it's sitting in my loft in a "Falcon" clone machine.

      1. Anonymous Coward

        Re: NEC 8088 clone

        Seem to recall that the ICL M30 PC clones used the V20 as a go faster option.

        1. ThomH Silver badge

          Re: NEC 8088 clone

          The V20 was also superior to the 8086 in that it had an 8080 compatibility mode, though I'm aware of exactly one application that used it — a CP/M-80 emulator for MS-DOS.

  6. Gordan

    "Or did you long abandon them for x86 rivals like AMD, VIA Centaur, Cyrix or other makers of compatible processors?"

    For the last 18 months or so I've been steadily replacing the x86 boxen with ARM ones, as and when upgrade time arrives and the memory capacity requirements of individual nodes allow.

  7. Mondo the Magnificent
    Thumb Up


    From 23000 to 1.4Bn transistors in 35 years!

    I worked at an Intel disti and we had the 80386 core poster. It was quite large and always seemed to catch the eye of our customers; they were astounded by the fact it contained 275,000 transistors.

    I also recall the first-generation Pentium and the Pentium Pro, the first ever x86 CPU with integrated L2 cache - 256K, or 512K if one's budget could stretch to the latter.

    Those were the days: Pentium II and Pentium III in those massive Intel "Slot 1/Slot 2" enclosures.

    Not to be excluded were the Pentium II Xeons, which came in Slot 2 packaging.

    It's amazing how Intel (and others) have managed to shrink the die and packaging as they've shoehorned more transistors into the CPUs.

    Marvellous stuff, El Reg - a trip down memory lane. It even reminded me of the ISA, EISA, MCA, VLB, PCI and AGP eras, when jumpers were king. Now I'm feeling nostalgic and may just dig my old Pentium MMX system out and boot her up...

    1. Anonymous Custard Silver badge

      Re: w00t!

      Yeah, I just love the numbers that topics like this can throw around. It's part of a course I sometimes give to new entrants at our company, and when you can pick up a (not very new) iPod, for example, and tell them that there are more transistors in the little box in their hand than there are people on the planet, it really gets their attention.

      Especially given there are still some of those people who were born before the transistor was invented...

  8. Neil Barnes Silver badge

    But why did it take until the 386

    Before they started doing the images in colour?

    1. fch

      Re: But why did it take until the 386

      An educated guess here, but it might be that the 386 was the first one manufactured at structure widths on the order of visible light wavelengths (i.e. not significantly more than a micron). That would give colour effects, because the structures then act like diffraction gratings and the whole die looks like areas of colour. Larger structure sizes don't cause this effect, at least not at close-to-perpendicular angles of incidence, so they would look largely grey - apart from the intrinsic colouring of the materials used.

      If you look closely enough, you'll notice some parts look reddish on the 4004/8008 (probably copper contacts), and the 286 one has that little red coil-like structure on the right edge. I'd contend all these pictures are in colour.
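      The numbers back the grating idea up: at normal incidence the first-order angle satisfies sin(theta) = wavelength/pitch, so micron-scale features fan visible light out over tens of degrees, while coarser geometries barely separate the colours at all. A quick back-of-the-envelope sketch in Python:

```python
import math

def first_order_angle_deg(pitch_m, wavelength_m):
    """First-order grating equation at normal incidence:
    sin(theta) = wavelength / pitch. Returns None if the pitch is
    too fine for that wavelength to diffract at all."""
    s = wavelength_m / pitch_m
    if s > 1:
        return None
    return math.degrees(math.asin(s))

# ~1 um features:  red (650 nm) and blue (450 nm) emerge ~14 degrees apart
# ~10 um features: the same colours are only about a degree apart - looks grey
```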

    2. Phil O'Sophical Silver badge

      Re: But why did it take until the 386

      > Before they started doing the images in colour?

      The world was in black and white back then; it only turned colour in the mid-1990s

      (with apologies to Calvin & Hobbes)

  9. pierce


    I started with the 8080A and 8085 and CP/M (having worked with various minicomputers before that, including the DG Nova and PDP-11, and IBM systems even before then). In addition to the kit I used at work, I built my own 8085-based CP/M system and an 8086 CP/M-86 system, both using 4x6-format vector cards. The IBM PC with its slow 8088 was a step backwards from my homebrew 8MHz, zero-wait-state 8086.

    Heh, I noticed Intel left out some other chips, like the i432. I wouldn't have included the 4004 or 8008, as neither was ever suitable for a general-purpose computer (the 8008's hardware call stack was just too shallow), nor the Atom or Celeron, as both were downsized forks of the mainstream architecture... and the original Xeon was just a Pentium III with multiple-socket support.

  10. Phil O'Sophical Silver badge


    Mid-80's

    Elonex PCs with 6MHz chips, and the "Turbo" button that overclocked them to 8MHz. We had 12 in the office, and by the end of the first year all had been sent back for repair at least once. The 680x0-based single-board systems were a delight in comparison: solid, and far easier to program, with no segmented memory and a decent assembler. It's like VHS/Betamax - the better marketing won.

    Still, give me a PDP 11/83 over any of them :)

    1. fixit_f
      Thumb Up

      Re: Mid-80's

      It wasn't that it "overclocked" them per se. The "turbo" speed was their standard clock speed; when disabled, it slowed the processor down a bit. If I remember rightly this was so that old games designed for slower processors remained playable!

  11. saundby
    Thumb Up

    I go back to the 4004, though I didn't build one myself until after I'd built an 8080A system. I did assemble and test 8008s for others, though I never owned one. I wouldn't really want to go back to any of those - multiple supply voltages, too many chips to implement the system core. I do, however, still use the 8085 for fun.

  12. Christian Berger


    The 4004 was obviously not an x86. I wonder how much Reg authors get paid for copying some images from another site?

    1. John 62

      Re: Fail

      I think the friendly article, which I assume you read, says that Intel supplied the pics.

  13. Prof Denzil Dexter

    I feel like a kid compared to all you old boys talking about your processors made of wattle and daub.

    My first was a third-hand Compaq 486-66. It cost me every penny of my savings account in 1992. The joys of Windows 3.1 on about 8 floppies, and playing Mortal Kombat from DOS.

    Commander Keen anyone?

  14. Stoneshop Silver badge

    Intel competitors

    I'm not sure even the first PC-clone I owned (an Epson 286) had an Intel processor; everything after that has been AMD for x86-architecture machines, except for one Cyrix MII that I can't recall buying but did emanate from my parts hoard somewhat recently, and a dual PIII found in a skip.

    1. David_H

      Re: Intel competitors

      It was a NEC V series processor.

      1. Stoneshop Silver badge

        Re: Intel competitors

        OK, NEC V40 then.

  15. Anonymous Coward

    Coming to micros via mainframes, the 6800, 6502 and 68000 families seemed logical implementations of instruction sets. The Intel CPUs' non-orthogonal instructions felt clumsy and illogical.

    The 286 was particularly galling in the way it handled extended memory. The 386 was the first of their processors to handle large memories "properly". After that it seemed possible to keep expanding existing applications for new generations with extra memory just by recompiling. That ease seemed to stop with the Wintel 64-bit PCs - where whole rewrites were needed to handle more than 2/4GB of application memory.
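    For the youngsters: the real-mode addressing that made all this so painful was just segment*16 + offset squeezed through 20 address lines. A toy Python illustration (a hypothetical helper, purely to show the arithmetic):

```python
def real_mode_addr(segment, offset, address_lines=20):
    """8086-style real-mode physical address: (segment << 4) + offset,
    truncated to the width of the address bus (20 lines = 1 MB)."""
    return ((segment << 4) + offset) & ((1 << address_lines) - 1)

# 0xB800:0x0000 is the familiar CGA text buffer at 0xB8000
# 0xFFFF:0x0010 wraps back to 0x00000 on a 20-line bus - the A20 quirk
```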

    It also seemed that Intel never took advantage of larger gate counts to implement a more secure hardware architecture - like that developed on the ICL VME mainframes. The use of descriptors to define the limits of a piece of data would have gone a long way to preventing buffer overflow exploits. Do Intel support such features now?

  16. Steve Williams

    A few microprocessors...

    Started out playing with the Motorola 6800 and MOS Technology 6502 (in an Apple ][), but found more fun working with CP/M on 8080s and Z80s. I can't remember how many times I produced CP/M BIOSes for various systems and configurations on various S100 boxes - CP/M 2.0, 2.2 and 3. The last one was for Concurrent DOS, not so long ago.

    During the 1980s I programmed AMD 2901 processors on a bit-sliced custom integrated maintenance computer for a big mainframe. That wasn't bad, but the next generation of maintenance processor used Motorola 68010/68020s programmed in assembler, and that was such fun! I was really gutted when Intel won the microprocessor architecture wars; the 680x0 processors were so nice to use.

    After that I worked on mainframe OSes, so no more microprocessor fun.

    Six years ago I threw away my original Apple ][, a number of Altairs and IMSAIs and a few other odds and ends, and kept just one Altair 680, which decorates the top of my bookshelf. I'm purely an appliance computer user now, by preference Apple Macs.

    However, there is somebody producing Altair kits again, and I'm sorely tempted!

  17. Peter Gathercole Silver badge

    Whilst compatibility is one of the strengths of Intel-centric computing

    I really miss the diversity of different manufacturers making radically different machines.

    I remember in the mid '90s when all of the articles in the PC magazines were essentially describing the same machine (IBM PC compatible), with the only differences being the clock speed, memory or disk capacity, processor generation or case colour. That was the point at which I stopped reading the magazines on a regular basis, as computing was no longer exciting.

    I am not looking forward to the point where Intel have driven everyone else out of the server-processor market, and just hope that ARM can continue to make inroads into the desktop and mobile market. If Intel can achieve total dominance in all segments, then expect innovation to slow-down as the accountants try to extract more revenue out of each processor generation to maximise the R&D costs.

    1. Peter Gathercole Silver badge

      Re: Whilst compatibility is one of the strengths of Intel-centric computing

      Oops. "Maximise the return on the R&D costs"

  18. MondoMan

    I just don't see how anyone could count the 80286 as one of Intel's "great" CPUs.

    Started with a Z80 myself in an Exidy Sorcerer, but always liked the 68k's design and programming model.

  19. DJV Silver badge


    In the 1980s I worked as a Pascal/C programmer in a company that was using the Burroughs/Unisys B20 series of computers (designed by Convergent Technologies, a company that Unisys eventually bought). The B25 was powered by that rare beast, the 80186. Most PC makers skipped that one completely.

    1. Anonymous Coward

      Re: 80186

      The 80186 was quite often used in embedded system electronics.

    2. david 12

      Re: 80186

      Also used by Fujitsu for their MS-DOS 2.11-compatible PC, which came to our company because we used their IBM-compatible mainframe, and by a number of white-box (beige, actually) suppliers, including the luggable my dad used.

      Those chips were screamers. Large chunks of 8086 microcode had been replaced with dedicated silicon, so they ran like the later 286 (or rather like an underclocked 286, since they didn't clock as fast). I used to take my final-year thesis project to work to run compile/modify/compile cycles, since the 80186 ran compiles minutes faster than my V20 at home.

      Plenty of companies made 80186 motherboards, but not IBM. That's the only reason I can think of that the 80186 didn't get much publicity, and always gets left out of lists like this one.

  20. fixit_f
    Thumb Up

    Like Pokemon - I've pretty much caught them all. And I'm only 33!

    The early ones were hand me downs from my Dad. I still had my 286 at uni though in 1998, and it was a perfectly adequate word processor.

    8086 8MHz (Amstrad 1512)

    286 12MHz (Tandon)

    386SX 25MHz (Tandy own brand)

    486 DX2-66 (Viglen)

    Pentium 2 133MHz (beige box inherited from work)

    Pentium 2 333MHz (beige box inherited from work)

    Pentium 3 900MHz (home built, overclocked to a bit more if I remember rightly)

    Core 2 Duo 2160MHz (HP Pavilion notebook)

    Core i5 650 - no idea of clock speed, irrelevant these days, it's all about the cores (Acer Predator)

    There's probably a few I missed as well.

  21. Blitheringeejit

    > processors made of wattle and daub


    Though the main issue wasn't the processors being made of wattle and daub - it was the programming and I/O mechanisms being hand-cranked. My first encounter was with an 8080A development box. This had a "Program" mode, in which we entered individual CPU machine-code instructions and data using 8 little switches and a "Next" button, and a "Run" mode, in which the status of the data and address buses was shown by banks of LEDs - 8 for data and 16 for address. Debugging? Well, the Run mode had a single-step button...

    Of course the proper computing types got to play with really advanced stuff, like punch-cards. But for us in the electronics world, "stored program" meant "write down a long list of hex numbers on a piece of paper with a pencil".

    You try and tell that to the young people of today...

    1. Anonymous Coward

      Re: > processors made of wattle and daub

      " But for us in the electronics world, "stored program" meant "write down a long list of hex numbers on a piece of paper with a pencil".

      Once spent Easter week writing a very large patch to alter the way a CTL MOD 1 Coral 66 "application" worked. It couldn't be recompiled, as the customer site didn't have a development system. The whole thing was hand-coded in hex on paper, then typed up on paper tape for the hex patch loader to read. Those were the days when we still lived and breathed machine hex code 24/7.

      In 1970, on the System 4 J O/S, there was an intermittent bug where a software I/O queue would freeze. A reboot was out of the question, as jobs were halfway through. It was a neat trick to halt the machine, then use the engineers' panel to find the queue header in memory and set the missing flag. It's hard to believe now that it was one of the biggest IBM 360-compatible machines in the UK - with a whole 1MByte of memory. The 600MB hard disk had water-cooled bearings, weighed one and a half tons, and took eight hours to archive to magnetic tape.

      1. Anonymous Coward

        Re: > processors made of wattle and daub

        Been there, done that! We were quoted £72,000 to run the compiler and output the object code. But we did not hand-code it - we wrote an assembler on an HP 85! It took a week and was an enormous cost and time saver.

        The F100-L also had an engineer's panel that was needed at every boot to fix an error in the bootstrap ROM. Happy days indeed.

    2. Anonymous Coward

      Re: > processors made of wattle and daub

      I do. They sneer at me.

      I too used to be able to write down 8-bit machine code, and I built my first EPROM programmers by hand, until I got a machine with big floppy drives and never looked back.

      One EPROM programmer ran on an 1802 and the other used an Intel embedded processor with built-in EPROM, reading punched tape or a keypad. The handmade 1802 machine was needed to program the first Intel machine, after which it wasn't needed any more because I could simply use the Intel processor to clone itself.

  22. Kobus Botes
    Paris Hilton

    Our office bought an IBM XT in late 1985, to help with premium calculations (I worked for an insurance broker at the time).

    At that stage all calculations were done by hand (using a tape calculator), then handwritten on A3 sheets that were taped together before being sent to the typing pool to be typed on an IBM DisplayWriter, using massive 8" floppy disks that could store 256 KB.

    It would take two of us about 14 days to do three years' motor claims statistics for one client (just double-checking the figures was a major undertaking: one person would read the numbers whilst the other would add it up on the calculator - then we would switch places and repeat. More often than not the totals were different, forcing us to repeat the exercise).

    Once the typing pool had finished typing it all up, we had to verify everything again, plus correct typing mistakes. Once we were happy that everything tallied up, it would be presented to the account handler for approval.

    Once he was happy, it would go to the branch manager (who would need to present to the client), who almost invariably asked for alternative calculations, using different excess amounts, et cetera, kicking off another two weeks of calculations.

    Since I had done Computer Science at university, I was asked to spec a system to automate the process as far as possible, saving time and improving accuracy and, most important of all, to enable quick recalculations.

    The system we eventually bought comprised an IBM XT (running at 4.77 kHz), with an EGA graphics card capable of displaying 16 colours simultaneously, a massive 10 MB hard disk drive (it came standard with a 5 MB drive, but I reckoned we would fill that within the next three years or so, whilst a 10 MB drive would last forever), plus a 256 KB 5.25" floppy drive. We also upgraded the RAM from 256 KB to 512 KB, soon afterwards going to 640 KB.

    Software was DOS 1.0, Harvard Graphics, Lotus 1-2-3 and MultiMate, whilst output was handled by a dot-matrix printer and a four-colour plotter.

    The whole lot was about 10% more expensive than a new BMW 518i (so you can imagine management's reaction when one of my colleagues suggested that everyone in our department should have one of those on our desks!). To put it into perspective, my gross annual salary was about one third of the cost.

    After spending a couple of weeks setting up the spreadsheets, hiring a temp to capture all the data and then creating the necessary formulas to do the calculations, the big day finally arrived when I had to demonstrate to management how the system worked (and justify the expense - whilst the project was approved, they still needed to see it for themselves).

    It was amazing: half the office was jammed into our office to watch the show. I gave a short spiel of how it all worked, then changed a couple of key values (like rates and excesses) and pressed F9 to calculate (you could not leave autocalc on, otherwise it would take an age between entries, just to recalculate the whole thing).

    In less than 5 minutes we had an answer! Two hours later I had three different scenarios printed out, plus some graphs. (We used to use a dedicated plotter, printing on 2.5"-wide paper, to print graphs. This was also a painfully slow process, as the machine did not have any RAM, so you had to enter the co-ordinates and colours for each graph every time. The graphs for 10 booklets, containing only 12 graphs per booklet, were a week's work, as each graph had to be cut out and glued into place as well.) A whole month's worth of work for four people!

    Needless to say, when the first 286 came out, we got one and shortly afterwards a couple of 386 screamers arrived.

    Paris, as she looks as if she is also wiping a tear from the corner of her eye.

    1. Kobus Botes

      ...running at 4.77 kHz

      MHz! MHz, not kHz.


  23. fch

    To Intel - give us the chip art references!

    Like the Smithsonian collection does. You guys must be sitting on a secret list of your own chip-design easter eggs - how about 'fessing up a little there?

  24. Anonymous Coward

    Isn't that a 2nd gen 8080?

    I thought the first generation chip was 1MHz.

    Congrats, you didn't mention Moore once...or my eyes are going bad.

  25. Steve Graham


    A few months back, I found a PC board in the attic with a Cyrix CPU replacement daughterboard where the original 286 chip had been. I probably paid many tens of pounds for it, for the dubious benefits of some modest overclocking.

  26. RainForestGuppy

    Who remembers...

    the infamous FPU bug that affected the initial Pentium processors?

    1. Lars Silver badge

      Re: Who remembers...

      PS. they simply lost the memory business.

    2. Anonymous Coward

      Re: Who remembers...

      "the Infamous FPU bug that affected the initial Pentium processors"

      ...and they were very reluctant to do an exchange programme until the PR started getting bad. There was a little program that tested the FPU to see if it needed exchanging.
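      One widely circulated check boiled down to a division that hits the flawed lookup-table entries - on a good FPU the expression below comes out as (to within rounding) zero, while an affected Pentium famously returned 256. Sketched in Python:

```python
def fdiv_check():
    """The classic Pentium FDIV probe: for these operands a correct
    divider gives x/y such that x - (x/y)*y is essentially zero;
    the flawed lookup table made early Pentiums return 256 instead."""
    x, y = 4195835.0, 3145727.0
    return x - (x / y) * y
```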

  27. Andy The Hat Silver badge

    At least some people out there are as old as me :-(

    I would have liked to have seen the pictures to scale as well. I made a demo card for the students many years ago with a series of processors, from an 8080 up to a 486, that I'd taken the casing off so you could see the dies. That said more than 'number of transistors' alone ...

    1. Christine Hedley

      > "At least some people out there are as old as me :-("

      You say that as if it was a bad thing! We grew up in an exciting time. :)

      1. Anonymous Coward

        "We grew up in an exciting time. :)"

        But we didn't realise we were pioneers at the time. When our "old-aged" technical bosses (viz. nearly 30) gave us a task, we would go back and ask "how does this bit work?" - and they would say "I don't know either - it's up to you".

        I retired when it seemed most of the technical fun was gone - and my health could no longer stand the long irregular hours living out of the coffee/chocolate machines.

  28. PyLETS


    After the mainframe, my first real microcomputer was an Apollo Domain workstation based on this Motorola chip series. It must have cost my employer about 15 grand at the time. My first home computer was an Einstein, based on the Z80. I also worked on VAXes and Primes, but these were minis, not micros. We had an IBM 286 with 5.25-inch floppy disks in the lab, but when we first got it it was more a curiosity than a practical tool. We did get it talking TCP/IP, but needed proprietary add-on software to achieve that.

  29. Alan Sharkey

    I remember buying my first 8088 based system for £1000 in 1986. Before that, I had a Compaq luggable from work.

    The next big system was a 386-based system with an 80MB disk. It cost me £3000. But as I was earning quite a lot from making and selling software (anyone remember the original EasyEdit program?), it was something I could afford.

    After that, I just made my own from the trade shows until I got good deals from Compaq/HP (I work for them now).


  30. bob, mon!

    Woodstock series

    I'd have to say my first processor was whatever powered my HP-29C programmable calculator. From there I've seen the 6502 (OSI), 6510 (C-64), 8088/8087 (Zenith Z-150), NEC V20, 80386 (before it became the i386) w/ 80387, i486, i487, ... then Pentiums and successors. Oh, and 680x0 descendants in PDAs, ARM in my smartphone, ??? in HP tablet. Typing this on a Core 2 Duo.

  31. Endymion

    manual chip layout

    Quite amusing that the 4004/8008 had sufficient spare die real estate to write the model number in HUGE characters! That space would probably hold a whole core or a couple of MB of cache nowadays.

  32. Phil the Geek


    Having played with a 6502 (Ohio SuperBoard anyone?) I designed 8085 (enhanced 8080) and 8086 boards for my first employer. 8086 wasn't just a PC chip - BT's System X digital telephone exchanges used it in some subsystems. After the 286 I designed 386 hardware (and met Gordon Moore at the 386's UK launch!!!) and stayed involved with all the subsequent generations up to Pentium 4. Now I just tinker (for a living) with other people's boards...

    When Intel first launched Pentium, the MS Word spell checker insisted on correcting it to "Penis".

  33. Anonymous Coward
    Anonymous Coward

    Intel single chip microcontrollers were more fun ...

    I first entered the microprocessor age via the old MEK6800D2 evaluation kit, circa 1978. It had 128 bytes of RAM - nice. My microprocessor use peaked with the single-chip micro, the Intel 8051, at around 1985. I developed on the old Intel MDS 80 running ISIS-II and using PL/M. Nothing better than using an in-circuit emulator and single-stepping the processor except, perhaps, playing the original Star Trek ASCII game on old MDS kit!

    1. Anonymous Coward
      Anonymous Coward

      Re: Intel single chip microcontrollers were more fun ...

      "...MEK6800D2 evaluation kit"

      Bought one when they first came out 1975/6(?). The cpu was still marked as XC-6800. Cost GBP160 for the bare circuit board and basic chips. It was supplied with an enormous book of worked examples of hardware and software implementations of peripherals. Went for the deluxe version with the maximum 1KBytes of static ram.

      Built it without realising that tantalum capacitors are polarised - and discovered later that they were all the right way round! Having had nasty experiences with early MOSFET transistors and static - I assembled it in the office wearing only my cotton briefs (late at night). Bought an industrial Gould 5volt psu with overvoltage protection as I didn't know if a cheap one was too risky - that cost a whopping GBP80(?).

      Having no oscilloscope it was made to work by tuning the two timing pots - and measuring the address bus voltages with a meter to see if it was in the "bios" idle loop. Eventually an oscilloscope showed it was successfully running somewhat overclocked.

      Unfortunately to interact with it needed an RS232 terminal - so it could only be used at work. One day we connected it to one of the bank of user terminals - and spoofed the mainframe login handshake messages. Fooled everyone as microprocessors had not permeated the everyday IT world at that point.

  34. gizmo23

    Old stuff

    A bit OT here, but just last week I went to see my Mum, who had written down a list of jobs for me to do while I was over - and she had written it on the back of an 80-column punched card, one of several hundred I gave her when I was doing Fortran.

    I still think that if IBM had based their PC/workstation on the 68000 instead of the 8088/86, the processing power we'd have had on the desktop in the 90s would have been greater by a factor of 2 or 3. Most 386s were still running 16-bit code when Intel introduced the 486. Intel made the transition from 8-bit to 16-bit to 32-bit take far longer than it would have done if we'd started out with the 32-bit (internal) 68000.

    Just my 2p.

  35. Anonymous Coward
    Anonymous Coward

    Feeling really old

    First computer was a KIM-1, from around 1977. No enclosure, but at least I didn't have to assemble the main board (soldering is not my strong suit). It had a 6502 processor and a hex keyboard for data entry. Amazing what you could do with almost no memory, only keyboard (0-9, A-F) entry, and a 4-LED "display". Eventually I built a "tape" I/O using an old cassette player and an interface board built from scratch using a design from a magazine.

  36. jake Silver badge

    I lusted after a 4004 in 1971 ...

    ... but, alas, I had to make do with the wire-wrapped relays & tubes ("valves" to you Brits) that my Father & I purchased as surplus at the late, lamented Haltech on Linda Vista in Mountain View. On the bright side, I learned how computer circuitry actually worked, which helped later in life (thanks, Pop!) ... A couple years later, I had a teletype & acoustic modem connecting to Stanford's Tymeshare system. It wasn't until late 1977 that I actually owned a CPU of my own, an LSI-11 (Heathkit H-11A, to be precise). The sticker on the back claims that my Father & I first booted the complete system in February of 1978.

    The first Intel CPU I owned personally was an 8086, again purchased as surplus at Haltech, probably in late '79 or early '80. I managed to convince it to control a greenhouse's internal systems. I built/populated the "motherboard" and attendant bits from scratch, including laying out the traces & boiling it on Mom's stove. I even had to build the power supply. Interesting hack, and a pretty good learning tool.

    Kids these days have absolutely zero idea ...

  37. Robredz

    Power Users, Circa 1994

    I had a 486 DX2 66 (the Intel version, not AMD) with 8 megabytes of RAM, and was considered a power user by mates with that spec. At least it ran the software of the day well, and I didn't need to fiddle with Config.sys and Autoexec.bat for Doom and Rise of the Triad. These days I use AMD processors in a build, as I can put in a better graphics card whilst getting reasonable baseline performance on a budget.

  38. phil harris

    MCS-80!

    Sod all you sad hobbyists ;) The biggest influence in my opinion was the development kits that allowed small industrial companies to experiment with micros without breaking the bank.

    The MCS-80 was what my boss dropped in my lap while I was doing an ONC course in 1976.

    That kind of jump-start really makes a difference to being able to design kit with new tech. (It still does.)

    Pause while s.o.g. goes misty eyed about numeric keypads and 7-segment displays.

  39. Glen 1

    I'd be interested to see...

    the pictures compared to each other at the same scale...

    or at a scale at which the process features are the same size

    1. megastream

      Re: I'd be interested to see...

      I started with the 8008 on a board assembled by DEC. I remember developing cross assemblers on a PDP-11 using the DEC Macro Assembler. Then it was on to the 8080 - building one of the first Altairs in Europe. I recollect that we had problems getting our first floppy drive working, until we realised that there were different types of hardware- and software-formatted floppies. Then we started developing a very crude disk operating system, but later switched over to the excellent CP/M from Gary Kildall. For a while after we moved onto x86 processors we continued to run CP/M-86, but eventually DOS dominated.

      A weakness of the 8-bit processors, and of the early x86 processors, was the lack of hardware floating point. I recollect that Intel did produce a bipolar logic board which carried out floating point operations; it consumed lots of amps and went through many revisions. Eventually there was a chip for the 8085 family, but this was outclassed by the 8087 co-processor for the x86 family, when it eventually became available. This was a great leap forward in real-time controllers which were solving differential equations - no need to try to improve speed by scaling everything to work in integers.

      Back in the bad old days, I remember an 8080 Fortran compiler making the wrong decision on a comparison of two integers - because the difference between the integers was out of integer range. The un-controlled machine was halted with the panic button.

      Happy days!

  40. G.Y.


    The "printable" version is unprintable (graphics, not rude words ..)

  41. Anonymous Coward
    Anonymous Coward

    Fried fingers

    First PC was an Olivetti M10: an 8088 at a screaming 4.77 MHz - and TWO! floppy drives.

    It boasted an amazing 256 KByte of RAM. But I bought chips and expanded it to 768 KByte (640 KByte usable) via the soldered-in sockets on the motherboard.

    Didn't work, though. Used my thumb to firmly press each chip into its socket again. After replacing the one chip hot enough to immediately fry my thumb and leave my skin sticking to it - it worked. 640 KByte of usable memory! Wooohoo!

This topic is closed for new posts.

Biting the hand that feeds IT © 1998–2022