What this world needs...
... is a 4GHz 6502! ;-)
Ah, the memories. Cut my teeth learning Machine Language and Assembler on it, then later on the Z80 with an add-on card for CP/M.
*sigh* *cue kids today/back in my day/yorkshiremen rant*
When Bonhams put a vintage Apple I computer on the block, the Brit auction house thought the tech antique would fetch as much as $500,000. It was wrong. Instead, the 1970s-era circuit board fetched an eye-watering $905,000 (£564,000) in New York on Wednesday. The winning bid came from the Henry Ford Museum, which said it will …
I used to design using the 6802 for small scale industrial monitoring equipment and some naval instrumentation kit. It had a small amount of on-board RAM (about 2KB I think, which was all the system needed) so I didn't have to add any RAM to the PCB. Ah, happy days hand crafting assembler, stored on 8" floppies using a MICE. (That's Microprocessor In Circuit Emulator, for those too young to know, or even care probably.)
6502 was cool, you always knew exactly how many cycles (µs at 1MHz) an operation would take.
I still remember quite a few opcodes in direct hex... (coding in BASIC with DATA statements)
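For anyone who never lived through it, here's roughly what that looked like, sketched in Python rather than BASIC. The opcodes and per-instruction cycle counts below are genuine NMOS 6502 figures; the tiny routine and the `routine_cycles` helper are purely illustrative, not a real emulator.

```python
# The old BASIC trick: POKE machine code bytes (from DATA statements)
# into memory, then exploit the 6502's fully documented cycle counts.

# A few real 6502 opcodes and their base cycle counts (NMOS 6502)
CYCLES = {
    0xA9: 2,  # LDA #imm   - load accumulator, immediate
    0x8D: 4,  # STA abs    - store accumulator, absolute
    0x60: 6,  # RTS        - return from subroutine
}

# Instruction lengths in bytes (opcode + operands)
LENGTH = {0xA9: 2, 0x8D: 3, 0x60: 1}

# The equivalent of BASIC DATA statements: raw bytes of a tiny routine
DATA = [0xA9, 0x00,        # LDA #$00
        0x8D, 0x00, 0x02,  # STA $0200
        0x60]              # RTS

memory = bytearray(65536)                      # full 64KiB address space
for addr, byte in enumerate(DATA, start=0x0300):
    memory[addr] = byte                        # i.e. POKE 768+i, byte

def routine_cycles(code):
    """Total cycles for a straight-line routine - knowable in advance,
    which is why timing-critical 6502 code could be written by hand."""
    total, i = 0, 0
    while i < len(code):
        op = code[i]
        total += CYCLES[op]
        i += LENGTH[op]
    return total

print(routine_cycles(DATA))  # 2 + 4 + 6 = 12 cycles, i.e. 12µs at 1MHz
```

At 1MHz one cycle is one microsecond, so the whole routine's runtime is exact before it ever runs — the property the comment above is nostalgic about.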
A 4GHz 6502 would be awfully slow of course, as it has only three registers (one accumulator, plus the X and Y index registers), no cache, no branch prediction, no pipelines - basically it would constantly stall waiting for memory. OTOH the Apple II had only 48KiB of RAM, so adding a 64KiB L1 cache to hold both RAM and ROM would do the trick and greatly alleviate the memory issues.
"Similar to what Henry Ford did with the Model T, Steve Wozniak and Steve Jobs put technology directly in the hands of the people".
That quote applies far more to the work of Torvalds than anything the two Steves ever did. Without cheap LAMP servers, broadband internet would be too expensive to afford, and there would be no services like iTunes, Google, etc to connect to. Without cheap as chips Android phones, half the world still wouldn't be connected with any kind of computing device.
The vast majority of everything that we have in terms of interconnectedness of billions of people rests on one fundamental decision - Torvalds' initial decision to license the Linux kernel under the GPL.
Compared to the work of Torvalds, Apple is practically just a savvy marketer of high priced luxury goods for the ultra-wealthy.
Without hardware there would be no way for people to become inspired. It is a bit like saying Newton's contribution to physics is nowhere near as important as Einstein's. It is a pointless statement.
They all have forged foundations that we now sit atop.
>"Similar to what Henry Ford did with the Model T, Steve Wozniak and Steve Jobs put technology directly in the hands of the people".
>>That quote applies far more to the work of Torvalds than anything the two Steves ever did. Without cheap LAMP servers, broadband internet would be too expensive to afford
Home computers were popular before the internet was popular amongst consumers. Anyway, the Model T Ford was a product, like the Apple I.... if you want to talk about infrastructure, maybe you compare LAMP servers to tarmacadam - an invaluable innovation for sure, but in a different category.
Besides numbers, it's the product design. The Model T was not among the first cars, but it was designed to be cheap enough to be affordable by a large number of customers. Apple products were never designed to be cheap, the way the ZX80/81 or some Commodore models were.
I'm sorry for Ford, but Apple looks more like a sports car maker than Ford itself...
The Apples were precisely designed to be incredibly cheap, in a world where the cheapest 'real computer' was a PDP-11/34 which cost $10,000 and was a metal box about 8U tall. The Sinclairs and Commodores and Acorn Electrons were cost-reduced versions of things that had already been significantly cost-reduced.
IIRC the Commodore PET line was available around the same time as the Apple II - don't know how much they sold for, but I guess they were still far cheaper than a PDP.
Kits to build your own PC were available too - after all, the Apple I was something like a pre-built one. Better quality than competitors? Probably. Better ideas on how to evolve the product? Yes. But hardly the cheapest and most affordable ones.
" Commodore PET line was available around the same time of the Apple II"
My Apple II was bought on the last day of UK 8% VAT on electronic components in 1979. It had 48KB RAM, one 5.25" floppy, and a black & white video card. That set me back about £1800. A B&W composite video monitor had to be added to that. Last time I did the inflation calculation, that would be at least £6K now. Most people were buying microcomputer DIY kits like the Tangerine.
My first IBM compatible PC in the mid-1980s had something like 1MB of RAM and a 20MB hard disk, and also cost about £2K.
None of these were what most people at the time would consider "affordable".
Wasn't the Sinclair Spectrum selling in the early to mid 1980s? I recall a hardware engineer in NZ doubling the memory of one (it stuck out from the side as there was no room inside the case). Then the BBC Micro was pretty widespread. I remember one of the crew even had one on a fishing/research boat in about 1985. Whenever the winches started, a great spike went through the ship's electronics and the screen would flash, go blank, then resume - astonishing.
Some form of BSD system was the basis of most UNIX web servers until quite recently, because it was free, efficient, tried and tested and relatively secure.
"Free" software was available, via GNU and others before most of us had heard of Linux. Of course, one had to compile it oneself. But the make files were rather good.
No. Here in the States, Heath made a kit that had an actual monitor and case, so its quality was far better.
The problem for me back then was all this stuff was expensive and exotic. As a low income teenager, I couldn't afford it and my parents wouldn't spring for it. My first "computer" was an Odyssey console some guy at the mall talked my dad into buying instead of the TRS-80 or Atari I wanted. Games were decent, but it was no platform on which to learn computers, and not nearly as popular as the Atari was. When the Commodore 64 arrived I finally got my first computer. From there it was off to the races.
...and to that point specifically Jobs and Woz were no Henry Ford.
Apple kit has always been overpriced since day one. If anything, Apple is more like the pre-Ford luxury automakers like Mercedes. That's certainly the comparison modern fanboys want to make now.
The Ford of computing is more along the likes of Commodore or Sinclair.
If it were only up to Apple computing would have gone nowhere. The masses would never be able to afford it.
Sorry, but the diffusion of the Internet predates Linux, and not by a little. Linux and Apache made possible the $9.99 hosted site, but commercial services don't depend much on the "cheapness" of the OS.
Without "cheap" PCs (BTW, Apple ones never were cheap, what lead to mass adoption was the Sinclairs, Commodores, etc, then the IBM PCs), the whole open source development would have had no momentum, because it would have been limited to universities and maybe some large companies. And there would be no Internet without clients... most of which didn't run Linux (and still today, just Android clients are in a sensible number). So maybe you also have to say thank you to MS for not licensing DOS first and then Windows to a single hardware supplier - which in turn allowed the IBM "clones" market and cheaper and cheaper PCs.
"Broadband" depends on a lot of expensive hardware to run - an mostly running proprietary OSes like Cisco IOS - and really it doesn't depend on a large number of cheap servers. Also, its billing don't run on cheap LAMP server, probably it runs still on some expensive mainframe...
Sure, open source software helped Google increase its revenues beyond what they would have been had it bought Windows or Unix licenses, or developed its own OS, but it would have offered the same services anyway, and still made a lot of money.
Android is not cheap because it is Linux; it is cheap because Google spends the money to develop it and puts in your hands a Google terminal to access your data - and you still pay for the hardware! And as with any device, the hardware costs are far larger than the software ones.
Is Linux important in IT history? Sure. Did it create the Internet and the interconnected world of today? No, it didn't - it is just one player among others.
If you're headed there, the first thanks you have to offer are to IBM for underfunding their PC initiative. It was that choice that caused the Boca Raton office to build the whole thing entirely from commodity parts with no proprietary hardware. Which in turn allowed Compaq to clone the hardware.
Still Apple was one of the early innovators in home PCs and helped make them cool to own. Back then it was Apple who owned all the really cool video games like Castle Wolfenstein. We couldn't afford one and I envied my friend his IIe. He always let me play it and early on I was always stuck on "You have 1 bullets left."
"The vast majority of everything that we have in terms of interconnectedness of billions of people rests on one fundamental decision - Torvalds' initial decision to license the Linux kernel under the GPL."
Bollocks. If young Linus had happened to spend more time in the student union bar and less hacking away in his bedroom back in 1991, then the only real difference to the history of IT and the Internet since then would be that 386BSD (version 0.0 released six months after Linux 0.01) and its descendants would have taken the place of Linux as the go-to open-source OS for all kinds of applications. Mr Torvalds didn't invent open source, y'know.
And anyway, Linux only existed because Minix existed, and Minix only existed because of the combination of the existence of Unix and the IBM PC, and the IBM PC (and of course, the Apple I) only existed because of the microprocessors that arrived in the 1970s.
How could you be so wrong? Torvalds followed on from Minix, from Andrew Tanenbaum (some five years before Linux). Minix was inspired as a student project by BSD and the original UNIX. These were widely used by universities and others, so by a lot of students and researchers, as well as by various research establishments and private concerns. I supported and developed on UNIX from the mid-1980s onwards, complete with Usenet etc. There were plenty of machines being sold as UNIX systems, such as the VAX-11/750 and all those PDPs hosting it, plus emulators such as Primix and multi-universe UNIX systems such as Pyramid. Even Sun was getting going.
The BSD tapes, if I recall, were available for the cost of the tape. So, if, in software terms, one could name people, it would have to be Kernighan, Ritchie, Bourne (for his accessible and programmable shell), Thompson and others. The first Apple Macs were the equivalent hardware in that they were moving towards being usable by the non-specialist (I recall seeing a Lisa in NZ at a neighbour's house in about 1985, and the flood of CVs from Victoria Uni, Wellington, all obviously written using Macs in the 1980s). Even Bill Gates can claim precedence with his consumer PCs in the late 1980s. That hardware is perhaps the Ford equivalent. The thing with Macs and PCs is that they arrived not as hobby games machines but offering word processors, spreadsheets, HyperCard etc., all primitive but usable (of course other forms of PC did this too to some extent). Linux simply jumped on the back of students used to BSD and other UNIX implementations and, to some extent, is only just catching up on reliability; it long ago lost the simplicity, being driven now by commercial variants and support contracts from the likes of Red Hat, Ubuntu, SUSE and others.
Be grateful to the embryonic Apple and its peers, and to the cooperative spirit of those times, when it was not considered a failing to share and learn from each other and when, strange as it seems now, even Primos, VMS, RTS and others came with source code (one of my first development/programming jobs was modifying the source of an OS to add file system enhancements and a Coloured Book networking stack).
When the Apple I debuted at a meeting of the Homebrew Computer Club, it was practically like the scene in Back to the Future where the kids of 1955 heard Rock n' Roll for the first time. Here were a bunch of brilliant, driven hobbyists that had beaten the odds and cobbled together their own equipment using mostly second-hand parts and bare-knuckle ingenuity, some of which had to have programs toggled in with switches, some could be programmed in hex/assembly language, a very, very few were capable of driving video displays and running a high-level language like BASIC. Prior to this, the most excitement had probably been when someone figured out how to get an IMSAI to play music. (through a radio picking up stray RFI from it) Then Woz walks in with the fruits of his tinkering and here is a computer on a board, ready to hook up to a TV, ready to run BASIC and truly be interactive. (a case is optional--Otterbox, anyone?) This was a game-changer of the same significance to the early days of computing as when someone took a burning branch from a lightning-struck tree and brought fire back to their cave for home use.
Linux is very significant too, but more in the sense of a political revolution freeing the enslaved, and not in the sense of putting the first wheel on an axle.
Those that have read my posts here re. Apple know that I am not a fanboi of modern Apple kit, but please give credit where credit is due. If it weren't for Woz's passion for micro computing, as well as technical prowess, Steve Jobs would likely have still been successful, being the driven man he was, but perhaps his career would have been with another early innovator, or more likely you'd have seen him hawking wares on late-night infomercials, or selling Nordic-tracks or similar.
Meanwhile, I'll keep going to garage sales, hoping for a find like this. The hardest part would be deciding whether to keep it or sell it.
Re. the humble 6502, there probably hasn't been a more used CPU in history. They are still everywhere in the world, in the form of embedded controllers. (think light switches, coffee makers, industrial machinery, medical equipment, etc.)
The Intel 8051 has got to be giving the 6502 a run for its money then, because there's one in every smart card, so that includes SIMs and Chip and PIN credit/debit cards, and also many types of NFC tag such as the ones used in the Oyster card. The reason is simply its architecture - because it has separate buses for instructions and data, it's physically impossible to make it execute data it shouldn't.
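The Harvard-architecture point is easy to show in miniature. This is a toy sketch in Python, not a model of the real 8051: with instruction and data memories kept physically separate, a store can only ever land in data memory, so nothing written at runtime can later be fetched as an instruction.

```python
# Toy Harvard-architecture machine: code and data live in separate
# memories, and only the data memory is writable. Illustrative only -
# the instruction set here is invented, not the 8051's.

class HarvardCPU:
    def __init__(self, program):
        self.code = tuple(program)   # instruction memory: immutable by design
        self.data = bytearray(256)   # data memory: the only writable space
        self.pc = 0                  # program counter
        self.acc = 0                 # accumulator

    def step(self):
        op, arg = self.code[self.pc]
        self.pc += 1
        if op == "LOAD":             # acc <- immediate value
            self.acc = arg
        elif op == "STORE":          # data[arg] <- acc; cannot touch self.code
            self.data[arg] = self.acc

cpu = HarvardCPU([("LOAD", 0x90), ("STORE", 0x10)])
cpu.step()
cpu.step()

print(cpu.data[0x10])  # 144 - the store landed in data memory
print(cpu.code[0])     # instruction memory is untouched
```

On a von Neumann machine those two memories would be one and the same, which is exactly what classic code-injection attacks exploit; the separation above is the property the comment attributes to the 8051.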
Yes, cheaper than the Motorola part and Intel's CPUs.
The original 6502 was pin compatible with the Motorola part it was a "clone" of. But due to an employee of MOS stupidly having a reference manual from Motorola (they all used to work for Motorola), it was deemed to be infringing on Motorola's IP.
The pin layout was changed, that's about all.
Most of the pins stayed the same (all the data lines, address lines, power, and I think interrupts are in the same place on the 6800 and 6502) - what changed were the area around the clock (the 6800 required a two-phase non-overlapping clock; the 6502 was driven from a single clock signal and generated the two phases internally) and the bus controls (e.g. the 6800 has a tristate control, a valid memory address signal and a data bus enable that are lacking on the 6502). Unless you were using a DMA controller that needed to take over the bus, the changes probably made the 6502 a little easier to design for than the 6800.
The 6501 bus interface was closer to the 6800 version - and too close for Motorola's liking.
"The pin layout was changed, that's about all."
Had a Motorola 6800 development kit in 1976 - and an Apple II in 1979. My memory says there were also opcode extensions(?) in the 6502 - but don't quote me on that.
The 6800 and 6502 had a nice orthogonal instruction set that felt quite comfortable after my mainframe apprenticeship on IBM 360 compatibles. The 68000 looked even better but never seemed to take off into home PCs - apart from the Sinclair QL.
The Intel architecture and instruction sets felt very awkward - as if features were added to the spec as needed without considering the whole.
" AC probably meant 68008 instead of 68000."
Mea culpa. Faulty memory - possibly because the Atari and Amiga were not among my varied collection. Bought the QL but never found time to do anything with it. Now I think about it - did the Dragon 32 use a 6800x too? The Sinclair QL's design constraints in using the 68008 instead of the full-spec part were covered in an El Reg article not long ago.
That's like saying that Henry Ford wasn't important because Karl Benz invented the first automobile. Wozniak was the first to create a single-board design that incorporated all the key elements (keyboard controller, CPU, RAM, ROM and video controller) onto one board. This led to the microcomputer boom of the 80s.
I am a Commodore fan. My first computer was a PET. I've always maintained that if Jack Tramiel was a bit less of a megalomaniac, we'd all be running Commodore-format PCs and not IBM's standard. I was not meaning to minimize Chuck Peddle's contribution - without a low cost, versatile CPU like the 6502, a lot of computing history would have been a lot less possible, or at least delayed. The 6502 was a lot cheaper than the main competitor (the Zilog Z80), and while running at lower clock speeds, was more efficient in many ways.
But I thought we were discussing the first microcomputers, not microprocessor design...
Not really. I remember when the IBM PC was introduced. It was a piece of shit, people admitted it was a piece of shit, but it had the magic IBM logo on it. As far as the non-computer folk were concerned, no other computer existed... the Apple, Commodore, Atari, BBC etc were "toys" and "you couldn't run a serious business on 'em"
Really retarded, but there ya go. The only thing that saved Apple was that Visicalc had already embedded Apples in a bunch of businesses by the time the PC launched.
Not knocking the Apple 1, but comparing it to the Model T was wrong. The Model T was designed from the ground up with mass production in mind, as a car for the masses. The Apple 1 was designed to be a cheap hobbyist machine, with total production probably in the thousands at most (and it never even made that). A better comparison for the Model T would probably be the ZX Spectrum: it was cheap and mass produced and gained much wider adoption. I don't think any single computer ever really had the impact of the Model T, though, as there were far more players in the computer game.
I agree Ford overstated the case for the Apple I, but it was still a landmark.
If I were to pick the Model T of computing it would have to be the IBM PC. The first one, where PC was the model. It's the one that moved PCs from hobbyist to business. It's also the one that for better or worse standardized our world for both computer architecture and OS even if the OS has changed a lot in the intervening time.
The IBM PC was a terrible hack. Expensive (£1700 for a basic 1 floppy machine with 64K of RAM and a mono monitor), slow (being bested by a 2MHz 6502) and hard to program (16 bit segment registers anyone?). Only the IBM name and the open design saved it. Modern machines are more closely descended from workstations of the time, every part of the original ISA has been replaced with something to fix the original hacks (often several times).
Never had an Apple 1, but we did work on an Apple II for a while (with a Motorola 68000 board to do the heavy lifting). We actually did image analysis on that piece of kit. I think the Apple II is much more of a Model T equivalent (though not fully) than the Apple 1 (though many other machines would be contenders, like the IBM PC, the ZX Spectrum or ZX80/81, and a host of others). The Apple II was produced in much larger numbers, and had the expansion slots which allowed third parties to add stuff. THAT was an important step (not so much that the expansion slots were there, but much more that they were open to all). The IBM PC took that to another level again. Expansion slots were a way to stave off obsolescence (for a few months more ;-)), and to increase flexibility.
"The Apple II was produced in much larger numbers, and had the expansion slots which allowed third parties to add stuff. THAT was an important step [....]
That was the reason I bought an Apple II. Being able to buy third party comms boards and also homebrew my own was very useful in my troubleshooting career for about 10 years. At one point there was one Apple II driving a real comms front end processor - via an emulated mainframe peripheral interface. A second Apple II (ITT clone) was acting as a cluster of synchronous terminals. Very useful for reproducing software timing bugs and testing the fixes.
The local TV shop unloaded their residual stock of Micro-Professors very cheaply after the IPR case ruled they were just too much like an Apple II. They had a very small footprint and ran the Xmas tree lights for many years. The UHF modulator was replaced by a 74LS TTL lash-up for outputs and the programs were in EPROM.
The Apple Lisa spec was a big disappointment in being basically a pretty "appliance" - and I never bought Apple again. IBM PC clones eventually took over the Apple II role by providing similar custom hardware extension capabilities.
If the Apple I was comparable to the Model T, then the Willys Jeep of computing had to be the Commodore PET/CBM machines of the late 1970s/early 80s. A lot of businesses used them, and a lot of industrial automation actually ran on them in the day. (some reliably for decades) It's a shame the versatile IEEE-488 interface never took off much outside of the scientific and industrial communities, as it was far superior in many ways to the RS-232 ports that most machines of that period used. There is still a lot of scientific and data acquisition equipment using the IEEE-488 interface, though I doubt it's pin and logic-level 100% compatible with old Commodore equipment.
The Model T was far from the first automobile, what made it important (in the USA) was that it was the first that was cheap enough and usable enough to be purchased in large quantities by average families, which set off the massive restructuring of our physical spaces around the automobile.
The Model T of computers were the IBM PC clones. Again, these were the first computers that were cheap enough and usable enough to be purchased in large quantities by average families, which set off the massive restructuring of our informational spaces around the computer.
If I had to choose one particular clone as a best match for the Model T, it would be the Dell (then PC's Limited) Turbo PC. Even better, if you can find one, it'll only set you back a few hundred bucks.
As far as cars go, the Apple 1 is much more like the thousand or so handbuilt cars built by the Ford Motor Company before Mr. Ford developed the Model T.
It was Commodore who made a proper computer that was affordable (the PET); Commodore's boss at the time, Jack Tramiel, coined the phrase "computers for the masses, not the classes." It was also a Commodore engineer named Chuck Peddle who helped the two Steves get the Apple I working in the first place, and if you look closely at that picture there's a Commodore/MOS 6502 chip at the heart.
If it wasn't for Visicalc appearing on the Apple II before the PET, we'd probably be remembering Apple in the same way we talk about Oric, Coleco and Sperry.
Biting the hand that feeds IT © 1998–2021