wielding equipment
You mean, like forklifts and zero-point energy manipulators?
Mostly-forgotten silicon-maker Zilog is alive and has just released a new product that shares a little DNA with the Z80 CPU that powered many of the earliest mass-market microcomputers. The Z80 famously powered Sinclair's iconic ZX80, ZX81 and Spectrum computers. The CPU also made it into the Radio Shack TRS-80, the Osborne …
"But the 6809 really is much better than the crapola 6502"
Joking aside, from what I've heard, it generally *is* acknowledged by most people familiar with the subject- even fans of 6502-based machines- that the 6809 was a very nice CPU, and quite a bit more advanced than the 6502.
The problem with the 6809 is that two of its best-known applications (the Tandy Color Computer and the very similar Dragon 32) paired this advanced CPU with a relatively primitive and dated graphics chip that didn't even support lowercase text (as also used in the non-6809-based Acorn Atom), alongside limited DAC-based sound. None of this probably gave the best impression.
Similarly, I realised I had a slight snobbery about the Z80 because it was used in the ZX81 and Spectrum, whereas the 6502 was (to me) associated with the more advanced Atari 8-bit computers (which I owned), Commodore 64, BBC Micro et al. Even though the former two owed that to the custom chips, and the latter to its great OS and general design. If anything, I get the impression that the Z80 was slightly more complex than the 6502, though not clearly better to the same extent the 6809 was; I think it was more a question of different styles and tradeoffs.
(And none of this should be taken to say that the 6502 was a bad CPU, just that the 6809 was probably better... but also more expensive).
I'm no CPU expert, personally; most of this is secondhand from reading discussions between people who know more than I do. :-)
(#) Example:- "Too bad [the Atari 400 and 800] didn't use the 6809" / "I second the "too bad Atari did not use the 6809"
I used all three - 6502, Z80 and 6809 (and the 6800 before that).
The 6502 was in the first computer I owned, and the basis of a few homebuilt computers too - dead easy to knock up a simple system around it. I must have used it from about 1980 until 1988 (upgraded to a 286 machine then). It was a bit lacking in registers, and its 8-bit index registers seemed a bit feeble after I'd first cut my teeth on the school's 6800 machine (16-bit index register, but only one of them), but zero-page memory access was quite fast and indirect addressing made up for the lack somewhat.
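For anyone who never met it, the indirect indexed mode works roughly like this. A minimal Python sketch (Python is just my illustration language here, obviously not period code):

```python
def lda_indirect_y(mem, zp_addr, y):
    # Model of the 6502 "indirect indexed" mode, LDA (zp),Y:
    # two consecutive zero-page bytes hold a 16-bit pointer
    # (little-endian); the Y register is added to that pointer
    # to form the effective address.
    lo = mem[zp_addr & 0xFF]
    hi = mem[(zp_addr + 1) & 0xFF]   # pointer high byte also lives in page zero
    return mem[(((hi << 8) | lo) + y) & 0xFFFF]
```

So a two-byte pointer in page zero plus Y gets you a poor man's 16-bit index register, which is how the 6502 got away with only 8-bit X and Y.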
The 6809 was lovely to program - two 8-bit accumulators (which could be paired to make a 16-bit one for some operations), two index registers, two stack pointers, and reasonably orthogonal for an 8-bit machine. It seemed to be everything I liked from the 6800 and the 6502 put together, with extras. But if I remember correctly, some instructions seemed to take a lot of cycles. Never got to build any hardware around it though. In terms of 8-bit CPUs (I think it was the last 8-bit CPU design to be released) it probably turned up a bit late to make a huge dent in consumer PCs; had it appeared a couple of years earlier, it might have stolen more thunder from the 6502/Z80.
I only ever used a Z80 for one big project - easy to knock up hardware around it, but I never really liked assembly code on it. When you looked at what it generated, all the opcode prefixes seemed like a kludge to squeeze extra stuff into the 8080 instruction set, and for all its complexity (about 8,500 transistors vs the 6502's 3,500 or so) it didn't actually seem particularly fast.
My understanding, at long arm's length, based on the literature and with no direct experience, is that the Z80 was often preferred to the 6502 because it had built-in DRAM refresh logic.
One of Sinclair's smart moves in the ZX80 and '81 was to use static RAM and repurpose the DRAM refresh counter as part of the video address counter, saving some external logic at the cost of forcing the processor into a NOP/HALT cycle during pixel periods; exact refresh timing depends on the instruction stream, and that variability doesn't work when you're synchronously driving video. The program counter forms the other part of the address, and thanks to NOP conveniently being 0x00, one can easily steal the real value from the bus and then force a NOP on the CPU via open-collector logic. I'm pretty sure the [proper, non-CPU-assisted] video fetches are used for DRAM refresh on the ZX Spectrum, but if you've already tooled up for the Z80, why change?
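In case that's hard to picture, here's a toy sketch (Python as pseudocode, heavily simplified, timing and the per-line HALT mechanics glossed over) of the fetch-time decision the open-collector logic makes:

```python
def zx80_video_fetch(byte):
    # During display, the ZX80 "executes" the display file.
    # Displayable character codes have bit 6 clear: the hardware
    # latches them for the character generator and forces NOP (0x00)
    # onto the data bus, so the CPU just steps its program counter.
    # Bytes with bit 6 set (e.g. HALT, 0x76, ending each display line)
    # reach the CPU and execute for real.
    if byte & 0x40:
        return byte, None      # real opcode, executed by the CPU
    return 0x00, byte          # CPU sees NOP; byte goes to video
```

The CPU happily NOPs its way across each display line while the stolen bytes paint the screen.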
A smart move indeed, but it increased the cost. At the time (and from memory), static RAM was several times the cost of dynamic RAM. I know because I periodically priced them up for when I built my own microcomputer. I taught myself how the wiring was going to go down from Zilog manuals and product sheets. Hard, for someone with no electronics training outside of being yelled at by my electronic-engineer father when I connected the bits of my EE20 projects incorrectly.
Dad was no good for working that stuff out anyway. Strictly an analog, industrial-scale man. Had no idea what "Tristate" device pin statuses were when I asked.
I guess Clive's computer design saved more than the cost of the RAM in other ways. Shame the world fell in love with the idea of "standards". Sinclair products had a design-neatness chic above and beyond their function (and their shortcomings, as anyone who tried to use a Sinclair Scientific calculator could attest). Even today when I see a heavily overloaded keyboard (in the function sense) I think of Clive Sinclair and his insane "every key does two bajillion things" designs.
And he was the first person to understand that portability was important, even if he did expect you to have mains electricity when you finished porting. The Spectrum was the first real computer that could be backpacked with some expectation that when you arrived at base camp and fired up the genny, you could use your Spectrum without repairing it.
"Even today when I see a heavily overloaded keyboard (in the function sense) I think of Clive Sinclair and his insane "every key does two bajillion things" designs."
tl;dr response: That was mainly a fault of the Spectrum, where the one-touch keyword system that was well-suited to his earlier, simpler machines got out of control.
The "keyword" system made sense on the ZX80 where it first appeared. Where appropriate, it let you input a single BASIC keyword at once, speeding up input via its cheap touch-sensitive keyboard. Since it was a much simpler machine, it also only had at most three functions (including one keyword) per key:-
https://commons.wikimedia.org/wiki/File:Sinclair_ZX80.jpg
See- that's not too scary, is it? Bear in mind there's no lowercase- the single SHIFT key accesses the orange symbols- and the programming-centric hobbyists it was aimed at would quickly have picked up how the keyword system worked.
The ZX81 retained the concept but had more keywords, so required multiple keypresses to fit some in.
By the time they got to the all-singing, er... all-beeping Spectrum... well, you can see where this is going. Up to five functions (and four keywords) per key- six if you count upper and lowercase letters- made the keyboard a confusing mess and required (e.g.) caps-shift and symbol-shift together (to enter extend mode) *then* symbol-shift and Z to generate the keyword "BEEP".
Many other BASICs (such as 1979's Atari BASIC) supported abbreviation of keywords upon input- while still retaining the benefits of tokenisation- and this would have made a lot more sense.
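Something like this, in spirit (a Python sketch; the keyword list and token values here are invented for illustration, not Atari BASIC's actual table):

```python
# Hypothetical keyword table; real BASICs stored each keyword as a
# one-byte token, and Atari-style abbreviation let "PR." stand for
# PRINT on input while the listing still showed the full keyword.
KEYWORDS = [("PRINT", 0x20), ("POKE", 0x1F), ("PLOT", 0x1E)]

def tokenise(word):
    # An abbreviation ends in "."; it expands to the first keyword
    # in table order that starts with the given prefix.
    if word.endswith("."):
        prefix = word[:-1]
        for kw, tok in KEYWORDS:
            if kw.startswith(prefix):
                return tok
        raise ValueError("no keyword matches " + word)
    for kw, tok in KEYWORDS:
        if kw == word:
            return tok
    raise ValueError("not a keyword: " + word)
```

You keep the storage savings of tokenisation and the typing savings of short entry, without dedicating a whole keyboard layer to keywords.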
Even at the time, Your Computer magazine's review of the Spectrum recognised this as a flaw (see "Symbols and keywords" and the concluding bullet points).
As I said, good idea originally, but one that should have been ditched before they got to the Spectrum. Thankfully, they finally abandoned it with the sort-of-32-bit Sinclair QL.
I have no evidence but have always assumed that Sinclair's button-per-keyword was a means to avoid the space and time of writing a tokeniser; I guess it may partly also have been because the ZX80 can't run the display and process a key press simultaneously so one screen shake per word was preferable to one per letter? Then anything else he said about it was just marketing.
Interesting; you could be right that it was partly an excuse not to have to fit a tokeniser in (since the ZX80's BASIC/OS ROM was only 4KB, that probably would have been a legitimate issue). All the ZX machines' character sets had full keywords mapped to certain positions; in effect, a sort of pre-tokenisation.
Note that there *were* apparently a few functions on the original ZX80 that had to be typed out in full (some machines have a sticker attached saying "Integral Functions") and that those apparently *weren't* stored in the character set, so would still have needed parsing.
At any rate, given the flicker you mention and the ZX80's less than ergonomic (but economical) flat keyboard, it was probably still a good thing that typing was minimised.
I doubt the space required for a tokeniser would have been such an issue on the Spectrum though, which AFAIK had a 16KB BASIC and OS ROM, comparable to (e.g.) the Atari 800 which used tokenisation (the latter had the OS and BASIC clearly separated, but the BASIC was 8KB). It's notable that the later 128K versions of the Spectrum had an improved BASIC and editor that ditched one-touch keyword entry but still stored the keywords in tokenised format.
The irony of a "tl;dr response" that is a couple of inches longer than the post it purports to address is only marginally less funny than the amount by which the point was missed.
Uncle Clive was overloading keyboards long before the ZX series. If you'd read my post you'd have caught the tone, which was not one of disapprobation, but of approval.
@Stevie; the "tl;dr" was a reference to my own comment. The intent was you could read the bit in bold and get the gist of it without having go through the rest of my longwinded(!) post. :-)
I'm sorry for not paying attention to your comment. I jumped to the conclusion that it was about the Spectrum since it's a commonly referred to design flaw there (and the point at which a good idea was taken too far).
Let's face it "insane "every key does two bajillion things" design" *is* the Spectrum, even if the rest of it was fine. :-)
Regarding the calculators, wasn't function overloading common on most scientific calculators from the beginning, though (not just on Sinclair's)? Calculator keyboards aren't computer ones, so why was Sinclair's so egregious (even in an affectionate sense!)?
Yes, the 16/32-bit eZ80 is the modern version. It was also the first microcontroller to offer free TCP/IP stacks. The eZ80 had a Z80 compatibility mode, and also an advanced mode with more instructions. Proud to say I was on the eZ80 development and marketing teams.
They should still be in production and you may be surprised by all the things they do. For instance, there should be a bunch of Z80s flying through space. While I'm not sure about the current state of the matter, until fairly recently, they were actually pretty common there.
The thing is that an average space probe does little to no processing of the data it gathers. Instead, pretty much everything it gets through its sensors is simply routed to Earth. Therefore, you don't need high-performance parts. On the other hand, you want them to be "bulletproof", and with all of the behaviour of the Z80, both originally intended and otherwise, completely documented down to the last transistor by now, one can be confident it won't throw a surprise once it's up there that might turn millions of $/€/£ into a piece of space junk.
The manufacturing process of those particular parts is a bit different, in order to make the part more resistant to space radiation, but, other than that, they should work very much like the stuff that powered Spectrums. :)
"...an average space probe does little to no processing of the data it gathers. Instead, pretty much all that it gets through its sensors is simply routed to Earth."
Data compression is a very big thing with many space probes. Which, if you think about it, is about 100% of the processing it could and should do.
You're correct in that the average space probe doesn't analyze the data on-board and then email back PowerPoint slides.
>> “provides a platform for developing a number of power management solutions such as electric vehicle chargers, air conditioning systems, high-power LED lighting, and wielding equipment.”
Okay, okay, that's not something many of you will use any time soon. <<
But I bet nearly all the folks developing this sort of stuff come here. After all Wireless World is gone.
This has reminded me that when I was at college they had one of these which ran a computer aided software design package that used character graphics on green screens to show flowcharts and the like. I never used it myself but was given a quick demo (as part of a group) and remember being impressed at the way boxes could be inserted and the diagram automatically re-drew itself to fit around the new shape in an optimal way.
I only mention this because the most significant part of that demo completely passed me by at the time: it was the first of almost-too-many-to-count failed software design packages that promised loads but somehow never quite managed to become mainstream, let alone any sort of de facto standard. I feel trapped in some sort of technological time bubble where nothing will ever progress beyond Visio embedded into Word.
Z8000s, I'd wager, not Z80s.
Zilog's answer to the challenge of proper 16-bit microprocessors.
https://en.wikipedia.org/wiki/Zilog_Z8000
I well remember the days at ICL when all the 16 bit microprocessor vendors came trooping through Kidsgrove trying to sell us the magic things. Even TI, with the 99000, explaining that it were bloody good because it was software binary compatible with the 990 (??) minicomputer...
The Z80 did not have a particularly nice instruction set; the only nice thing I remember was the two sets of registers for fast context switches. But these things never die. There are still many, many 8051-based parts available, and I do not think that is very nice to program either, but it continues, and in much higher volume than anything derived from the Z80.
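For those who never used it: EXX swapped the working BC/DE/HL with a shadow set in one cheap instruction (EX AF,AF' did the same for the accumulator and flags), which is the fast context switch being referred to. A toy Python model of the idea, not a Z80 emulator:

```python
class Z80RegFile:
    # Toy model of the Z80's alternate register set: EXX exchanges
    # BC, DE and HL with their shadow counterparts in a single short
    # instruction, so an interrupt handler gets a fresh set of
    # registers without pushing anything onto the stack.
    def __init__(self):
        self.main = {"BC": 0, "DE": 0, "HL": 0}
        self.shadow = {"BC": 0, "DE": 0, "HL": 0}

    def exx(self):
        self.main, self.shadow = self.shadow, self.main
```

Enter the handler, EXX, do your work in the shadow set, EXX again on the way out, and the interrupted code's registers are untouched.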
"TI graphing calculators still do"
I've always wondered why they buy someone else's processors when they have several lines of their own. The MSP430s would more than suffice in most of their calculators, or they could use something from the OMAP or Sitara product lines and produce something that can actually do real 3-D, color, and multi-tasking.
The archaic, obsolete and expensive TI83 your American high-school student child is required to buy (and lose, 3 weeks later) is powered by a Z80.
The student is required to buy this clunker because it is approved for use on the SAT exam. Any more advanced calculator can be used to cheat on the exam, by storing answers, or communicating with the net or confederates.
The TI83 can, of course, be used to cheat, but requires a higher level of skill. Anyone able to cheat with it would score well on the test without it.
"Any more advanced calculator can be used to cheat on the exam, by storing answers, or communicating with the net or confederates."
They could build one without wireless networking and with a proper secure-erase system.
When I went through school, our teacher required us to show them the "Memory Erased" screen, except it was trivial to write a 'program' that just did:
print "Memory Erased"
pause
Looked identical enough to the OS function that it worked.
When I were a lad (actually in my late 20s/early 30s) I worked on software for a prototype data logger for an ice-sounding radar. We used an S100-bus single-card computer with a Z80 processor. The development system (using Wordstar, asm and link!) was an Osborne 1. It has left me with an enduring impatience with bloated software (I know what you can do in 2K of EPROM!) and memories of keeping vast amounts of code in my head so I could debug stuff in an environment where reassembling, linking, blowing a new EPROM and then testing the result could take an hour or more!
I still have the Leventhal Z80 Assembly language Programming manual on my shelf!
... well ...
we had a 'big multi-user' system with a rack of Z80 boards running CP/M on each board, with a central pair of 8" floppies for boot and storage, and lots of lovely TVI 920 (IIRC) terminals on RS232 interface cables ... what more could proper fanbois need? :-)
MP/M of course. :-)
Worked at a place some time back, and happened to be in the coffee room when the technician was servicing the machine. It automated the brewing of ground coffee, using a giant toilet roll (well it looked like that) as filters for the grounds. And in the machine's innards was a Z80. I tried to explain to the technician that the fearful quality of the brew was because of the z80, but some people just want rubbery coffee.
4004! That was my first exposure to micros, way back in 1976 I think. Programmed in binary as well, on a custom-made UV-erasable EPROM burner.
I can also vouch for the TMS9900; always thought it had an elegant architecture and some nifty things, like being able to execute an opcode present in a register. Context switching was neat - no push/pop business, same for interrupts. Sigh, those were the days - what fun we had making industrial controllers from scratch, all the way up to a forge control panel (Sheffield Forgemasters).