Since it was (Doo, doo, doo, doo doo) Nineteen Eighty-One...
"Do you know who recorded bloody Toyah over my ZX Invaders?!"
"Er.... it's a mystery, oh, it's a mystery... I'm still searching for a clue... It's a mystery to me."
Sorry, I should have been clearer; I meant it would (AFAIK) take 2 bus cycles if the data was already available internally, presumably held in the 32-bit data registers. I also notice the Wikipedia article says that the main ALU is only sixteen bits (though it apparently has another two 16-bit ALUs used for address calculations).
That said, you sound a lot more knowledgeable about such things in general than I am anyway(!)
FWIW, in light of what you said I also noticed that the Wikipedia article claims that "Motorola termed [the 68000] a 16/32-bit processor".
And the "ST" in the 68000-based "Atari ST" also supposedly stood for "Sixteen/Thirty-Two" (though some suspected it was actually "Sam Tramiel".) I'm guessing the less-well known "TT030" workstation is so-called after the "Thirty-Two" bit 68030 CPU...
"I'm pretty certain that the original 68000 only had a 16 bit data bus"
That's correct; the original 68000 was 32-bit internally, but only had a 16-bit data bus. But it was generally only considered a 16-bit processor anyway (the Atari ST and earlier Amigas were all generally called "16-bit", and the 68000-based Mega Drive's badge said "16-BIT"). It wasn't until the 68020 that it got a full 32-bit data bus.
While we're on about the "8 or 32 bit" QL, it's worth remembering though that even the original IBM PC and PC XT *didn't* actually use the 16-bit 8086, they used the 8088 which was the cut-down version with (again) an 8-bit data bus. So though some people complained about Sinclair characterising the QL as a 32-bit machine, it was probably as legitimate as claiming the original PC was 16-bit.
(Some have also noted that the 68000- and presumably 68008- used two cycles for the 32-bit instructions, so one can start nitpicking this, but... yeah.)
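To make the bus-width nitpick concrete, here's a toy sketch (my own illustration with made-up cycle counting, not real 68000/68008 bus timing) of why a 32-bit access costs two transfers over a 16-bit bus and four over an 8-bit one:

```python
def read32(memory, addr, bus_width_bits):
    """Fetch a 32-bit big-endian longword over a narrow data bus.

    Toy model: each bus cycle moves bus_width_bits of data, so a
    68000-style 16-bit bus needs 2 cycles and a 68008-style 8-bit
    bus needs 4. (Illustrative only, not cycle-accurate.)
    """
    bytes_per_cycle = bus_width_bits // 8
    value, cycles = 0, 0
    for offset in range(0, 4, bytes_per_cycle):
        for b in range(bytes_per_cycle):
            value = (value << 8) | memory[addr + offset + b]
        cycles += 1  # one bus transfer completed
    return value, cycles

mem = {0: 0x12, 1: 0x34, 2: 0x56, 3: 0x78}
# read32(mem, 0, 16) -> (0x12345678, 2)   "68000"-style 16-bit bus
# read32(mem, 0, 8)  -> (0x12345678, 4)   "68008"-style 8-bit bus
```

Same data either way; the narrower bus just pays for it in extra cycles, which is the whole "8 or 32 bit" argument in miniature.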
The Register's own 30th anniversary article for the QL touched upon this:-
[Chief Design Engineer David Karlin] chose the 68008 [which] was built for backwards compatibility, so it featured an external 8-bit data bus and 20-bit addressing. The 68000 [had] a 16-bit data bus and 24-bit addressing. [..]
“I don’t know how robust the decision was,” David admits now, “but it seemed fairly clear that the 68000 series would be a great platform [..] The problem with the 68000 was basically a pin-count issue. Motorola was pricing it gigantically higher than the 68008, double or treble the price. It was sufficiently high I didn’t even argue about it.”
Rivals’ use of the full 68000 would later come as something of a surprise. If Sinclair couldn’t afford the 68000, how could they? Today, David blames Sinclair’s negotiating skills, not just for the CPU but for a whole variety of logic chips and add-ons: “I question how good we were at purchasing, because people like Amstrad, certainly the Japanese, certainly Apple, who did not have gigantically higher volumes than us at the time, got massively lower prices.”
During 1983, it has been claimed, Motorola cut the price of the 68000 to below what Sinclair had agreed to pay for the 68008. Renegotiating the purchase contract might not have been costly, but adding in the architecture the full 68000 required [..] would have been, so it was decided to stick with the 68008. [..] It’s easy to say Sinclair would have been better off going with the 68000 after all, but only with the benefit of hindsight.
Not sure about the Ceefax analogy, but that's otherwise broadly correct AFAIK.
Since the CPU was used to generate the display, the ZX80 could only output an image when things were otherwise idle or waiting for input (i.e. essentially the same as a ZX81 that could only operate in "FAST" mode).
The WAIT/NMI logic on the ZX81 gave the user the choice of a continuous display at the expense of speed (hence "SLOW" mode) by interrupting the processor whenever it was needed to generate the active parts of the display. As you say, you could always switch back to the full-speed non-continuous display via "FAST" mode.
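As a toy model of that FAST/SLOW trade-off (all numbers invented for illustration, nothing like real ZX81 figures):

```python
def cycles_for_program(mode, frame_cycles=64000, display_cycles=48000):
    """Toy model of the ZX81's FAST vs SLOW modes.

    In SLOW mode the WAIT/NMI logic interrupts the CPU to generate the
    active display each frame, so the program only gets the leftover
    cycles; in FAST mode the display is abandoned while the program
    runs and the CPU keeps everything. (Cycle counts here are made up
    purely for illustration.)
    """
    if mode == "FAST":
        return frame_cycles                    # screen blank, full speed
    elif mode == "SLOW":
        return frame_cycles - display_cycles   # continuous picture, fraction of the speed
    raise ValueError("mode must be 'FAST' or 'SLOW'")

# With these made-up figures, SLOW mode leaves the program only a
# quarter of the CPU time that FAST mode does:
# cycles_for_program("SLOW") -> 16000, cycles_for_program("FAST") -> 64000
```

The ZX80 was effectively stuck with the "FAST" branch: no display hardware to steal cycles, so no picture while the program ran.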
It was just "Sinclair" in the UK, although they were manufactured under contract by Timex. "Timex Sinclair" was the brand used for the otherwise near-identical North American version of the ZX81, known as the Timex Sinclair 1000 or TS-1000.
Interestingly, there was also the obscure Timex Sinclair 1500, which was also essentially a ZX81 but with a Spectrum-style silver case and rubber keyboard and 16K of RAM onboard.
(Apparently a complete flop- most likely because it didn't come out until mid-1983 and cost $80 at a time when numerous more advanced machines were available, and the US market had become so competitive that many of those were being driven below the $100 mark).
There was also a Timex Sinclair 2068 which was based on the ZX Spectrum, but with a number of improvements that made it incompatible with the latter. (And which didn't help it succeed in the aforementioned cutthroat US market, which Timex were forced out of shortly afterwards).
Timex's Portuguese arm also released the Timex Computer 2048.
Indeed. As far as I'm aware, the hardware design of the ZX81 is almost the same as the ZX80's at a logical level (aside from the addition of WAIT/NMI). The biggest change is that much of it was re-implemented via a single ULA.
Even so, the similarities were such that you could (almost) convert a ZX80 to a ZX81 simply by replacing the original 4K BASIC/OS ROM chip with the same 8K ROM that the ZX81 used. Sinclair sold this as an official upgrade, complete with an updated keyboard overlay.
The only thing apparently missing was the non-flickery ("SLOW") display mode, which the ZX81's aforementioned WAIT/NMI hardware upgrade was needed to implement.
Despite the media's (and their own) habit of lumping together supposed groups of people with similar interests as a "community", I'm not sure that the people (potentially) interested in the new "VCS" are necessarily the same group as the one that drove the Gamergate thing a few years ago.
I'm sure there's *some* overlap, but I wouldn't treat the two groups as synonymous. I mean, *I'm* discussing this here because I'm interested in 80s gaming and video games. But I really couldn't give a damn- or tell you anything about- anything remotely recent, or even most things released in the past 20 years, and I have sod all in common with the Gamergaters.
While I've seen various people dispute the fine details surrounding the development of the ST and the legal issues relating to the acquisition of the Amiga, this is- as far as I'm aware- broadly correct.
Something I forgot to mention in arguing that the ST was less the heir of the Atari 800 and VCS than the Amiga was is also that it was based more around "off-the-shelf" technology with less custom silicon than the Amiga. This probably helped its quick development, although the speed of its design is still generally considered impressive. (It also reinforces the comparison- in your other comment- of the ST as being somewhat the 16-bit equivalent of the Sinclair ZX Spectrum).
There's nothing "too good to be true" about the proposed console. In fact, there's nothing much good about it at all; it's just some arbitrary hardware using (presumably emulated) VCS compatibility as a selling point, when you've already been able to do that on even the most underpowered computer for years now.
Pretty certain you're expecting too much and it's just going to be some poxy emulator-based thing anyway, not a hardware recreation. The only reason this is a problem is that they're designing a whole new modern console to run it on (which I've already ranted about in more detail here) despite the fact you could probably emulate the original VCS on anything more powerful than a musical Christmas card nowadays.
To be honest, even taking the piss out of them in this way reinforces their pretence that the modern "Atari" is remotely the same company as- or a continuation in anything other than name of- the one that buried the ET cartridges (and which made all those classic games and hardware).
Really, the sight of some minor subsidiary of the post-bankruptcy Infogrames-masquerading-as-Atari scrabbling to get funds for a completely pointless, nostalgia-exploiting console makes even Jack Tramiel's shoestring-budget Atari Corporation look good.
I had an Atari 800XL too and I feel *exactly* the same way. For all that I can get nostalgic about the system, loading games from cassette is the one aspect that remains stubbornly resistant to rose-tinted glasses.
I think it was particularly slow because the original Atari 800 was released back when 8K RAM was standard, and loading 8K at that speed would have been nowhere near as painful as the later 48K and 64K games were. I eventually bought a tape-to-disk thing which I wish I'd bought years earlier (since I had loads of tape-only budget games). Urrrrrgh.
That's ironic, as the Amiga has far more claim in terms of lineage to be the "true" descendant of the Atari VCS- via the Atari 800- than the Atari ST does.
It shared several of the same designers- after they'd left Atari and formed their own company. (#) It had architectural similarities- the copper was arguably a more powerful version of the same approach taken by the ANTIC graphics co-processor in the Atari 800 (which had in turn significantly built on and improved the architecture and approach of the original VCS design). And much like the 800 when it first came out, it used lots of custom chips to create a machine that was both state of the art and expensive.
The Atari ST was created after Jack Tramiel bought Atari Inc's former computer/console division, sacked most of the existing staff and had his own people work on an entirely new design that was far more along the lines of "Power without the Price".
(#) The major design work had already been done by the time Commodore bought it in 1984.
tl;dr - Original Atari Inc. split in mid-80s after videogame crash, its two "successors" are both themselves long defunct, PCs weren't a major factor, and today's "Atari(s)" are just what used to be Infogrames.
Longer reply- Bits of that are right, but lots isn't.
The original Atari Inc. did well in the *early* 80s... right up until the 1983 North American video game crash hit and Warners' former golden goose started haemorrhaging money. *That* was what brought about the end of the "original" (and some would argue only true) Atari.
In 1984 Warner sold off the computer/console division to Jack Tramiel- forming the basis of his "Atari Corporation"- leaving behind the arcade division ("Atari Games"), which was sold completely separately the following year.
This site suggests the PC line came out in 1987, i.e. under Tramiel's ownership. I think it was a red herring, rather than the reason for even Atari Corporation's downfall- I don't recall it being that big a deal at the time, when Atari Corp. was enjoying success in Europe with the Atari ST.
It was more the decline of the ST and the failure of their later products (including the Jaguar console) that brought about Atari Corp's enforced demise in the mid-90s when (as you suggest) it had lots of money from litigation, but no products or future. It effectively died after its "merger" with JTS (a second-rate HDD manufacturer), which was little more than a mechanism for Tramiel to reinvest Atari's monetary value in JTS.
One can argue about how much of a "true" successor Atari Corporation was to the original Atari Inc. They continued the existing products but got rid of most of the existing staff, and the "Power Without the Price" approach of the Atari ST, the shoestring operation and the general philosophy of Tramiel's Atari were completely different to those of Atari Inc. under Warner's ownership.
Regardless, both Atari Corp. and Atari Games are themselves long defunct, and anything after that *is* just exploitation of the name and IP. As far as I'm aware, the current Atari(s) are just the descendants/subsidiaries of Infogrames, which bought the rights in the early 2000s and renamed itself "Atari".
@deadlockvictim; Huh? Software distribution on CD-ROM wasn't merely the "way forward" or even commonplace by the time the iMac came out in mid-1998. It was *already* effectively standard by that point- at least on the PC- with floppies an obsolescent "legacy" option for software distribution by that stage.
The floppy was primarily there for the remaining use case that wasn't yet covered- non-read-only data transfer.
Crediting Apple for making a "statement" smacks of rationalisation of their wanting to have their cake and eat it. If they'd wanted to prove that CD *writers* or anything else were a sufficient replacement for the floppy, they'd have included it. The fact that they didn't- and the fact their users were all forced to buy floppy drives anyway- proves the exact opposite of the point they were trying to make.
Yeah, everyone knew that the floppy was due for replacement and that it could be dropped as soon as something better came along. But that "something better" wasn't cheap or universal enough at that point, and that's the only reason the floppy was still A Thing.
We didn't need Apple to figure that out.
"1998's iMac courageously did not feature a floppy drive and within the decade the vast majority of PC makers had followed suit as CD, DVD and USB storage become more prevalent."
I've always said that Apple got way too much credit for their supposed forward thinking, or for killing off the floppy drive.
You know what you saw attached to virtually every first generation, floppy-less iMac? A bloody external USB floppy disk drive in matching translucent plastic.
You know why? Because, despite the fact the 1.44MB 3.5" format was already badly dated by the late 90s, there was still no alternative that was quite cheap *and* universally-accepted enough to replace it (#). The impetus was there, but there was no candidate yet.
That wasn't Apple's fault- what *was* their fault was the choice to leave the floppy out- and to trumpet it as a plus point!- while providing no adequate alternative.
The original iMac only included a CD reader. (Writers were falling rapidly in price towards the end of the 90s, but clearly still weren't cheap enough to be included as standard in the 1998 iMac). The modem was far from a sufficient replacement when it came to file sharing- this wasn't the broadband/Dropbox era, it was the days of dial-up 56kbps access when the other person/computer having Internet access couldn't be assumed.
Pen drives didn't even exist until a couple of years later, and took a few more to be widely adopted. (If anything finally killed off the floppy, it was those).
So, external floppy hanging off the side it was then.
If the original iMac deserves credit, it's with it helping give impetus to USB adoption (which I already had on my PC, but didn't have much support at first). But killing off the floppy? Nope.
(#) No, not even the Zip drive, which was hugely successful by most standards, but still not something you'd find in the majority of PCs.
Oh, and the addendum I wasn't able to add within the ten minute limit...
The worst of it isn't just that Scotland's oil was effectively stolen and wasted, it's that the money was used to prop up the Tories that most of Scotland was- and is- actively opposed to, and to lead us down an increasingly right-wing, English-directed path that ultimately led to Brexit and Boris Johnson as Prime Minister.
That's what believing the lies of the Westminster government got us. An imminent, economically-damaging Brexit we strongly voted against, led by a calculating, self-centred, hard-right Tory with an intentionally-cultivated air of "loveable" buffoonery that let him slip past the political radar with the English as one of their own "eccentrics", but which (oddly) Scotland doesn't seem to have the same anaesthetised tolerance for.
And it was all done to us with the help of our own money.
It was revealed in 2005 that during the 1970s, the Westminster government had deliberately suppressed research that showed how hugely valuable North Sea oil was, estimating that an independent Scotland might be as prosperous as Switzerland. Both Labour and Conservative governments conspired in this.
You can see why they might not have wanted that information to be accessible to the rapidly-growing Scottish nationalist movement.
Instead, the vast majority of the money went to England and was squandered to- in effect- subsidise the Thatcher government's deindustrialisation and cover unemployment benefits, making the Tory policies look better than they were, keeping a government that was damaging to- and hated in- Scotland in power thanks to voters concentrated in the South East of England.
Even if the allegations in your comment were true rather than the usual typical weasel-worded anti-Scottish "subsidy" smear/propaganda ignorant of how government funding actually works, any figures you could allege to be involved in *that* would be made to look risibly microscopic by the de facto theft of Scottish oil over forty-plus years.
I won't hold *my* breath waiting for that back. Even if Westminster wanted to, they couldn't even begin to muster the funds involved.
If you're referring to the Enron scandal that enveloped then destroyed their accountants Arthur Andersen- which Andersen Consulting/Accenture was a spinoff from- that had nothing to do with it.
Quite the opposite, Accenture is generally considered to have been *very* lucky to have avoided guilt by association by (coincidentally) changing their name at the start of 2001, just months before the problems at Enron became public and rapidly snowballed.
Wikipedia claims that this was due to a court-ordered name change as part of an arbitration case that resulted in the final severing of ties between Arthur Andersen and AC/Accenture less than six months prior.
So, yeah, they were very lucky to have severed ties *just* before the scandal hit, but Enron wasn't the reason AC/Accenture changed their name.
The original (plug-into-the-TV) Vega console was successful, so it wasn't unreasonable to assume that this one would be too.
There's also the fact that the Vega+ (like the Vega) was only ever meant to be an emulator rather than a hardware-level recreation anyway, so it was already a solved problem; there must be countless low-powered systems that can run Linux or Android and an emulator on top of that. If- as I'm guessing- there are companies in China using these as the basis of pre-existing cheap, generic handheld designs, then that's practically an off-the-shelf solution.
It's not like the Vega+ looks much like the original Spectrum (unlike the original Vega's bizarre parody of the original Spectrum design or the proposed Spectrum Next's "Plus"-inspired design). Stick a rainbow flash and "Sinclair" logo on an existing design and it would have worked within the £500,000 budget.
In short, there's no reason the Vega+ shouldn't have succeeded. Its problems had nothing to do with hardware and everything to do with business politics, power-grabbing, infighting and incompetence.
> The campaign receives "Arrow certification", saying they have received a design review to ensure the campaign is ready for production. [Later] the "Arrow certification" that has been present since the 15th of October is revoked.
Now, *that* is potentially interesting in terms of liability, regardless of whatever attempted disclaimers Indiegogo has included in the small print.
@Phil Endecott; "There must have been some very profitable locked-in customers somewhere."
"During the 2012 Hewlett-Packard Co. v. Oracle Corp. support lawsuit, court documents unsealed by a Santa Clara County Court judge revealed that in 2008, Hewlett-Packard had paid Intel around $440 million to keep producing and updating Itanium microprocessors from 2009 to 2014. In 2010, the two companies signed another $250 million deal, which obliged Intel to continue making Itanium CPUs for HP's machines until 2017. Under the terms of the agreements, HP has to pay for chips it gets from Intel, while Intel launches Tukwila, Poulson, Kittson, and Kittson+ chips in a bid to gradually boost performance of the platform."
The Spectrum Next is a far more interesting prospect anyway. Unlike the Vega and Vega+, which are internally just generic ARM hardware running Spectrum emulators (#), the Next is intended to be an FPGA-based recreation (and expansion) of the original Spectrum design/architecture.
Also, the case looks nice.
Not saying I'd definitely buy one myself, but I'd certainly consider the possibility. It'd be a shame if its prospects were hurt by the entirely unrelated set of jokers responsible for the Vega+ mess.
(#) Since one could already do this on pretty much *any* generic hardware nowadays- Raspberry Pi, low-powered Android smartphone or generic handheld for example- many people (myself included) were questioning what the point of the Vega+ was even *before* the fiasco unfolded.
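For anyone wondering why emulation is such a solved problem: the heart of any such emulator is just a fetch-decode-execute loop like the toy one below (with a three-opcode instruction set I've invented for illustration- nothing like the real Z80's):

```python
def run(program, max_steps=1000):
    """Toy fetch-decode-execute loop with an invented 3-opcode ISA.

    A real Spectrum emulator does the same thing with the full Z80
    instruction set plus ULA/display timing on top- still trivial
    work for any modern CPU. (Opcodes here are made up, not Z80.)
    """
    acc, pc = 0, 0
    for _ in range(max_steps):
        op = program[pc]; pc += 1              # fetch
        if op == 0x01:                         # LOAD immediate (invented)
            acc = program[pc]; pc += 1
        elif op == 0x02:                       # ADD immediate (invented)
            acc = (acc + program[pc]) & 0xFF; pc += 1
        elif op == 0x00:                       # HALT (invented)
            break
    return acc

# LOAD 5; ADD 7; HALT
result = run([0x01, 5, 0x02, 7, 0x00])
# result == 12
```

Scale that up to a few million instructions per second and it's still a rounding error for anything made this century, which is rather the point.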
I vaguely remember one design that featured a horribly mismatched combination of *both* translucent panels and the then-standard PC beige(!)
(From memory, I think it was an iMac clone, but it might have been a tower. The crapness of the combination- the epitome of obvious but clueless attempts to rip off Apple's design- was more memorable than the computer itself).