Re: Get off my lawn, etc.
Ah, interesting stuff- thank you.
I remember HOTOL- and when they axed it, unfortunately.
You think the Spectrum tape loading was bad? My Atari 800XL did it at 600 baud and I had quite a few tape-only games that took c. 15 minutes (if not longer) to load(!)
Fantastic when you had 5 blocks out of 170 (or whatever) remaining and you got the dreaded "LOAD ERROR - TRY OTHER SIDE". :-(
(Which is why I appreciated the disk drive all the more and wished I'd bought that tape-to-disk transfer package years previously!)
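(If anyone's wondering where times like that come from, here's a rough back-of-the-envelope sketch- the 600 baud figure is nominal, and framing bits, checksums and inter-record gaps only made real loads slower:)

```python
# Rough sketch: approximate load times at a nominal 600 baud, assuming
# ~10 bits on the wire per byte (start/stop framing). Real cassette formats
# added gaps and checksums on top, so actual loads were slower still.
BAUD = 600
BITS_PER_BYTE = 10
bytes_per_second = BAUD / BITS_PER_BYTE   # ~60 bytes/s

for size_kb in (16, 32, 48):
    seconds = size_kb * 1024 / bytes_per_second
    print(f"{size_kb:2d} KB: ~{seconds / 60:.1f} minutes")
# ~4.6, ~9.1 and ~13.7 minutes respectively - so "c. 15 minutes" is entirely plausible.
```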
As for type-in listings... I know some people enjoyed correcting the bugs introduced by their own mistakes, but I always found the process gratingly tedious, even for the ones that had checksums to make sure each line was correct.
That sounds pretty impressive (and something that might have been genuinely useful back then rather than just some proof of concept fun).
Did you manage to sell many? I'd have thought that the market for Acorn Atom peripherals was already pretty much dead commercially by 1985, even though it was only a few years old then. (Wikipedia says the computer itself was discontinued in 1982 and I get the impression it was quickly overshadowed by the likes of the BBC Micro and other shiny new machines).
Now he's had his fun, it's time to put those 512 floppy drives to serious work... create a RAID array of them.
It'd be even more impressive than the one that other person did a while back.
With 512 drives, it'll be able to hold as much as a CD-ROM, and I'm sure that Big Data and many other businesses will be interested in a giga-scale solution like that.
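(A quick back-of-the-envelope check of that claim, using the usual nominal capacities:)

```python
# 512 HD floppies vs one CD-ROM, using nominal "marketing" capacities.
DRIVES = 512
FLOPPY_MB = 1.44   # 3.5" HD floppy
CD_MB = 700        # typical CD-ROM

print(f"{DRIVES} floppies: ~{DRIVES * FLOPPY_MB:.0f} MB vs a {CD_MB} MB CD-ROM")
# ~737 MB - so the 512-drive "array" really does just edge out a CD.
```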
(Disclaimer: I'm not just old enough to remember when 700MB per CD *was* a lot of data. I'm old enough to remember when the 1.44 MB you could store on a *single* 3.5" HD floppy- or even 720 KB on a DD one- was a lot compared to the 120 KB per side my Atari 1050 5.25" drive handled. And I'm old enough to remember when even *that* was almost a luxury and infinitely preferable to the horrendous tedium of loading from cassette.)
Weren't those "hard-left/hard-right" early stereo mixes designed with 60s-era "stereograms" in mind, i.e. the ones with built-in speakers two feet apart that needed all the separation they could get?
They obviously sound rubbish on a modern system- and even worse through headphones- but it's surely unfair to damn stereo by that standard, since stereo mixes got rapidly better by the early 70s?
Vivaldi is apparently proprietary despite being based upon Chromium (much like Chrome is).
And I don't see any evidence that the free software movement is any worse for this than proprietary software.
If anything, it's the opposite, with the latter presenting self-serving, self-promoting suggestions to the customer, and obvious and unsubtle "nudge" tactics.
When free software does this sort of thing, it's often the result of a commercial imperative or business model based on free software (since the two aren't inherently exclusive).
Regardless, you need your head examined if you think the free software movement is the worst offender stopping you from "doing what you want with your own computers".
Remember that time Microsoft were practically trying to force-upgrade Windows 7 and 8 users to Windows 10? They weren't just ignoring refusals- frequently "forgetting" them, asking again, or even upgrading without asking- but going so far as to intentionally override software designed and installed specifically to make MS respect that refusal(!)
That alone went beyond any remotely plausible benefit of the doubt, and that's before we even get into their use of "dark patterns", i.e. ignoring the standard Windows convention that the close button means an implicit "cancel" and instead treating it as "Confirm/OK" for the unwanted upgrade.
Oh, and ask Apple about getting to do what you want with "your" hardware.
I remember Mitsubishi Motors had a big scandal around fifteen-to-twenty years ago when it was revealed they'd been covering up vehicle defects (which, according to the article, went back decades).
Yes, I know that technically that was a different company in the same group, but it wouldn't surprise me if there was a commonality of culture. It also shows that things like that can hang about vaguely in the consciousness, damaging a company's reputation for a long time.
Incidentally, that article reminded me that Mitsubishi Motors stopped selling cars in the UK last year. I wonder if that's because their market share was affected by the scandal? (They were never as common as (e.g.) Ford or Nissan, but there seemed to be more of them about decades ago).
I only ever tried out Gopher briefly circa 1994-5 (when the Web was already the Next Big Thing and- in hindsight- rapidly displacing Gopher). I almost forgot about it, and later assumed that it had been one of those pre-web, pre-Eternal-September-era technologies which disappeared after that mid-90s transitional period (a la Usenet (*)- which I *did* use a lot- and telnet/text-based BBSs).
But I found out just a couple of years ago that Gopher wasn't actually older than the web- they're pretty much contemporaneous!
Wikipedia says the "Gopher system was released in mid-1991"- i.e. around the same time the Web itself first appeared.
Apparently Gopher's adoption was damaged by the announcement in early 1993 that licensing fees would be charged for implementations of the protocol, though I suspect the Web's more flexible design would ultimately have won out regardless.
(*) Yes, I know it's still used for binaries, but as far as its original purpose- news and general discussion- goes, it's been dead for a *long* time.
> PC was a term reserved for the IBM monster at the time
No, it wasn't. The terms "personal computer" and "PC" predate the IBM PC by several years and originally referred to *any* personal computer. (*)
IBM simply used the pre-existing term to denote *their* take on the personal computer.
Later on, "PC" became more associated with that specific personal computer- most likely because it was in the name- and a de-facto synonym/abbreviation for "IBM PC compatible". The term *was* used less for other machines after this, most likely to avoid any confusion. But that wasn't the case "at the time" of the IBM PC's launch or for several years afterwards and- strictly speaking- still isn't.
"Home computer" was a type (i.e. subset) of personal computer generally aimed specifically at the home user- back when that was a distinct market segment with its own formats- and not a different thing altogether. (Similarly, "microcomputer" and its long-obsolete abbreviation "micro" overlapped with both back then).
(*) For example, the magazine "Personal Computer World" started in 1978- a full three years before the launch of the IBM PC- and continued to cover a wide range of formats until the IBM PC compatibles became dominant anyway.
The new Mikes re-entering the room with the old ones still there reminded me oddly of that Michel Gondry video for "Come Into My World" that had multiple Kylie Minogues interacting with each other.
"How on Earth is there not just a simple government portal for that?"
Because the US is even further down the road of regulatory capture than the UK is, and companies like Intuit who profit from the system being complex lobby to keep it that way.
At the risk of stating the obvious, the US is run for the benefit of corporations, not the people- who can be lied to at election time or ignored- since there are only two meaningful choices, both in the pockets of big business (*), and most people can be held to ransom on the basis that they won't risk the "bad" (i.e. other) one getting in.
(*) Regardless of what risible, foaming-at-the-mouth right wing Americans think about "leftist" Democrats. If those gun-shaggers want to bleat about being forced to live under an Obama/Biden/Democrat government as "communist", I'd *love* to see them forced to live under a "real" communist system like North Korea to see if they still couldn't tell the difference.
No doubt about that, but it was the economics of this specific case I had in mind. As far as that's concerned, barring the most ludicrous increases in the price of copper (i.e. several orders of magnitude), this technique would- assuming it works and everything else remains equal- still be economically worthwhile.
The conclusion notes that the price of copper is rising and that this use may further increase prices (*), undermining the economic case.
Except that, as the article already noted, "copper comes in at around $10k per metric tonne, while gold is over $62m per metric tonne"
So currently copper is only around 1/6,000 the price of gold. Even if that went up ten times(!), it would still cost a fraction of one percent of the gold currently used- i.e. close to nothing in comparison- and shouldn't affect the economics significantly.
(*) Only, as others have already pointed out, it probably won't.
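For what it's worth, the arithmetic (a quick sketch using the article's rough per-tonne figures):

```python
# Quick sketch using the article's approximate per-tonne prices.
COPPER_PER_TONNE = 10_000        # USD, approx.
GOLD_PER_TONNE = 62_000_000      # USD, approx.

print(f"Copper is roughly 1/{GOLD_PER_TONNE / COPPER_PER_TONNE:,.0f} the price of gold")
# -> about 1/6,200

# Even a tenfold rise in the copper price barely registers against gold:
print(f"At 10x: {10 * COPPER_PER_TONNE / GOLD_PER_TONNE:.2%} of the price of gold")
# -> ~0.16%, i.e. still close to nothing in comparison.
```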
Let's remember that James Dyson supported Brexit having already shut his factory in England and moved production to the Far East a decade earlier.
And who- *after* having supported the ra-ra nationalism of Brexit and puffery about promoting Britain- announced he was moving the company's HQ from the UK to Singapore.
The person who was painted by his Brexiteering chums- and himself- as a hero of British industry while being in reality everything that was wrong with it.
Odious excuse for a human being sells overhyped, overpriced plastic tat? No surprise.
"ME was built on Win9x."
Exactly. I remember that, before it launched, the *original* plan was for the next NT-based version of Windows (which ultimately became Windows 2000) to entirely replace the DOS-based line and become the "mainstream" version of Windows for all users (*).... except that they never quite managed that.
Compatibility issues and the like (IIRC) meant Windows 2000 wasn't quite ready to take over, and it took a little longer- until the NT-based Windows XP came out- before they were able to entirely ditch the DOS-underpinned versions.
In hindsight, I assumed that was the only real reason for the (still DOS-based) Windows ME's existence and why it was so pointless and short-lived- it was little more than a stopgap and backup plan.
(It might also explain why Windows 2000 has a more "consumer-friendly" style of name that *sounds* like it's the direct replacement for 95 and 98).
(*) Something which its Wikipedia article appears to confirm I remembered correctly.
"This was in the mid-90s when "hacking" was Very Naughty."
People used to complain that "hacking" had been distorted from its original 60s and 70s "hacker culture" definition to instead mean someone doing illegal/dubious security-related things (a la your 90s definition).
It's ironic that even *that* meaning has been degraded and redefined in an undignified way, a la "life hacks" or (e.g.) "menu hack", the latter being where you use various techniques (e.g. asking the staff for unlisted items or combining existing ones) to create something not on the regular menu.
I saw a billboard literally this morning where McDonald's informed us that we no longer need a "hack" to get a Chicken Big Mac.
FFS.
I wonder whether the regulatory authorities would view it that way?
I also wonder whether she was advised- by someone who properly knew what they were doing, legally speaking- that this wording would protect her from responsibility and from any consequences.
Or whether she simply assumed that she'd be automatically protected if she threw in any old disclaimer?
The only reason- as far as I'm aware- they ask you to enter a password twice is to ensure that you didn't inadvertently make a typo. (*) So allowing the user to cut and paste from the first one would defeat the whole point.
(*) This isn't a big deal when simply logging on- as you can try again- but if you're changing your password and set it to something other than what you intended, you could lock yourself out. On the other hand, that's the only time I can recall *having* to enter it twice anyway. Unless Sainsbury's require it for a regular login?
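A minimal sketch of the logic involved (purely hypothetical- I've no idea how Sainsbury's actually implement it):

```python
# Minimal sketch of why the "confirm password" box exists: it only catches a
# typo if the second entry is typed independently. If the form let you copy
# and paste the first box into the second, any typo would be copied along
# with it, and the check would prove nothing.
from getpass import getpass

def choose_password() -> str:
    while True:
        first = getpass("New password: ")
        second = getpass("Confirm password: ")   # must be re-typed, not pasted
        if first == second:
            return first
        print("Passwords don't match - try again.")

if __name__ == "__main__":
    choose_password()
```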
Was going to point out that bananas are a flawed example, since your body excretes any excess potassium, so you don't end up with more potassium overall- including the small but constant fraction of it that is the radioactive isotope (*)- than you had before. But I notice the article itself mentions that anyway.
(This is unlike other radioactive substances, which can build up in the body, often in place of chemically similar but non-radioactive elements from the same column of the periodic table).
(*) The half-life of the radioactive isotope (potassium-40) is- IIRC- around 1.25 billion years, so decay of the "old" potassium in your body will have an utterly negligible effect on the timescale we're discussing here.
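To put a rough number on "utterly negligible" (a sketch assuming the ~1.25 billion year half-life above):

```python
# Fraction of the potassium-40 already in your body that decays over a human
# lifetime, assuming a half-life of ~1.25 billion years.
HALF_LIFE_YEARS = 1.25e9
LIFETIME_YEARS = 80

fraction_decayed = 1 - 0.5 ** (LIFETIME_YEARS / HALF_LIFE_YEARS)
print(f"Fraction decayed in {LIFETIME_YEARS} years: {fraction_decayed:.1e}")
# -> about 4.4e-8, i.e. a few parts per hundred million. Negligible, as said.
```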
Stack Overflow is the original website for programmers; Stack Exchange is the parent network that includes countless other sites, some of which are under the "stackexchange.com" domain (e.g. english.stackexchange.com, unix.stackexchange.com, etc.) and others which aren't (e.g. askubuntu.com, serverfault.com).
To be fair, I only know this because I always got them confused myself and (coincidentally) looked it up around a week ago.
"Now a lego kit makes one thing and one thing only."
Not true. Apparently they were moving in that direction around twenty or so years back, but stopped doing that, and none of the sets I bought my niece or nephew in the past few years have been like that.
Yes, some of them do have some fancy, semi-custom parts, but that was the case even when I was a kid in the early 80s.
Or did you mean that they only come with instructions for one thing to build? I'm not sure whether that's correct or not (though they do sell some explicitly '3 in 1' sets), but you can build countless things with Lego regardless of whether or not it's in the official instructions.
*Exactly* how I felt about that self-righteous, self-serving fake concern and attempt at moral ransom.
And as regards
his startup collected “public data from the open internet.”
Even if something is publicly accessible (*) online, that doesn't automatically grant them carte blanche to do whatever they want with it. But then, they already know that damn well, and they're resorting to this weasel-worded nonsense to defend the indefensible.
As you say, "scum".
(*) And that doesn't cover cases where they had to click through- and agree to- terms and conditions, and/or log in (again, having agreed to conditions when they signed up) to access that data.
Sorry, I should have been clearer; I meant it would (AFAIK) take 2 bus cycles if the data was already available internally, presumably held in the 32-bit data registers. I also notice the Wikipedia article says that the main ALU is only sixteen bits (though apparently there are another two that are used for address calculations).
That said, you sound a lot more knowledgeable about such things in general than I am anyway(!)
FWIW, in light of what you said I also noticed that the Wikipedia article claims that "Motorola termed [the 68000] a 16/32-bit processor".
And the "ST" in the 68000-based "Atari ST" also supposedly stood for "Sixteen/Thirty-Two" (though some suspected it was actually "Sam Tramiel".) I'm guessing the less-well known "TT030" workstation is so-called after the "Thirty-Two" bit 68030 CPU...
"I'm pretty certain that the original 68000 only had a 16 bit data bus"
That's correct; the original 68000 was 32-bit internally, but only had a 16-bit data bus. It was generally considered a 16-bit processor anyway (the Atari ST and earlier Amigas were all commonly called "16-bit", and the 68000-based Mega Drive's badge said "16-BIT"). It wasn't until the 68020 that the line got a full 32-bit data bus.
While we're on the subject of the "8 or 32 bit" QL, it's worth remembering that even the original IBM PC and PC XT *didn't* actually use the 16-bit 8086; they used the 8088, the cut-down version with (again) an 8-bit data bus. So although some people complained about Sinclair characterising the QL as a 32-bit machine, it was probably as legitimate as calling the original PC 16-bit.
(Some have also noted that the 68000 needed two bus cycles for 32-bit transfers- and the 8-bit-bus 68008 presumably needed four- so one can start nitpicking this, but... yeah.)
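If you do want to nitpick, the sums are simple enough (a rough sketch counting only external data transfers, ignoring instruction fetches, wait states and everything else):

```python
# How many external bus transfers one 32-bit data operand needs, purely as a
# function of external data-bus width (ignoring fetches, wait states, etc.).
from math import ceil

def bus_transfers(operand_bits: int, bus_width_bits: int) -> int:
    return ceil(operand_bits / bus_width_bits)

for name, width in (("68000 (16-bit bus)", 16),
                    ("68008 (8-bit bus)", 8),
                    ("68020 (32-bit bus)", 32)):
    print(f"{name}: {bus_transfers(32, width)} transfer(s) per 32-bit operand")
# 2, 4 and 1 respectively - hence the "16/32-bit" (and "8/32-bit") labels.
```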
The Register's own 30th anniversary article for the QL touched upon this:-
[Chief Design Engineer David Karlin] chose the 68008 [which] was built for backwards compatibility, so it featured an external 8-bit data bus and 20-bit addressing. The 68000 [had] a 16-bit data bus and 24-bit addressing. [..]
“I don’t know how robust the decision was,” David admits now, “but it seemed fairly clear that the 68000 series would be a great platform [..] The problem with the 68000 was basically a pin-count issue. Motorola was pricing it gigantically higher than the 68008, double or treble the price. It was sufficiently high I didn’t even argue about it.”
Rivals’ use of the full 68000 would later come as something of a surprise. If Sinclair couldn’t afford the 68000, how could they? Today, David blames Sinclair’s negotiating skills, not just for the CPU but for a whole variety of logic chips and add-ons: “I question how good we were at purchasing, because people like Amstrad, certainly the Japanese, certainly Apple, who did not have gigantically higher volumes than us at the time, got massively lower prices.”
During 1983, it has been claimed, Motorola cut the price of the 68000 to below what Sinclair had agreed to pay for the 68008. Renegotiating the purchase contract might not have been costly, but adding in the architecture the full 68000 required [..] would have been, so it was decided to stick with the 68008. [..] It’s easy to say Sinclair would have been better off going with the 68000 after all, but only with the benefit of hindsight.
Not sure about the Ceefax analogy, but that's otherwise broadly correct AFAIK.
Since the CPU was used to generate the display, the ZX80 could only output an image when things were otherwise idle or waiting for input (i.e. essentially the same as a ZX81 that could only operate in "FAST" mode).
The WAIT/NMI logic on the ZX81 gave the user the choice of a continuous display at the expense of speed (hence "SLOW" mode) by interrupting the processor whenever it was needed to generate the active parts of the display. As you say, you could always switch back to the full-speed non-continuous display via "FAST" mode.
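Very roughly (a sketch with illustrative figures only- the usual rule of thumb is that SLOW mode leaves the Z80 with around a quarter of its time for your program, which is why it runs roughly four times slower than FAST):

```python
# Illustrative only: the ZX81's FAST/SLOW trade-off as a crude CPU-share model.
# The 0.25 figure is the commonly quoted rule of thumb, not a measured value.
CPU_SHARE = {"FAST": 1.0,    # full speed, but no picture while computing
             "SLOW": 0.25}   # continuous picture, at the cost of speed

def runtime(mode: str, fast_seconds: float) -> float:
    """Approximate wall-clock time for a job that takes fast_seconds in FAST mode."""
    return fast_seconds / CPU_SHARE[mode]

print(runtime("SLOW", 10))   # ~40 seconds for a 10-second FAST-mode job
```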
It was just "Sinclair" in the UK, although they were manufactured under contract by Timex. "Timex Sinclair" was the brand used for the otherwise near-identical North American version of the ZX81, known as the Timex Sinclair 1000 or TS-1000.
Interestingly, there was also the obscure Timex Sinclair 1500, which was also essentially a ZX81, but with a Spectrum-style silver case, a rubber keyboard and 16K of RAM onboard.
(Apparently a complete flop- most likely because it didn't come out until mid-1983 and cost $80 at a time when numerous more advanced machines were available, and the US market had become so competitive that many of those were being driven below the $100 mark).
There was also a Timex Sinclair 2068 which was based on the ZX Spectrum, but with a number of improvements that made it incompatible with the latter. (And which didn't help it succeed in the aforementioned cutthroat US market, which Timex were forced out of shortly afterwards).
Timex's Portuguese arm also released the Timex Computer 2048.
Indeed. As far as I'm aware, the hardware design of the ZX81 is almost the same as the ZX80's at a logical level (aside from the addition of WAIT/NMI). The biggest change is that much of it was re-implemented via a single ULA.
Even so, the similarities were such that you could (almost) convert a ZX80 to a ZX81 simply by replacing the original 4K BASIC/OS ROM chip with the same 8K ROM that the ZX81 used. Sinclair sold this as an official upgrade, complete with an updated keyboard overlay.
The only thing apparently missing was the non-flickery ("SLOW") display mode, which the ZX81's aforementioned WAIT/NMI hardware upgrade was needed to implement.