Re: Could be Mandela effect
How was this a risky bug in the DOS era, a single-tasking OS built long before computers had power control?
> There has never been a technical reason for creatives to prefer Apple. Creatives prefer Apple kit for the same reason arty types use cigarette holders and wear berets.
You obviously never tried colour-calibrating Windows 3.1-era PCs — it'd be via the card-specific video driver if it was available at all — or typing much on them that wasn't directly printed on a key.
E.g. Mac emdash: option+shift+-. Windows: hold down alt and type, in sequence on the number pad: 0, 1, 5, 1. Mac, anything with an umlaut or a diacritic, e.g. ö: option+u to pick that particular diacritic, then o to type the character. Like a not-as-good version of the compose key on other systems. Windows: alt plus a four-digit code again, memorised depending on the particular character (via answers.microsoft.com, here are some of them).
Windows 95 didn't exactly solve these problems, but — believe it or not! — it crashed a lot less than contemporaneous Mac OS so in terms of getting your day done, there's a solid technical reason. In the six-or-so-year period after Windows 95 but before OS X the most likely cause for Mac needing to be rebooted was a single bad application whereas the most likely cause for a Windows PC was a bad driver.
For reference, codejunky is referring to this incident (via the Associated Press, a reasonably neutral source).
I had an M2 fail on me due to capacitor issues but rolled onto a brand new Model F (i.e. buckling springs but capacitive underneath rather than the Model M's underlying membrane) from www.modelfkeyboards.com and have had no issues so far. Though I did this only thanks to a generous employer. US$400 is, to put it mildly, more than I'd invest for myself in a keyboard.
I'm still not sure I understand the reasoning behind the trade war.
If I buy something from China then as far as I'm concerned we've exchanged things of equal value: I sent money, they sent goods. If I didn't think that was the amount the goods were worth, I wouldn't have bought them. There's a net effect of zero in terms of my total wealth, I've just made concrete some of my otherwise-unassigned liquid value. I'm not poorer.
If China becomes richer then that's because it has found an effective way to add value; it has converted goods worth $x into goods worth $y while spending less than $y - $x. And if I've opted to buy their product then probably it means they've done a better job of adding value than anyone else. So my quality of life is improved compared to what it would have been if I couldn't buy from China, because I am able to get better value for my dollar.
So what I don't understand is: given that America has just 3.7% unemployment, what's the problem that needs to be addressed here? The government needs to make my life worse, ensuring I can get less for my money, so that some people who are already otherwise employed can instead work in factories?
I struggle to understand how that's a step up, for the specific people or for the country as a whole.
If this complete failure of comprehension reveals me to be a simpleton, so be it.
(As an aside: unemployment by the same measure was 6.7% during the pandemic but has a current-working-lifetime peak of 9.9% in 2009; other highs include 10.8% in 1982 and 24.9% in 1933. The only sustained period with lower unemployment than now was during WW2 when presumably a combination of the military and armament manufacture soaked up almost all available hands)
In case anybody else is curious like I was, this Retrocomputing StackExchange post has die shots of the 486DX, original 486SX and later redesigned 486SX. Eyeballing it puts the FPU at about a sixth of the total surface area.
You're clearly desperate to dig your own hole.
My statements that the UK has grown less than the Eurozone since leaving the EU are both consistent and proven by the data.
Your response: "but the UK beat the Eurozone in three particular quarters, so haha, you're cherry picking" is nonsensical.
> Cherries are great, aren't they?
As per the linked document (and notably you're unable to provide any source for your own claims): measuring from when the UK left the EU to now, the UK has grown more slowly than the Eurozone.
So if you define cherry picking to be "starting at the date at which the UK left the EU and continuing until the end of the data" then, ummm, yeah? I mean, most people wouldn't define "using every single quarter of data that's available" to be cherry picking but I guess most people aren't working backwards from the conclusion they want to reach.
I also apologise that "Even if the OECD forecasts for 2025 play out as per the linked article — 1.4% growth for the UK versus 1.1% for the Eurozone — the UK will still be behind" seems to have gone over your head. To repeat: including all the available data plus current OECD forecasts as far as they go still leaves the UK behind the Eurozone in growth since leaving the EU.
> UK economy growing faster than the Eurozone!
Maybe over there in the land of fiction.
Per our own parliament the Eurozone's GDP is up 4.9% from pre-pandemic levels*. The UK's is up 3.4%.
Even if the OECD forecasts for 2025 play out as per the linked article — 1.4% growth for the UK versus 1.1% for the Eurozone — the UK will still be behind.
* since we left the EU on the 31st of January 2020, immediately before the pandemic, this feels like a fair point of comparison.
The original Apple ROMs are a collection of things Woz thought would be useful. The original integer BASIC is one. SWEET16 is another. Floating point routines are a third. The monitor and mini-assembler are others. The use case was just that anybody with an Apple II might want to use any one of these things.
The norm of 8-bit computers booting into BASIC (in many cases being pure raw-metal implementations of BASIC rather than anything more layered) is a later thing. You shouldn't view what's in the Apple II ROMs as being coupled to what is needed or used by its original BASIC.
Proof if any were needed that age does not imply maturity.
> You are making assertions based on third party opinions of people who were either not there at the time, did not use the products when they were first released
It is a stone cold fact that SWEET16 is not used by Woz's BASIC. This is not an opinion. It is an objective reality. It speaks to the frailty of your intellect that you don't understand that facts are not opinions, though it does somewhat explain your negligible grasp of facts.
> In fact with the first Apple II's I played with at the end of 1977 came the listing for "SWEET 16" which was the key bit of the Integer BASIC interpreter that Woz wrote for the Apple II.
SWEET16, although present in the ROM, isn't used by Wozniak's BASIC.
> Which actually worked, was clean and fast. Unlike the other BASIC interpreter shipped with the early APPLE II's. From MS. On a floppy. Which was a "complete" BASIC, very slow, used up almost all memory and had a tendency to lock up.
Applesoft, as Microsoft's BASIC for the Apple II was branded, wasn't known to be slow or to lock up at all. That's probably why it replaced Wozniak's BASIC as the ROM-resident one as of the Apple II+, i.e. starting in 1979. Being an instance of Microsoft's 6502 BASIC it's very similar in speed and stability to BASIC on a Commodore, Oric, etc.
All of those things may be true, none affects any argument I've made.
Again, for clarity: the statement "The writable optical media I own have all rot within a decade." is representative of the optical media either you or I have likely ever created but it is not true of all optical media.
Going further than that: let's take it as given that you're correct about the net benefit of magnetic. Then there's really no need to support false statements about the potential longevity of optical — which are still the only thing I'm taking issue with. Just take that issue off the table and then discard optical based on magnetic's other merits.
NIST's conclusion was based on a metasurvey of empirical scientific measurement and M-Disc is designed for archival; it is distinct from standard writable optical media — e.g. it requires a newer, specifically-compatible drive.
The underlying studies did in part compare to standard media and find it to be essentially useless.
I haven't used it, haven't tested it, etc, etc, but noting for fairness: M-Disc claims "up to 1000 years" of longevity (i.e. subject to storage environment, naturally). At the very least it's an optical medium that you and I can buy that has tried specifically to optimise for that.
The Biden-era NIST rated it as acceptable for 100-year storage.
I can't speak confidently as to its relative merits, but your claims about magnetic media aren't factual.
Both cartridges and drives are still in ample supply; based on a quick search both are available from: Amazon, Walmart, NewEgg, B&H Photo, and more. Essentially the same list as M-Disc, which I guessed to be a fair optical comparison.
> And he militantly disapproved of Free Software and Open Source, which were evil and COMMUNIST.
That'd explain some of the odd passages on the linked site; especially:
> You have ... a second-hand z/Arch (IBM mainframe) machine ... For whatever reason you have been restricted to using just public domain software, which rules out virus-licensed crap like z/Linux.
> It is the year 2023. ... You might be a girl from Slovakia named Alica Okano and you have the spirit of freedom from communist slavery flowing through your veins.
I guess the fall of communism was only 34 years before 2023?
I'd previously have argued that Brexit is a significant constitutional change, which will therefore affect the UK indefinitely whereas Americans* are strongly wedded to never, ever changing their constitution so Trump is more transitory. But he seems to be doing a very effective job of establishing that the US Constitution really only worked as a series of conventions anyway. He ignores the courts and the law, and Congress doesn't even feign autonomy. So he represents a sharp break in the constitution of the US regardless of the wording of the Constitution.
* noting, for fairness, that nowadays I also am an American. Last year I got to vote in both countries... and that's exactly as much international influence as I'll ever have.
Yeah, I think that's fair. Cameron was well-intentioned and thought he was doing what was right for the country, expecting a Major-style put-up-or-shut-up moment in which he put the debate to bed for another couple of decades. He just wasn't particularly good at politics.
Trump has never shown any indication whatsoever of being well-intentioned, but — regardless of what you or I might think about his policies — has proven to be exceptionally good at politics, having now dominated political discussion for almost a decade, increasing his level of support at every successive election (he won in 2016 with ~63m votes and 46.1%, lost in 2020 with ~74m votes and 46.8%, won again in 2024 with ~77m votes and 49.8%).
Cameron couldn't sell sanity; Trump easily sells idiocy.
It surely depends how you measure the scale of f***-uppery and against what intentions.
Cameron falsely believed himself to be good at politics rather than a flyweight who happened to have won the old boys' network lottery, but at least he only destroyed the future of a single country.
Trump's fault isn't that he falsely thinks he's doing the right thing for the country, it's that he wants the daily narrative of him versus the elites for the benefit of his propagandists — both the usual media outlets and the codejunky types trolling social media. He has no obvious concern for the country or the rest of the world whatsoever. So on that basis of his actual objectives he is succeeding massively, but on the other side he's going to drag a huge proportion of the world with him through this stupidity.
I've actually had the native client stop functioning with Outlook until I went through the palaver of resupplying my password a few times over the years — to the point that reporting such as this was the very first thing that made me even consider that Microsoft might be at fault here.
Once again reentering my password worked, so no big deal. I'm still a lot happier than I am with the Outlook application — whether in iOS or on the desktop — which I am required to use for my work account.
I cannot speak for the original author, and think we might be grossly overestimating him if we assume any coherent rationale, but recreating a Windows 95-compatible version of Win32 would allow a lot more in terms of era-relevant entertainment content.
Windows 95's 'incomplete' runtime fault detection acts as accidental fault tolerance for many titles — in Windows 95 passing a NULL here or there to DirectX gets a free ride whereas under NT you'd get the page fault that one would ideally hope for.
Implementation of PutChar before somebody on Reddit very slowly explained the idea of testing ranges of values:
```
UINT32 index;
if (ch == ' ')
{
index = 26 * 16;
}
else if (ch == '!')
{
index = 27 * 16;
}
else if ... etc for another 42 if statements ...
```
Author's excuse: "Be aware that i wrote this code while i was multitasking and i am terrible at multitasking. Thanks for the idea, and i will surely rewrite that portion of the code."
No matter what his potential, right now this seems to be an enthusiastic child running fast towards the buffers.
> Didnt Maddow [get*] Trumps tax returns which shows he pays more than that socialist Bernie?
In 2017 Maddow compared Sanders in 2015 to Trump in 2005 because that was the information then available. She found that Sanders' ~13.5% rate on declared income of around $205k was substantially less than Trump's ~25% on declared income of around $152m. I can't speak as to bracketing and therefore what either of them 'should' have paid but it is clear that in 2005 Trump most definitely paid his taxes.
However, when more of Trump's tax returns were leaked in 2020 the NY Times was amongst those observing that Trump paid no tax at all for ten of the fifteen years leading up to 2020 and paid only $750 (not a typo; no suffix intended) for two of the other five. In all cases because of declared losses.
> That's all very nice, but you don't *seriously* believe that's where the money'll end up in this case, do you?
If we're having a laugh at comments that imply naive assumptions, what makes you think Trump's inevitable rich-people tax cuts will be based on money that's coming from anywhere?
The federal deficit in 2016 was $585bn, having increased from $459bn in 2008. Excluding 2020 for the COVIDness of it all, by 2019 Trump had managed to increase that to $984bn. Not as bad as W Bush who managed to turn a $236bn **surplus** into a $459bn **deficit** but in the same ballpark.
The negligible cuts he and Elon are pretending add up to anything plus whatever he gets from tariffs will make no difference whatsoever to the tax cuts he doles out, and the federal deficit is only going to increase. The GOP is provably the party of fiscal irresponsibility.
[2020's number, for the interested: $3,132bn. But, again, too much global noise to attribute Trump with causing the 2019–2020 change]
> C++98 certainly depended on manual memory management.
`std::auto_ptr` existed in C++98 but has the ignominy of not only having been deprecated since but actually removed — removals are extremely rare in C++ world and more or less flag that something was an active hazard.
(In this case, it's because std::auto_ptr had std::unique_ptr-style single-owner semantics but, without std::move to flag programmer intent, it moved ownership upon assignment. So `=` was a _move_ operator. Which really screwed with generic code, since it's pretty normal to expect the assignment operator just to assign, i.e. to set the value of the thing on the left without mutating the thing on the right.)
Some maniac seems to be hitting the down vote button on a bunch of posts that make this point so: here's the Rust compiler's repository. Everything I looked at was Rust.
The std::unique_ptr, after being nulled because what it was holding has been moved, is in a valid state and can be set to point to something else if you like — whether from a raw pointer via .reset, by assignment from some other std::unique_ptr that you've moved from, or via std::make_unique.
There's nothing unsafe about that. And I'm unclear what you'd even mean by setting a std::unique_ptr to nullptr as distinct from setting it to point to nullptr.
(Though, for clarity: I'm only in this conversation because I think what C++ does has been misdescribed, giving a misleading impression. I'm insufficiently-experienced in the latter to have a meaningful opinion on what Rust does better or worse.)
> And no compiler vendor presented a propiertary solution (but plenty of propiertary solutions for other stuff).
Apple supported a garbage collector for C; you marked your pointers as __strong (i.e. owning) or __weak (i.e. automatically nilling) to opt in. It was primarily marketed as being for Objective-C but worked in regular C too.
It wasn't supported on iOS though, so failed to be at all relevant during the sudden explosion of Objective-C programmers circa 2008, and was deprecated in favour of automatic reference counting, which unlike garbage collection is for Objective-C objects only.
All runtime support is long-since gone; it was removed about a decade ago.
> The borrow checker and smart pointers/references are not the same. In C++ you can use variables after they were moved ... The latter is problematic: the programmer can forget to move the pointer, can forget to zero the moved pointer, and last but not least, a zeroed pointer delays the detection of the problem to runtime.
Since he's talking about smart pointers, I think the poster is referring to moving things only via std::move and r-value refs, in which case it is a requirement that anything moved from still be in a valid state. You don't need to zero anything manually; if you std::move from a std::unique_ptr then the latter is guaranteed to be set to nullptr by the language specification.
Of course other, more informal, meanings of 'move' remain possible — including anything you want to do with a raw pointer obtained via either .get() or .release() — and the compiler can't currently verify that your custom type with your custom low-level code is obeying the 'must be in a valid state after being moved from' rule. So those are problems, especially in legacy code.
Apple is rumoured to have sold around 500,000 Vision Pros so clearly the "throw $3,500 at something regardless of utility" crowd might not be massive but it definitely exists.
Of the two dissimilar products, I guess the Apple has greater brand draw for wealthy types but the Huawei might make up for that by having slightly more purpose?
John Peel was on Radio 1 right until his death, with a quick search indicating that his final Radio 1 broadcast was on the 14th of October 2004.
On the one hand, 2004 feels more recent than I'd have guessed, but on the other I used to record his broadcasts to Minidisc and try to cut out the particularly-good songs later. So that dates it, and me, somewhat.