Or enough time to read it...
2578 posts • joined 23 May 2011
Kudos to those doing the hard graft necessary to find downright malicious code rather than taking the easy option of running a static analyser over the code and grabbing some headlines with the large number of results.
It's the difference between finding blood and proving it's human blood that belonged to the suspect or victim.
Do patent inventing algorithms dream of electric sheep, now?
"Do they last for the "life" of the machine, until its next reboot..."
I don't know how Australian law works, but patents are typically granted for 20 years, subject to renewal. It's copyright that depends on the lifetime of the author.
"it may be perfectly reasonable to exclude those inventions that have not been devised by a human inventor."
In which case, legislators need to change the law. The judge hasn't made a decision on the basis of what he thinks is beneficial for society. That's not his job; that's what we elect people for. His role is to determine the law. And he has said Australian law doesn't prohibit machine-generated patents. Maybe he has erred. But if not, then it's not his fault that's allowed.
Hard drives at Autonomy offices were destroyed the same month CEO Lynch quit, extradition trial was told
Since it's the only way to differentiate in a Chromium-dominated market, Vivaldi 4.1 introduces 'Accordion' tabs
You, too, can be a Windows domain controller and do whatever you like, with this one weird WONTFIX trick
Re: This is a global problem
"What are Ofcom going to do about a cloud of satellites launched elsewhere"
They'll apply the usual regulatory sanctions. After all, these are economic ventures aimed at turning a profit. They want to be able to operate in the UK.
And if the operator doesn't have a UK presence, then there's the ITU. Even if they haven't got their act together over satellites, I'm sure there are mechanisms via which they can be sanctioned - no nation wants to end up with international telecommunications curtailed because of a bunch of pirates located in their territory.
Also governments can talk to governments. Beaming random interference over a nation is a hostile act. I imagine the US government/FCC would lean on its operators to shut down transmissions over the UK. For other governments, it will depend on how much they want to up the temperature. But if they're prepared to do that to us, then we, or our allies, could do the same back. It benefits everyone to have some order. The only way I can see some fly-by-night operator getting away with flagrant violations of reasonable licensing restrictions would be if it was a John McAfee wannabe operating a handful of satellites out of a tropical island.
You Keep Using That Word; I Do Not Think It Means What You Think It Means
"It's also a display of Keynesian economics at its finest..."
You might need to look up what Keynesian economics is, because it's not what you say it is. Although I guess you're getting it wrong with a touch more class than saying "socialist economics" or "Marxist economics".
Financial Ombudsman Service to ditch tech heads as it opens arms to Workday and outside service provider
"the Surface Neo is still officially positioned as a work in progress, with no word yet on whether Intel's move will change anything on that front."
Have you got this the wrong way around? Has the Neo been cancelled, behind the scenes, and that's why Intel has pulled the plug on the chip?
Bonus points: Intel publicly cancelling the chip before Microsoft publicly cancels the Neo means Microsoft can use the public announcement of the cancellation of the chip as the public-facing reason why the Neo has been cancelled even though, privately, the only reason the chip has been cancelled is because Microsoft have pulled their contract without making this private knowledge public and, as now, both privately and publicly, the chip has no buyers, Intel have announced the cancellation publicly, allowing Microsoft to tell the public the Neo has been cancelled and use their behind-the-scenes private cancellation of the Neo as the justification for its public cancellation. Simples.
And if I managed all those "publiclies" without a single "pubicly" I've done well.
Microsoft defends intrusive dialog in Visual Studio Code that asks if you really trust the code you've been working on
Re: Over the hill
Sooner or later you run out of people to teach you and have to figure it out yourself. Being taught is nothing more than a leg up. And people who are taught don't always understand why it should be done like that. If you've tried it, you know, and know when you can cheat. And I certainly don't think people who are taught produce better code. These days they often produce worse code because they haven't grown up with the machines and don't understand that a CPU is actually going to have to execute what's been written.
I can sympathise with finding old code brilliant and atrocious, or an amalgam of both; the impossible made possible before your eyes. Let's face it: for most of us, the code is a prototype that should be thrown away and rewritten (and then thrown away and rewritten again, because you succumbed to second-system syndrome). But that's just not possible. And the rest of the time the code is a quick bodge on top of a prototype that should have been discarded.
Within those limits, it's often pretty good. The bad habits acquired on 8 bit micros are rarely visible. It's well factorised. It's almost like looking at a codebase written by adults. And there are inspired flashes I think I'd struggle to match. No wait, I've just had an idea...
Vulcan is an old WIMP
The OP's point was "Maybe that’s where Vulcan went?" But Vulcan couldn't go somewhere because, as you say, it never existed and could never have existed.
Theia has a reasonable chance of having been real; the best competing theory has the Earth-Moon system formed from a head-on collision between two even bigger planetary bodies.
But thanks for reminding me of Vulcan. Next time we have an argument about dark matter I will bring it up because it's almost an exact parallel: an anomaly that is most simply explained by an unobserved mass which turns out actually to be an error in our theory of gravitation.
"I was the future, once"
Yeah, I'd've said D was O(two decades) old. That makes it the generation before Go (2009), Rust (2010), Kotlin (2011), and Swift (2014).
It was up and coming for a while when C++ development had stalled. I played around with it and it looked like a genuine contender. Then C++11 unblocked the pipes and D never offered enough to justify the jump. And at this point, I can't see it gaining mindshare. I'm sure aficionados will keep it alive; but it's not something I'd trust a codebase to.
EDIT: Systems languages get taken up when they offer new "safeties". C++ offered type safety (and cleanup safety via destructors/RAII). AIUI Rust offers memory safety and race safety. D doesn't offer any new safeties.
Oh dear, Universal Windows Platform: Microsoft says 'no plans to release WinUI 3 for UWP in a stable way'
Hubble telescope in another tight spot: Between astrophysicists sparring over a 'dark matter deficient' galaxy
Produce the particle or move off the pot
Dark matter is a beautifully simple explanation. I worked on it as a student. It's missing one crucial thing: the bloody particle(s). And since GR isn't renormalizable, you start to think maybe what's missing is our understanding of gravity at scale. With the right tweaks maybe we can solve two problems with one theory. And we have form on getting gravity wrong.
Which is not to say the dark sector couldn't be out there. But for many years now, I've felt dark matter resembles epicycles or the new aether, and in a hundred years' time students will look back and think "why did people ever believe that crap?" and think we're all idiots. Clearly, we're not idiots. But maybe we should be looking harder at alternatives. They aren't perfect, but that's because they are mathematically challenging and under-researched. We've gone for the easier solution; the universe may have other ideas.
(For readers unfamiliar with the details, Sabine Hossenfelder has a recent blog which runs through some of the arguments and some of the problems.)
A stench more pungent than the odour of any stink bomb a comics shop might sell
Company directors have a legal duty under the Companies Act 2006 (“the Act”) not to file false information on Companies House. Knowingly or recklessly delivering information or making a statement to the Registrar of Companies that is misleading, false or deceptive is a criminal offence under s.1112 of the Act and can lead to imprisonment and/or a fine.
Although the article goes on to say that "Inadvertently filing inaccurate information is unlikely to breach s.1112 of the Act." I imagine forgetting to update your filings would be the same - especially in the panicked response to a pandemic.
The most relevant point is the computer has only been sat in space for one decade, not three. And as it's not been turned on, there's no chance of damaging current surges or voltage spikes; it's only at risk of cosmic rays having carved a few bonus tracks. But being 1980s design (aka built like a brick shit house), the parts are going to be pretty robust against that.
The typical cosmic ray is a proton. 90% are. Whereas alpha particles (helium nuclei) account for a miserly 9% of cosmic rays. [It did surprise me that they outnumber electrons.]
X and gamma rays flooding the system aren't included in the cosmic ray tally. They're just part of the general shit that is "space". But I'd hazard the most damaging X rays are those caused by charged particles going splat onto a transistor. (Bremsstrahlung)
Excuse me, what just happened? Resilience is tough when your failure is due to a 'sequence of events that was almost impossible to foresee'
Re: Partial Failures
I didn't mean to imply it was impossible; only that it was impractical.
I guess the answer to dodgy memory is read-back-and-verify. You sometimes used to have to do that with peripherals. Although I suppose it gets more tasty if there are caches between the CPU and main memory.
For arithmetic, you could perform the calculation multiple times in different ways and compare the results. But I'm not doubling or trebling the size of the code, and wasting all that extra time, just in case I've got a bum CPU. (Particularly if I've already spent a lot of time optimising the calculation to minimise floating-point rounding errors.) In the real world, you can't write code against the off chance that the CPU is borked.
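To make the objection concrete: here's a minimal sketch (in Python, for brevity; the comment itself is language-agnostic) of the "compute it twice, differently, and compare" idea. The function name, the choice of summation as the example, and the tolerance are all mine, not anything from the original discussion.

```python
import math

def redundant_sum(values, rel_tol=1e-9):
    """Sum the values two different ways and cross-check the results.

    If the two answers disagree by more than plausible rounding error,
    something in the hardware (or below us in the stack) may be broken.
    Note the cost: double the work, double the code, plus the awkward
    job of picking a tolerance that separates rounding from corruption.
    """
    forward = 0.0
    for v in values:            # straightforward left-to-right sum
        forward += v

    backward = 0.0
    for v in reversed(values):  # same sum, different evaluation order
        backward += v

    if not math.isclose(forward, backward, rel_tol=rel_tol):
        raise ArithmeticError(
            f"redundant sums disagree: {forward!r} vs {backward!r}")
    return forward
```

Even this toy shows the problem: the two orderings legitimately differ by rounding error, so the check can only catch gross failures, and you've paid twice the runtime for it.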
Re: Partial Failures
I was thinking about Google's insights into chip misbehaviour. You can't write your code defensively against the possibility that arithmetic has stopped working.
Likewise, as a consumer of a clock: you've just got to assume it's monotonically increasing, haven't you? (And if you do check, have you now opened up a vulnerability should we ever get a negative leap second?) That said, my timing code nearly always checks for positive durations. But its response is to throw an exception. Which just swaps one catastrophically bad thing for another.
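The defensive-timing pattern described above looks something like this, as a minimal sketch in Python (the wrapper function and its name are my own illustration; `time.monotonic` is the standard library's monotonic clock):

```python
import time

def timed(fn, *args):
    """Time a call against the monotonic clock and sanity-check the result.

    A monotonic clock should never run backwards, so a negative elapsed
    time indicates something is badly wrong with the platform. As the
    comment notes, the only response available is to throw, i.e. swap
    one catastrophic failure for another.
    """
    start = time.monotonic()
    result = fn(*args)
    elapsed = time.monotonic() - start
    if elapsed < 0:
        raise RuntimeError(f"negative duration measured: {elapsed}")
    return result, elapsed
```

Using the OS's monotonic clock (rather than wall-clock time) is what lets you mostly get away with the "just assume it increases" stance; wall clocks can and do step backwards under NTP corrections or leap-second handling.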
Ireland warned it could face 'rolling blackouts' if it doesn't address data centres' demand for electricity
Re: Spitting at security standards
I'm not sure I can reliably tell a llama from an alpaca, if I'm honest; I think they might be pet llamas. But I used to be able to see the buffalo and the llamas/alpacas in a short walk from my home. The buffalo broke through the hedge onto the towpath, but I'd rate them as more docile than the average dairy herd. They went when the farm changed hands. The llamas/alpacas are still there. The ostriches need a car journey.