RISCy
As any fule knowze, there is no menu bar. The menu is opened by clicking the middle mouse button.
No tree icon. But most trees look like this at the moment ------------------->
3279 publicly visible posts • joined 23 May 2011
Given my mum can apparently walk several miles just by sitting in a chair and knitting, I'd say it'd be less proof of steps and more proof of arm-waving. Which, to be fair, is an adequate summation of the whole cryptocon. "So how will this thing make money?" *lots of arm waving* "Then a miracle happens."
I've actually just bought a few RoombaCoins™. If you haven't heard of them, your chance of winning is proportional to the amount of dust your hoover collects. So if you see my robot vacuum running up and down the street, you know why.
(And somebody below has invested in FitBitCoin™.)
"If you want something that will stay connected for a very long time and you don't want it to drop, just arrange with the other side to send polls to one another from time to time. "
Or set SO_KEEPALIVE.
Aside: and if you think things are bad now, go back 25 years and discover why "pipelining" is an optional extension in rather too many protocols.
Kudos to those doing the hard graft necessary to find downright malicious code rather than taking the easy option of running a static analyser over the code and grabbing some headlines with the large number of results.
It's the difference between finding blood and proving it's human blood that belonged to the suspect or victim.
"Do they last for the "life" of the machine, until its next reboot..."
I don't know how Australian law works, but patents are typically granted for 20 years, subject to renewal. It's copyright that depends on the lifetime of the mouse author.
"it may be perfectly reasonable to exclude those inventions that have not been devised by a human inventor."
In which case, legislators need to change the law. The judge hasn't made a decision on the basis of what he thinks is beneficial for society. That's not his job; that's what we elect people for. His role is to determine the law. And he has said Australian law doesn't prohibit machine-generated patents. Maybe he has erred. But if not, then it's not his fault that's allowed.
"What are Ofcom going to do about a cloud of satellites launched elsewhere"
They'll apply the usual regulatory sanctions. After all, these are economic ventures aimed at turning a profit. They want to be able to operate in the UK.
And if the operator doesn't have a UK presence, then there's the ITU. Even if they haven't got their act together over satellites, I'm sure there are mechanisms via which they can be sanctioned - no nation wants to end up with international telecommunications curtailed because of a bunch of pirates located in their territory.
Also governments can talk to governments. Beaming random interference over a nation is a hostile act. I imagine the US government/FCC would lean on its operators to shut down transmissions over the UK. For other governments, it will depend on how much they want to up the temperature. But if they're prepared to do that to us, then we, or our allies, could do the same back. It benefits everyone to have some order. The only way I can see some fly-by-night operator getting away with flagrant violations of reasonable licensing restrictions would be if it was a John-McAfee wannabe operating a handful of satellites out of a tropical island.
"It's also a display of Keynesian economics at its finest..."
?!!
You might need to look up what Keynesian economics is. Because it's not what you say it is. Although I guess it's getting it wrong with a touch more class than saying "socialist economics" or "Marxist economics".
That wouldn't work on Windows. "*" and "?" are not legal in a path name.
EDIT: nor is a newline permitted. And altering MAX_PATH requires you to hack the registry or group policy.
"the Surface Neo is still officially positioned as a work in progress, with no word yet on whether Intel's move will change anything on that front."
Have you got this the wrong way around? Has the Neo been cancelled, behind the scenes, and that's why Intel has pulled the plug on the chip?
Bonus points: Intel publicly cancelling the chip before Microsoft publicly cancels the Neo means Microsoft can use the public announcement of the cancellation of the chip as the public-facing reason why the Neo has been cancelled even though, privately, the only reason the chip has been cancelled is because Microsoft have pulled their contract without making this private knowledge public and as now, both privately and publicly, the chip has no buyers, Intel have announced the cancellation publicly, allowing Microsoft to tell the public the Neo has been cancelled and use their behind-the-scenes private cancellation of the Neo as the justification for its public cancellation. Simples.
And if I managed all those "publiclies" without a single "pubicly" I've done well.
Sooner or later you run out of people to teach you and have to figure it out yourself. Being taught is nothing more than a leg up. And people who are taught don't always understand why it should be done like that. If you've tried it, you know, and know when you can cheat. And I certainly don't think people who are taught produce better code. These days they often produce worse code because they haven't grown up with the machines and don't understand that a CPU is actually going to have to execute what's been written.
I can sympathise with finding old code brilliant and atrocious, or an amalgam of both; the impossible made possible before your eyes. Let's face it: for most of us, the code is a prototype that should be thrown away and rewritten (and then thrown away and rewritten again because you succumbed to second-system syndrome). But that's just not possible. And the rest of the time the code is a quick bodge on a prototype that should have been discarded.
Within those limits, it's often pretty good. The bad habits acquired on 8 bit micros are rarely visible. It's well factorised. It's almost like looking at a codebase written by adults. And there are inspired flashes I think I'd struggle to match. No wait, I've just had an idea...
?!
The OP's point was "Maybe that’s where Vulcan went?" But Vulcan couldn't go somewhere because, as you say, it never existed and could never have existed.
Theia has a reasonable chance of having been real; the best competing theory has the Earth-Moon system formed from a head-on collision between two even bigger planetary bodies.
But thanks for reminding me of Vulcan. Next time we have an argument about dark matter I will bring it up because it's almost an exact parallel: an anomaly that is most simply explained by an unobserved mass which turns out actually to be an error in our theory of gravitation.
The planet you're looking for is normally called Theia. Spoiler: it was spiralling away when it had a little prang and didn't get ejected intact (or possibly, at all).
Yeah, I'd've said D was O(two decades) old. That makes it the generation before Go (2009), Rust (2010), Kotlin (2011), and Swift (2014).
It was up and coming for a while when C++ development had stalled. I played around with it and it looked like a genuine contender. Then C++11 unblocked the pipes and D never offered enough to justify the jump. And at this point, I can't see it gaining mindshare. I'm sure aficionados will keep it alive; but it's not something I'd trust a codebase to.
EDIT: Systems languages get taken up when they offer new "safeties". C++ offered type safety (and cleanup safety via destructors/RAII). AIUI Rust offers memory safety and race safety. D doesn't offer any new safeties.
Dark matter is a beautifully simple explanation. I worked on it as a student. It's missing one crucial thing: the bloody particle(s). And since GR isn't renormalizable, you start to think maybe what's missing is our understanding of gravity at scale. With the right tweaks maybe we can solve two problems with one theory. And we have form on getting gravity wrong.
Which is not to say the dark sector couldn't be out there. But for many years now, I've felt dark matter resembles epicycles or the new aether, and in a hundred years' time students will look back and think "why did people ever believe that crap?" and think we're all idiots. Clearly, we're not idiots. But maybe we should be looking harder at alternatives. They aren't perfect, but that's because they are mathematically challenging and under-researched. We've gone for the easier solution; the universe may have other ideas.
(For readers unfamiliar with the details, Sabine Hossenfelder has a recent blog which runs through some of the arguments and some of the problems.)
Company directors have a legal duty under the Companies Act 2006 (“the Act”) not to file false information on Companies House. Knowingly or recklessly delivering information or making a statement to the Registrar of Companies that is misleading, false or deceptive is a criminal offence under s.1112 of the Act and can lead to imprisonment and/or a fine.
Although the article goes on to say that "Inadvertently filing inaccurate information is unlikely to breach s.1112 of the Act." I imagine forgetting to update your filings would be the same - especially in the panicked response to a pandemic.
The most relevant point is the computer has only been sat in space for one decade, not three. And as it's not been turned on, there's no chance of damaging current surges or voltage spikes; it's only at risk of cosmic rays having carved a few bonus tracks. But being 1980s design (aka built like a brick shit house), the parts are going to be pretty robust against that.
X-fingers
The typical cosmic ray is a proton: 90% are. Whereas alpha particles (helium nuclei) only account for a miserly 9% of cosmic rays. [It did surprise me that they outnumber electrons.]
X-rays and gamma rays flooding the system aren't included in the cosmic ray tally. They're just part of the general shit that is "space". But I'd hazard the most damaging X-rays are those caused by charged particles going splat onto a transistor (bremsstrahlung).