Re: Here's the detail
If Apple had any sense they'd use some of their money to lobby for patent reform. But one suspects that, under the current system, they gain more than they lose.
>Try "chocolate" and you'll similarly see the decay down to 2012.
Or try "google". (I did worry it might break the internet - but it seems okay.) Apparently Google peaked in 2005 and dropped off so that by 2012 we weren't talking about them at all. I wonder who they got bought out by?
"The real question would be how well the media had survived."
My experience is that 5¼" are probably still readable but good luck with a 3½".
Partly it's because 3½" were mass produced commodities whereas 5¼" date from a time when the price of a disk meant it could be made to a decent standard. They're also well spaced out - so there's no interference. (And the older stuff is single sided.) And because the tech was rudimentary it was written using a fridge magnet (in terms of strength of signal and signal area) in order to accommodate the varied tolerances of drives.
I imagine all that applies even more to 8". If it's been well looked after, there's probably recoverable data.
With respect to More or Less, they'll conduct studies looking for Covid-19 antibodies and use that to estimate how many people have been exposed. (Apparently such programmes are already under way in China.) Once the infection has damped down, you can use that to get a good estimate of the death rate. That covers (b), and (b) is probably the predominant source of error. And (a) is neutered by looking at the infection as a whole.
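Back-of-an-envelope, that estimate works like this (every number below is invented for illustration, not real survey data):

```python
# Illustrative arithmetic only - all figures here are invented.
population = 10_000_000
seroprevalence = 0.05    # fraction testing positive for antibodies in a survey
deaths = 2_500           # deaths attributed to the disease in that population

estimated_infections = population * seroprevalence
infection_fatality_rate = deaths / estimated_infections

print(f"estimated infections: {estimated_infections:,.0f}")
print(f"estimated death rate: {infection_fatality_rate:.2%}")
```

The antibody survey fixes the denominator - total infections rather than just confirmed cases - which is why it tackles (b).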
The UK's death rate is certainly inflated because the NHS have only been testing people who have come back from an affected region or have come into contact with an infected person. (At one point they weren't even testing people admitted to hospital with pneumonia.) Although, without some limit, everybody with a cough would want a test.
"...going this way closes the door on people without the new hipster skills..."
How's the recruitment for COBOL programmers going?
I've not programmed Go. But it looks a fairly straightforward language to pick up. Programming is programming. I agree with the rest of your point, though.
I haven't got time to look it up. But if the rules were opaque or missing, then the Tribunal may have had to "clarify" the law to cover the case. In that situation, I would expect them to be let off as a matter of natural justice, provided they weren't taking the piss; there would be no way they could know they were in the wrong till a court adjudicated. And I would expect anybody who'd behaved in a similar manner to be let off prior to the precedent.
DISCLAIMER: IANAL, but I have beaten HMRC at a Tribunal. :)
And the consensus in the comments was it was inconceivable anyone would conceal their password alongside their fishing tackle.
But this case shows exactly why downloading it onto unhackable media is not such a dumb idea. Although, personally, I would tattoo it onto a part of my body that is never normally visible in public - ideally a part that is covered in thick, curly, dark hair so even a strip search wouldn't reveal it.
Have you seen the @PresVillain Twitter feed?
I haven't read the legislation. But a quick google suggests the lawyers are right:
The fees ban applies to all new tenancies entered into on or after 1 June [2019]. If you signed a tenancy before this which included agreements to pay further fees – for example, check-out fees or tenancy renewal fees – you will still have to pay these up until 31 May 2020.
But from 1 June 2020, any term in a tenancy which requires you to pay fees will no longer be binding, so you won't need to pay them regardless of what your agreement says or when you signed it.
Expansion? What expansion? An infinite universe is already infinite and therefore, by definition, cannot expand. So your argument is clearly invalid!
Yes, if you apply modern astrophysics to an idea that was already old when Herr Olbers penned it in 1823, then you will find a lot wrong with it. For example, three quarters of all stars are M-dwarfs, which aren't as bright as the sun, so even in an infinite, non-expanding universe, it's unlikely the sky would be as bright as the sun. And the interstellar medium famously reddens light, even without expansion.
I don't think anybody wants to revisit the horrors of 286 protected mode. But the four privilege levels are still present, even in 64-bit chips, and could be used.
In reality, however, it's hard to grade security that way. (Being "a little bit kernel" is like being "a little bit pregnant".) A capability-based model is a far better bet - a driver runs in user space but with the permissions it needs to do special things. Hardware support for that comes in the form of the I/O port bitmap, which allows userspace processes to be granted access to specific ports. (See ioperm(2).) But ports are only part of the story, and giving a process access to ports may grant it more power than it needs.
"Plus it wouldn't stop flaws in the kernel itself being used to escalate privileges."
It does reduce the attack surface, though.
"Plus how do you deal with latency-sensitive stuff without context thrashing?"
It depends how latency sensitive it is. VMs seem to manage. But maybe some drivers will have to be partially or wholly in the kernel. That's still a gain if most drivers are user space.
If a https download page links to a http mp3, it will be blocked. Other than that, everything is fine.
So a https download page must link to a https mp3. But a http download page can link to whatever it likes. All that's required is that the encryption of the download is at least as secure as that of the page it's linked from.
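That rule fits in a one-liner. Here's a sketch - the ranking table and the function name are my own illustration, not any browser's actual API:

```python
# Sketch of the mixed-content rule above: a download is blocked only when
# the linking page is more secure than the resource it links to.
SECURITY_RANK = {"http": 0, "https": 1}

def download_allowed(page_scheme: str, resource_scheme: str) -> bool:
    """Allowed iff the resource is at least as secure as the page."""
    return SECURITY_RANK[resource_scheme] >= SECURITY_RANK[page_scheme]

print(download_allowed("https", "http"))   # the blocked case
print(download_allowed("http", "https"))
```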
But even by his own figures, you'll be seeing one every three minutes. You don't have to have been a pro to have done deep field exposures for longer than that. And the ones that aren't shining can be as damaging as the ones that are - if they occult the star you're measuring.
But the numbers are astonishing. 50,000 satellites would mean 25,000 in your hemisphere. Imagine a patch of sky 2 moons x 2 moons. Every patch would permanently have one satellite (as one leaves another enters). That assumes they're evenly distributed - which they won't be.
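A quick back-of-envelope check of that claim, under the same even-distribution assumption (which, as noted, won't hold):

```python
import math

# Rough check: how many satellites sit in each 2-moon x 2-moon patch of sky,
# assuming 50,000 satellites spread evenly and half visible from your hemisphere.
satellites_total = 50_000
satellites_overhead = satellites_total / 2

whole_sky_sq_deg = 4 * math.pi * (180 / math.pi) ** 2   # ~41,253 square degrees
hemisphere_sq_deg = whole_sky_sq_deg / 2

moon_diameter_deg = 0.5
patch_sq_deg = (2 * moon_diameter_deg) ** 2             # 2 moons x 2 moons = 1 sq deg

per_patch = satellites_overhead / (hemisphere_sq_deg / patch_sq_deg)
print(f"{per_patch:.2f} satellites per patch")          # a little over 1
```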
"The chances of generating 2 identical hashes whilst manipulating the contents of a file - which would have to be the same length anyway(?) - are so unlikely that I'm unsure why this is being treated as something that needs fixing."
Why would the length, the commit message, or any of the other factors have to be identical? Who's going to notice that a five year old commit is slightly longer or has a slightly different message? The biggest problem is not making it disrupt any of the commits that are sitting on top of it.
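For context, git doesn't hash the file contents alone - it hashes a small header plus the payload, so the length does go into the digest; but nothing about a collision forces the forged object's length to match the original's, only its final hash. A sketch of the scheme (this mirrors git's documented object format; the function name is mine):

```python
import hashlib

def git_object_hash(obj_type: bytes, payload: bytes) -> str:
    """SHA-1 over "<type> <size>\\0<payload>" - how git names its objects."""
    header = obj_type + b" " + str(len(payload)).encode() + b"\0"
    return hashlib.sha1(header + payload).hexdigest()

# The well-known hash git assigns an empty blob:
print(git_object_hash(b"blob", b""))  # e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
```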
Variety of devices and channels is more important than the brute factor count. Username+password, SMS, biometrics and an auth app can all be run through one device and, if that device is compromised, then it's game over - even if you use every last factor. But if I'm logging in from a laptop or a desktop, and the 2FA comes in via my phone, then you have to compromise my mobile phone as well as my computer. That's squared the difficulty.
Likewise, it's not enough for a hacker to man-in-the-middle the bank; they've got to intercept the SMS as well.
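As a toy model of why splitting factors across devices "squares the difficulty" (both probabilities are invented, and real compromises aren't perfectly independent):

```python
# Toy model - the probabilities are invented for illustration.
p_computer = 0.01   # chance an attacker has compromised your computer
p_phone = 0.01      # chance an attacker has compromised your phone

# All factors on one device: compromising that device defeats everything.
p_break_single_device = p_computer

# Factors split across independent devices: both must fall.
p_break_two_devices = p_computer * p_phone

print(p_break_single_device, p_break_two_devices)
```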
"A 2FA app installed on a phone is secure."
Possibly. If it's well written. More likely it's just vulnerable to a different set of vectors. It's closed source, so we have no clue about the internals or the protocol. Worst case, we could attempt to read the code via Spectre, Rowhammer or some other side channel.
I reckon the independent token generators are probably the best. My bank even issued one and I was using it to log in. But they kept nagging me to set up a phone; and doing 2FA authentication via SMS means I can do mobile banking in an emergency - which wouldn't be the case if I was without my card and my token-generating card reader. (And, okay, I admit, some laziness came into play: insert card into reader, enter pin, select function, enter challenge and type response into website. All to protect my negative money.)
How accurate are the position reports? If they're using wifi it might be pretty granular. Even GPS can end up being rather inaccurate. So the phones in the cart could appear spread out to Google. And it might not be unusual for a street full of cars to have several reporting they're in the same spot.
So maybe next time we'll need two or three separate carts to fool Google.
BTW did anybody else notice the map showing he was next to the office of "Google Berlin"? That's real class.
But OpenBSD has a robust code base, an experienced team of developers, and the infrastructure for testing fixes and distributing them to users. And it's funded by a Foundation which these days seems to be reasonably good at securing donations.
None of that would be in place for a Win 7 source dump. It would suddenly exist and be ripe for exploitation.
"Still, that's a language compatibility issue rather than API so it doesn't invalidate my point."
Is being forced to rewrite your app in a new language not enough?
You can actually do it from C++. But those square brackets are indicative of message passing using named parameters. So every API call has now become objc_messageSend()
after first finding the object, then looking up the method and preregistering some atoms. And have fun constructing objects that the OS will use to interface with you. (And the OS does expect to be able to make method calls into your app as a matter of course.)
I've not looked at this in detail so I have no idea how compatible individual APIs are or which ones have gone missing. But that's the point - every C API call has now got to be reevaluated to see if it has an equivalent Objective-C one with identical semantics. If not - it's rewrite time. The UI APIs were very different, partly because the C interface came from the old Macs whereas the Objective-C ones came from NeXT, and partly because different languages naturally have different idioms with different conventions.
Meanwhile, 25 year old, 32-bit code that ran on Win95 continues to chug along fine on the latest iteration of Win10.
Cocoa is Objective-C. So instead of FILE* file = fopen(path, "r");
you do
NSFileHandle* fileHandle = [NSFileHandle fileHandleForReadingAtPath:path];
and reading data from the file has to be done with [fileHandle readDataOfLength:bytes]
rather than fread()
So the entire IO section needs to be rewritten. (And given the age of the app they presumably have a bespoke file format that users would like to maintain backwards compatibility with.) With a bit of care something can be done. But it's a helluva lot of work.
And it doesn't end there. It's every single fucking API call everywhere. Even basic strings are manipulated with that square bracket notation. And that's before you get to the GUI - which is a large chunk of any modern app.
Presumably the tax logic itself could be factored into a shared library. Other than that, it's starting again: writing a new app in a new language.
"Now I must say that I would really love to know what APIs _accounting_ software is using that are hard to update."
I may be wrong, but haven't they ditched Carbon? Previously, any app using the Carbon API would probably run. But now they've got to be rewritten to Cocoa. As I say, I may be wrong. I'm not doing any Mac work, these days.
The code might have been pretty good when it was new. It's just it was new 25 years ago. And machines were rather more limited back then - getting things to work at all could be an achievement. (And unit tests were for pussies. ;-)
I was looking at some of my old code the other day and wondering "Why did I write it like that? Or like that?" And then it dawned on me: the APIs I'd use now didn't exist back then. In retrospect, what it did was pretty impressive. But the limits of technology were stretched and trying to disentangle it is a pain. And newer code using newer APIs has been added alongside it, without anybody updating the old code, so there are now multiple systems.
TL;DR it's probably bloody awful and needs a scratch rewrite.
"Or think of objects glued to a balloon. Inflate the balloon, and the objects move further apart."
I tried this once. What happened was, as the balloon expanded, the contact patch between the glue and the object also expanded, until, eventually, there wasn't enough adhesive in contact with the object to hold it in place, at which point it fell off. The object in question was a steak knife. I now have a 15mm hole in my foot.
Therefore I deduce that if the universe keeps expanding, we will all fall off and land in either heaven or hell, depending on which side of the universe we are on.
I saw the answer to this in a paper the other day and I realised my understanding of it wasn't as secure as I thought. Unfortunately I can't quite remember what the paper said or which paper it was. But it boils down to the "force" of metric expansion being much smaller than the forces holding you together - or even the "force" holding objects in orbit. So the expansion pressure on, say, an orbital electron is dwarfed by the attractive force of the nucleus and it doesn't even enter the calculations. Space "pushes" the electron out; the "Coulomb force" "pulls" it back.
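A rough order-of-magnitude comparison makes the point. I'm using H0²·r as a crude stand-in for the expansion "acceleration" on a bound electron (the proper term involves ä/a, of similar size), with textbook constants:

```python
# Order-of-magnitude sketch: expansion "pull" vs Coulomb attraction on an
# electron at the Bohr radius. H0**2 * r is a crude stand-in for the
# metric-expansion acceleration term.
H0 = 2.2e-18          # Hubble constant, 1/s (~68 km/s/Mpc)
r = 5.3e-11           # Bohr radius, m
k = 8.99e9            # Coulomb constant, N m^2 / C^2
e = 1.602e-19         # elementary charge, C
m_e = 9.11e-31        # electron mass, kg

a_expansion = H0**2 * r                 # ~1e-46 m/s^2
a_coulomb = k * e**2 / (m_e * r**2)     # ~1e23 m/s^2

print(f"expansion/Coulomb ratio: {a_expansion / a_coulomb:.1e}")
```

The ratio comes out dozens of orders of magnitude below one, which is why the expansion term never enters the calculation.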
"The tidal-locking also suggests that the planet may not still have a spinning core."
The opposite's the case. If you're phase locked, then your core gets tidally heated. So there should be enough energy to keep the dynamo ticking over. That's why Mercury (3:2 resonance) has a magnetic field. It's likely TOI-700 d does as well.
It's the atmospheric effects that will be the killer.