I thought we all knew Spectre-based attacks were real pretty much as soon as El Reg published their story, all that time ago. Didn't we?
Google on Friday released proof-of-concept code for conducting a Spectre-based attack against its Chrome browser to show how web developers can take steps to mitigate browser-based side-channel attacks. The code, posted to GitHub, demonstrates how an attacker can pull data from device memory at a speed of 1kB/s when running on …
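Google's actual PoC is JavaScript and relies on real speculative execution; as a purely illustrative sketch (not Google's code), the core decision rule of a cache-timing channel can be simulated in a few lines of Python, with made-up latency numbers standing in for real cache hit/miss timings:

```python
import random

random.seed(1)

SECRET = 0x2A  # cache-line index the "victim" touched; the attacker cannot read it directly

def probe_latency(index):
    """Simulated access time: the line the victim touched is cached (fast),
    everything else is uncached (slow), plus some measurement noise."""
    base = 40 if index == SECRET else 200  # illustrative cycle counts
    return base + random.gauss(0.0, 10.0)

def recover_byte():
    """Time a probe of all 256 candidate lines and pick the fastest one --
    the classic Flush+Reload decision rule used to read out a leaked byte."""
    timings = [probe_latency(i) for i in range(256)]
    return min(range(256), key=timings.__getitem__)

print(hex(recover_byte()))  # -> 0x2a
```

The hard part of a real Spectre exploit is getting the victim to speculatively touch a secret-dependent cache line in the first place; the readout step above is the comparatively easy half.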
IMHO this is what happened: we reported upcoming OS patches to work around design flaws in today's processors -- Intel's Meltdown being the worst as it was easy to exploit.
Everyone went wild. Markets, media, analysts, vendors. Things were patched before they were exploited. It reminded me of Y2K. In the end, very little went wrong because of all the work beforehand, leading some to say it was a load of hype. I see the same for Meltdown+Spectre.
The obvious Meltdown and Spectre flaws were addressed early on. But as we wrote in early 2018, Spectre will continue to haunt the computer industry for a decade or more as the family of bugs is quite large. Google's pointed out that there's still work to be done on the web front-end side, and so released this PoC exploit to make web devs wise up.
There are exploits for Spectre out there but they tend to be in expensive toolkits (Immunity Inc's Canvas IIRC). Now here's one for free.
I suppose it comes down to how prevalent Spectre-based exploits are in the wild. Though I'm far from convinced that giving a working web-based one away for free to anyone who wants it is actually going to be of material benefit to the rest of us. Probably the complete opposite in fact. Isn't it like dropping a zero day on to the entire Web?
> Isn't it like dropping a zero day on to the entire Web?
It definitely is: it's making a complicated, grown-up exploit script-kiddie-friendly, while the potential victims have no means of protecting themselves.
Don't they? Oh wait, I forgot, apparently they'd be safe if they only use Google programs!... That's textbook blackmail. "Nice computer you have there. A pity if something happened to it, so come to uncle Google and get out of those panties..."
I think a lot of people believe making the timers coarser was enough of a mitigation, so this neat way of controlling the cache evictions is a welcome reality check that it's not. As long as the side-channel signal is there it will be statistically measurable, and stealing small secrets like passwords and keys will be practical even with an extremely low-bandwidth leak.
(The recent Aspelmeyer Group measurement of the smallest gravitational force so far between two tiny gold spheres and their expectations for huge further improvements is another cool case study in this phenomenon).
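The statistical point above can be shown with a toy model, using assumed numbers (a leak twenty times smaller than the per-measurement noise): a single observation is barely better than a coin flip, but averaging thousands of them makes the secret bit near-certain.

```python
import random
import statistics

random.seed(7)

LEAK = 0.05   # timing difference caused by a single secret bit (arbitrary units)
NOISE = 1.0   # per-measurement noise, 20x larger than the signal

def measure(bit):
    """One noisy observation of the side channel."""
    return (LEAK if bit else 0.0) + random.gauss(0.0, NOISE)

def guess_bit(bit, samples):
    """Average `samples` observations and threshold at half the leak size."""
    mean = statistics.fmean(measure(bit) for _ in range(samples))
    return 1 if mean > LEAK / 2 else 0

def accuracy(samples, trials=200):
    bits = [random.randrange(2) for _ in range(trials)]
    return sum(guess_bit(b, samples) == b for b in bits) / trials

print(accuracy(1))       # barely better than a coin flip
print(accuracy(10_000))  # near-certain
```

Averaging shrinks the noise by the square root of the sample count, so any nonzero signal eventually wins; the only thing mitigations like coarse timers buy is a lower leak bandwidth, not safety.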
Yeah, it's going to be a while before Web 3.0 (or is it 4.0?), where there's no client code, just CSS. Maybe doable once bandwidth increases such that round-trip times are quick enough, and browsers can standardise, as signed code in the application, the little page interactions that scripts cover but which always seem to need to be supplied by the site server.
Most web developers would not understand this or even care; they're too busy delivering.
Google lives in a corporate world with change control, where one person who understands this is important. The 300,000 web developers in the world don't give a shit as long as they get paid.
Google needs to target the people paying web developers to ask questions and give them tools to test code.
> web developers in the world don't give a shit as long as they get paid
Came here to say exactly that. Nobody will ever even try to fix that, because there is nothing in it for them, not to mention it would mean changing each and every program anybody uses.
While the novelty-hungry fashion victims out there obviously won't see the problem with that, it's not them who do the real work. Real working people usually have dozens of old-yet-vital programs, most of which don't and won't have updates, or they simply can't afford to upgrade everything in those remarkably prosperous times.
To the armchair commentators: mitigation did not work. It is a big dangerous hole, and clever foreign powers will be looking for, or already have, several weakest links. Maybe it keeps schoolkiddies from playing. Don't forget about the Management Engine, or the CPUs in peripherals that can be deployed. There are thousands of DRIVERS out there, unchanged, or unmitigated as we like to say, pre flaw discovery. Fixes for older and not-so-old motherboards did NOT come either. I expect quite a few biggies (security CVEs scored 9+) to come out over the next few years.
This post has been deleted by a moderator
Linux xxxxxxxxxxx.xxxxxxxxxxxxxx.xxx 5.10.20-200.fc33.x86_64 #1 SMP Thu Mar 4 13:18:27 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux
[Fedora 33 x86_64]
Based on the results submitted the attacks appear to only work on Windows. All the Linux results reported "No, did not work". Same here.
Stop using Windows.
Can you read as well as code? Let me answer that for you, no.
From the linked article.
"We've confirmed that this proof-of-concept, or its variants, function across a variety of operating systems, processor architectures, and hardware generations"
And even if you were too busy mashing the keyboard to prove Linux is the greatest thing since the discovery of fire making, you could have at least bothered to read The Reg's article properly. Maybe the bit about it working on Apple's ARM processor might have given you a clue? No?
So go back, read, comprehend.
> "We've confirmed that this proof-of-concept, or its variants, function across a variety of operating systems, processor architectures, and hardware generations"
I am very happy they confirmed. Really.
However, the test results submitted to the vulnerability POC website do NOT confirm the above confirmation. They are, in fact, pretty clear: Linux isn't vulnerable, Windows is, and a very small percentage of other operating systems where the users had no clue what they were doing and didn't understand a single thing. Also known as Mac users.
Why don't you check the facts instead of believing everything you read at ElReg.
Now go back and hyperventilate about Spectre. Makes you sound really knowledgeable.
> Or did I misread and you’ve tried every release of every distro in a comprehensive combination of architectures?
Yes, you misread. That's the best-case scenario. Worst-case, you can't read and/or can't understand English.
I can't fix that. Only you can.
>AMD chips, for example, are more resistant.
Yes, possibly.* But this assumes some CPU-choice skew among Linux users. IOW, Windows users would have to clearly prefer the vulnerable chips while Linux users preferred the less vulnerable ones.
I don't know if that's the CPU choice distribution in real life. As a single-point, statistically irrelevant example, I'm typing this on an Intel chip, not AMD. Supposedly I should be toast.
[*] AMD's claim about being less vulnerable to Spectre compared to Intel is just a claim.
You're really bad at reading comprehension. The exploit was developed on Linux. And the page specifically says it's not designed to test whether your machine is vulnerable:
"It was developed and optimized for Chrome 88 running on an Intel® Core™ i7-6500U processor on Linux. While it was confirmed to work on other CPUs (different vendor and/or generation), operating systems and Chromium flavors, you might have to adjust the configuration and it might work less reliably (or not at all). Note that the goal of this proof of concept is to demonstrate the feasibility of a web-based Spectre exploit. It is not a test to see if your device is vulnerable or not."
> [... ] it's not designed to test whether your machine is vulnerable
Awesome then! It's not designed to prove the Spectre vulnerability, or to test whether your machine is vulnerable or not. And it might not work reliably. Or at all.
But, please don't forget to run around screaming with your hair on fire: OMG! Spectre!
So what is it designed for then? Remote identification of Internet Drama Queens?
The Google engineers say they also developed other PoCs with different properties that they aren't releasing. One, they claim, is capable of leaking data at a rate of 8kB/s, though that accelerated pace comes at the cost of diminished stability...
'Tis an interesting cost/benefit analysis to ponder whenever the accelerated pace of those ones also deliver increasingly excessive volatility in a commanding control led environment. It is though no wonder they wouldn't be releasing those Proofs of Concept, for it's a veritable flash crash/fast cash cow of an Easter egg which everyone would be wanting to exercise and milk/bilk, and with more that just a chosen few determined to try and corner the market in order to render to them an extraordinary overwhelming exclusive positive advantage.
I thought Google weren't into the Universal/Global Arms Race Market and yet here they are with something quite AWEsome .... and more than just alien in form for conventional defence and traditional military forces and sources to be supplied with in order not to be totally disengaged and helpless against future novel activities such as ACTive Abiding Astute Zer0day Hostilities against Systems Deaf Dumb and Blind and Brain Dead to Radical Fundamental Base Change.
Now ..... should that not have been the case that Google engineers discovered, then it sure as hell is one already in a development with/for A.N.Others.
Not knowing that creates more colossal and complex catastrophic problems made impossible to resolve without such a simple action.
A new Artificial Intelligence agency is meanwhile set to develop autonomous weapons systems.
Stressing the importance of data in future conflicts, a senior Whitehall source said: “What’s certain is that the future will be about cyber, space, AI.”
An MoD spokesman said: “As threats change our Armed Forces must change and they are being redesigned to confront future threats, not re-fight old wars. The Armed Forces will be fully staffed and equipped to confront those threats. .......... https://www.telegraph.co.uk/politics/2021/03/12/80bn-equipment-revealed-defence-review-tanks-jets-drones-hovering/
How very odd then that so much is being spent/wasted on a current physical arms re-fit to re-fight old wars style. It just doesn't make common sense, although that is hardly surprising to the many who listen to all that is shared for media to report as news as opposed to entertainment.
'Tis a common human failing that, ... a complete lack of future common sense .... and a systemic vulnerability to exploit and export and enjoy by virtue of excessive enrichment.
The Today programme was interviewing someone this morning about the recent exploits of Microsoft Exchange. His suggestion towards the end of the interview was that businesses should consider whether running services on-premises now required too much security expertise for most of them and that they would be better off with cloud provision.
I'm just not sure it's sustainable to have a situation in which web application developers (who more than most suffer from "gratuitous include syndrome") are being advised to code defensively against CPU vulnerabilities and businesses are being told that computers are essentially too dangerous to be kept on site.
It's a relatively small number of people that fully understand the nature of these vulnerabilities and have the ability to investigate and mitigate them. Of the people who might have that capability, few are likely to sign up voluntarily for what is frequently an exceptionally tedious and thankless task. If the future of "secure computing" depends entirely on this small pool of experts to fight fires, we might as well all publish our passwords online now.
It's a relatively small number of people that fully understand the nature of these vulnerabilities and have the ability to investigate and mitigate them. .... Warm Braw
And methinks, Warm Braw, having more fully understood the nature of such vulnerabilities more than just a relatively small number of those folk would be successfully tempted to take a well cloaked walk on the wild, dark side and make a quiet fortune or two or three ....
After all, such is surely to be certainly expected for that would be only human of them when engaged in business for profit, you know, extra money for nothing tangible, an apparent reward for really just knowing what needs to be done .... and which ideally cannot be done by anyone else. That can prove extremely lucrative.
That would be the classical economic theory. Things work out a little differently in practice. If you make a job sufficiently financially-rewarded that people can live comfortably by working fewer hours, a proportion will do that. Some will work part-time, others will simply have shorter careers and move on. The more arduous/tedious the task, the higher the proportion.
Hence many (if not most, now?) GPs work less than a full week; headteachers who have older and more generous pensions are retiring early. Giving people financial incentives can have perverse effects (at least as far as the source of the finance is concerned).
Talking of traditional and conventional home-grown on-premises security expertise v. any novel and relatively alien cloud provision technologies, ..... of which there would appear to be an expanding number quite bereft of either effective covetous competition or zealous opposition, ..... raises the spectre and introduces the opportunity for an old familiar frenemy to exercise its attractions and demonstrate, or not be able to demonstrate as the case can be, its capturing of technology and hearts and minds abilities, with the following few words being indicative of the current, present running situation ??? ‽ ‽ ! !
While most people were dazzled by the bounty of China’s economic boom, Chen [Qiufan] was ambivalent. In his first short story, “The Bait,” which he wrote as a precocious high schooler, aliens arrive on Earth, give humans an invaluable new technology, and eventually enslave them with it. ...... https://www.wired.com/story/science-fiction-writer-china-chen-qiufan/
:-) I wonder if the new frenemy is better at tending to order than the old technology with the award of rewards that fences pretty printed paper counterfeit goods/accepts and supplies fiat currency as a simple meme and very convenient means for global population control with massive remote command?
:-) However, should it/they/new frenemies even simply allow it to continue virtually unscathed and practically unattacked by a hostile force with their gracious acceptance of perfectly timed, and extremely excessive and generous judiciously gratuitous investments, would it be a classy win win and classic temporal resolution for an Earthly problem ....... and as I'm sure some may point out, not at all unknown or not to be fully expected, copying/mimicking/counterfeiting as it does ye olde business model of Danegeld.
Those who would disagree would cite, methinks, and recite https://en.wikipedia.org/wiki/Dane-geld_(poem) and in so doing invite and suffer the slings and arrows of outrageous misfortune, Doom and Gloom and FUD.
It is time to get out of IT.
Let the younger guys run the show while the older guys ride off into the sunset, off to do sheep/alpaca/goat/chicken/ostrich farming.
After all, there is less stress that way, no Bossly Units who tend to scream at you and no pesky Exchange servers to keep on patching.
Aye, I'll go along with your sentiment.
I retired six years ago now and I must say that I don't miss all the pressure, late night sessions on conference calls and being on call every other week.
It was good whilst it lasted and I loved the job but now that I am away from all of that I realise that there are other things in life. I'm not sure about the ostrich farming but being my own boss does have its compensations even if my income is a lot lower now than it was and I am away from the baleful influence of IR35.
Perhaps I am naive here, but WTF do you need microsecond resolution for on a web page?
Having milliseconds with random dither (so it really is +/-0.5ms variability) might not stop it, but it sure seems a means to make it too much hard work to be practical?
Genuine question here.
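For what it's worth, a toy Python model (illustrative numbers only) shows why coarsening plus dither alone doesn't stop a determined attacker: repeat the secret-dependent operation enough times back-to-back and the aggregate duration climbs back above the clock's resolution, while the dither averages away.

```python
import random

random.seed(3)

TICK = 1000.0  # clock granularity in microseconds, i.e. 1 ms resolution

def coarse_now(t_us):
    """A 1 ms clock with +/-0.5 ms uniform dither, as proposed above."""
    dithered = t_us + random.uniform(-TICK / 2, TICK / 2)
    return round(dithered / TICK) * TICK

def timed(op_us, repeats):
    """Time `repeats` back-to-back runs of an operation taking `op_us`
    microseconds with the coarse clock, then divide to get per-run duration."""
    start = coarse_now(0.0)
    end = coarse_now(op_us * repeats)
    return (end - start) / repeats

fast, slow = 1.0, 1.2  # a 0.2 us difference, far below the 1 ms resolution
print(timed(fast, 1), timed(slow, 1))                  # lost in the noise
print(timed(fast, 1_000_000), timed(slow, 1_000_000))  # clearly separated
```

The per-measurement error stays bounded by about one tick, so dividing by a million repeats shrinks it to fractions of a microsecond. That's why the coarse-timer mitigations slow these attacks down but can't kill them.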
> WTF do you need microsecond resolution for on a web page?
To be part of the cool kids. Somebody, many years ago, decided that the browser would be the new OS, and since everybody has hopped on the bandwagon, just "because we can".
And to be able to shoehorn a full desktop application inside a browser, you need to give it all the handles and rights a local program would have. Except the local program is installed and controlled by you, while the remote one is installed and controlled by someone you don't know, don't trust, and who most likely hasn't your well-being in mind.