Re: Customer pushback
I often repaired cables, headphones etc. when I was a broke teenager, but the repair was rarely as robust as the original. So now that I have the money, I usually just "buy a new one".
21 publicly visible posts • joined 24 Jul 2009
And this is why, when I'm checking whether untrusted installers are signed, I check to see if they've been signed with the same certificate as previous, trusted installers.
It would be nice if the OS could pin certificates automatically, and highlight to users/admins if the signing certificate has changed from the previously-installed version.
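The pinning idea above can be sketched in a few lines. This is a minimal trust-on-first-use sketch, not a real implementation: the pin-store filename and the step of extracting the signer certificate's DER bytes are assumptions (real code would parse the Authenticode signature out of the PE file).

```python
import hashlib
import json
from pathlib import Path

# Sketch of installer-signer pinning: record the SHA-256 thumbprint of an
# application's signing certificate on first install, then flag any later
# installer whose signer thumbprint differs. PIN_FILE is a hypothetical
# location; obtaining cert_der from the installer is left out.
PIN_FILE = Path("signer_pins.json")

def thumbprint(cert_der: bytes) -> str:
    """SHA-256 thumbprint of a certificate's DER encoding."""
    return hashlib.sha256(cert_der).hexdigest()

def check_signer(app_name: str, cert_der: bytes) -> str:
    pins = json.loads(PIN_FILE.read_text()) if PIN_FILE.exists() else {}
    seen = thumbprint(cert_der)
    if app_name not in pins:
        pins[app_name] = seen  # trust on first use: pin this signer
        PIN_FILE.write_text(json.dumps(pins))
        return "pinned"
    return "ok" if pins[app_name] == seen else "SIGNER CHANGED"
```

An OS-level version of this would warn the admin whenever `check_signer` returns "SIGNER CHANGED" for an update to already-installed software.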
Symantec issue (sell) code-signing certificates to software authors. Those authors then sign their artifacts (executables, DLLs, whatever) before release, so as to allow users/admins (and the OS) to verify that they are indeed legitimate and unmodified. Symantec do not sign artifacts themselves, nor do they perform any kind of per-release code review of their customers' pre-release software.
Symantec (and other CAs) may do some due diligence to assure themselves that the applicant is not impersonating an established code-signing entity, and is technically competent to keep the certificate safe and only sign things they intend to, but then again, they may not. Certainly, attackers have previously got their hands on legitimate signing keys and certificates and used them to sign malware: Stuxnet's drivers were signed with stolen Realtek and JMicron certificates, and Duqu 2.0 later used a Foxconn certificate.
The 32 bit ccleaner.exe in the portable version of 5.33 included the Floxif malware. If you've ever run it on a system, that system should be considered compromised. You'll have to decide between Talos' recommendation to restore the system from a pre-CCleaner-5.33 backup and Piriform's assertion that Floxif only ever profiled machines and sent the details back to the attackers' C2 host.
The 64 bit ccleaner64.exe in the portable version of 5.33 seems to be safe. If you only ever used that version, then supposedly you have no reason to be concerned.
Unless you have the resources and time to analyse every update that comes your way in a sandbox, automatic updating is still less risky than continuing to run software with known vulnerabilities. And even if you do sandbox updates, there's still a chance that vulnerabilities in your existing version will be exploited before you complete the analysis that tells you the update was indeed safe.
But, there's a logical problem - like looking for WMDs in Iraq, one cannot *prove* the absence of malicious behaviour: one more hour, day, or week of analysis might always turn up something unpleasant.
I was burnt when I bought the then-flagship Samsung Galaxy S 2: I was shocked and disappointed by slow firmware updates, firmware updates with serious bugs that went uncorrected for months, and sub-standard hardware (notably the flash memory, though others also reported the BAT500 battery leaking and destroying the WiFi chip).
So my next phone was a cheap-but-solid SIM-free Moto G paired with a SIM-only contract. I figured that even if it only lasted me a year before I needed to buy something equivalent, the two together would still be only 3/4 the price of a flagship device at most.
Google "CNNIC certificate authority"
Essentially, GCHQ sets up a CA (or surreptitiously obtains assistance from one or more established CAs) and gets its root certificate installed in (i.e. trusted by) $ALL_THE_POPULAR_CLIENTS (IE, Firefox, Outlook, Thunderbird, K9, Chrome). Then, when they want to see what you're doing on Facebook, they issue a bogus certificate for a proxy they control and poison your DNS or use NAT to ensure you go via their proxy, rather than a legitimate Facebook server. You'll get the normal SSL "yellow lock" in your browser, and everything will look fine, but they can see (and optionally modify) anything sent and received.
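Certificate pinning is the usual defence against exactly this: the proxy's bogus certificate chains to a trusted root and so passes ordinary validation, but it is still a different certificate, so its fingerprint won't match one recorded earlier. A minimal sketch, with placeholder certificate bytes standing in for real DER data:

```python
import hashlib

# An interception proxy's certificate validates against a trusted root,
# but its fingerprint differs from the genuine server certificate's,
# so a recorded pin exposes the swap.
def fingerprint(cert_der: bytes) -> str:
    """SHA-256 fingerprint of a certificate's DER encoding."""
    return hashlib.sha256(cert_der).hexdigest()

def looks_intercepted(presented_der: bytes, pinned_fingerprint: str) -> bool:
    return fingerprint(presented_der) != pinned_fingerprint

# In real use, presented_der would come from e.g.
# ssl.SSLSocket.getpeercert(binary_form=True) after the handshake.
real_cert = b"...the real server certificate DER..."   # placeholder bytes
proxy_cert = b"...the proxy's certificate DER..."      # placeholder bytes
pin = fingerprint(real_cert)
print(looks_intercepted(real_cert, pin))   # False: matches the pin
print(looks_intercepted(proxy_cert, pin))  # True: a different cert was presented
```

This is why HPKP and browser-shipped pin lists existed: chain validation alone cannot distinguish a legitimate CA from a coerced or compromised one.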
"How the hell can you charge twice as much for something you already had in stock and got for a cheaper price than it would cost at the moment"
Because if it sells today, they'll want to replenish their stock so they can sell another tomorrow. Given the price rises, that replacement stock will cost rather more than the original did. If they can't sell another tomorrow, they'll lose not only the margin they could have made, but possibly other trade as well if the customer goes elsewhere.
In October of last year I received spam to a number of semi-private mail aliases, each used in connection with only a single web site. Eventually, I determined that each of these sites had used ThinkSend (aka createsend.com aka thinksend.com) to send their legitimate opt-in marketing emails at various times during 2009. One of the organisations followed up on this and confirmed that ThinkSend had been compromised during that timeframe: http://www.campaignmonitor.com/blog/post/2852/
More recently, I have received spam targeted at an address only known by me and laterooms.com, but their investigations drew a blank on that one. Thinking about it, I wonder if any data sharing goes on between laterooms and Travelodge?!?
At 40p per minute for out-of-bundle voice calls, it takes only 12.5 minutes of calls before the charges match the £5 for the next bundle up.
That was the process I went through (rather like the one Havin_it regrets skipping) before settling on a 300 minute/month tariff rather than a 100 minute/month one. I've never used more than about 120 minutes in a month: comfortably within 300 minutes, but 20 minutes over a 100-minute bundle, at 40p/minute, would have cost £8 in overage charges, more than the £5 for the bigger bundle.
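The break-even arithmetic above can be sketched in a few lines (tariff figures taken from these comments, not from any operator's price list):

```python
# Break-even between paying per-minute overage and taking the next
# bundle up. Figures assumed from the comments above.
OVERAGE_PENCE_PER_MIN = 40
UPGRADE_COST_PENCE = 500  # extra cost of the bigger bundle

def overage_cost_pence(minutes_used: int, bundle_minutes: int) -> int:
    """Pence charged for minutes beyond the bundle allowance."""
    return max(0, minutes_used - bundle_minutes) * OVERAGE_PENCE_PER_MIN

break_even_minutes = UPGRADE_COST_PENCE / OVERAGE_PENCE_PER_MIN
print(break_even_minutes)                # 12.5 minutes over, and the
                                         # upgrade pays for itself
print(overage_cost_pence(120, 100))      # 800p (£8) for 20 minutes over
```

So a typical month of 120 minutes on the 100-minute bundle would cost £8 extra, well past the £5 break-even.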
Dolby 3D Digital Cinema (which is what the new digital cinemas seem to use, judging by the new Showcase De Lux in my city) is a digital format. I seem to recall reading that the cinema's servers retrieve the films via FTP and key material is delivered on DVD.
http://www.dolby.com/professional/solutions/cinema/digital-cinema.html makes for interesting reading.
Some Linux kernel symbols (functions, constants) are deemed protected and can only be used by kernel code that is GPLed (see http://lwn.net/Articles/205644/ , http://kerneltrap.org/node/4674 ). Code which uses these symbols is considered a derivative work and must also be distributed under the terms of the GPL. Code which doesn't use those symbols isn't considered derivative, and can be distributed under any licence. Presumably Microsoft's implementation of these drivers used some GPL-only symbols, and was not ported from some other OS either.
"Under the terms of the GPL, any module that's been combined with code licensed under GPL must be released under the GPL."
Incorrect; Microsoft could have chosen to cease distributing their code, and optionally to re-write it in a non-infringing way.
Furthermore, the FSF has no right to sue in this case as the Linux kernel copyright isn't owned by the FSF. Only copyright owners whose rights have been infringed have the option of suing.