"..for the uninitiated.."
"DNS, for the uninitiated, is the vital system that points browsers at the correct servers.."
For heaven's sake, this is El Reg, not AOL. How many readers don't know what DNS is?
Innocent websites were blocked and labelled phishers on Wednesday following an apparent conflict between OpenDNS and Google's Content Delivery Network (CDN). OpenDNS - a popular domain name lookup service* - sparked the outage by blocking access to googleapis.com, Google's treasure trove of useful scripts and apps for web …
... the effect was electric. (Borrowed)
But I do agree.
That said, I find it quite annoying when sites use the Google APIs, since enabling them for a page (FF+NoScript) means third-party sites can use them too. It's particularly annoying on pages that use JS links for no good reason.
...only yesterday a commenter here said that although he's not in the industry and not necessarily tech-literate, he does enjoy reading many of the articles and learning.
So although he could probably go off and Google "DNS", it's a nice touch for people like him, and totally harmless for people like you and me. Well, unless you feel threatened by a red-top appearing to talk down at you, that is. ;-)
To be fair to Kaspersky, it's entirely the fault of their "sandbox" security implementation, Safe Run for X, under Chrome and IE.
It does work with Firefox, so if you're using Safe Run for internet banking / Outlook, switch to Firefox for your security stuff.
It's only broken under x64, and it's documented on their site.... so it's something they advertise,
then don't provide. Classy, but it's SOP for a $100+ product.
For comparison, details from Sandboxie, who have a working x64 sandbox: http://www.sandboxie.com/index.php?ExperimentalProtection
Safe Run for Applications, the component of Kaspersky Internet Security 2012, doesn’t work with Microsoft Windows XP / Vista / 7 x64.
Safe Run for Websites, the component of Kaspersky Internet Security 2012, doesn’t work with Microsoft Windows XP x64, and works with limitations on Microsoft Windows Vista x64 and Microsoft Windows 7 x64.
"The fact the issue popped up suddenly on Wednesday would suggest that engineers at Google had been fiddling with SSL certificates"
I wondered why a site I use went offline for a few hours...
So OpenDNS's system saw the SSL certificates as potentially dodgy and took action to protect its users from sites whose SSL certificates it didn't see as authentic? I'd call that proof that it's doing what it says on the tin, and it's one of the reasons I use OpenDNS.
Perhaps. Of course, it isn't actually the job of a DNS server to decide whether the answer to your query is safe to use. If there is a problem with the certificates on the target site, it is the client's job to decide how to handle that. But if you've punted that responsibility to OpenDNS, then they are indeed doing what you ask.
Either way, if people are now migrating to the MS alternative, it looks like Google have paid the penalty regardless of whose fault it is.
>> it isn't actually the job of a DNS server to decide whether the answer to your query is safe to use <<
But for most of openDNS's users a major reason to use the service is the <b>optional, configurable</b> nuisance filters.
99% of the use of googleapis seems to be serving free and OSS libraries that could easily reside on the primary site's host, without introducing unnecessary privacy intrusion and opening the visitor to potentially dangerous third-party scripts.
Does it really make sense to download the same piece of code, time and time again, from every site you visit? If everybody loads jQuery from one or two CDNs, then the chances are it will be in the browser's cache already. (I clear my cache on exit, but it's there.)
The issue here was not having a backup for Google.
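For what it's worth, the usual belt-and-braces approach is to try the CDN first and fall back to a self-hosted copy if it never loaded. A minimal sketch of that decision, with an invented local path (in a real page you'd run this right after the CDN script tag and inject a script element for the returned path):

```javascript
// Minimal sketch of the "local fallback" pattern: prefer the CDN
// copy, but keep a self-hosted copy ready. The path is illustrative.
function fallbackSrc(env) {
  // env.jQuery is set when the CDN <script> loaded successfully;
  // when it is absent, return the path of the self-hosted copy.
  return env.jQuery ? null : '/js/jquery-1.7.1.min.js';
}

console.log(fallbackSrc({ jQuery: {} })); // CDN loaded: null
console.log(fallbackSrc({}));             // CDN blocked: '/js/jquery-1.7.1.min.js'
```

Had the affected sites done something like this, Wednesday's OpenDNS block would have cost them a timeout, not their functionality.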
Given how often jQuery versions update, and how many are in circulation, the caching benefit is somewhat negated: a user may have 1.5, 1.5.1, 1.6, 1.7 etc. cached, but not 1.7.1. Unless you always bind to the latest available version of a library, which is asking for trouble when it updates.
I prefer to have a version of the code available on my site where it can't go away. The extra few millis to load the page are less important to me than it always working. Plus, I often implement code to bundle JS scripts together to save on requests, so the speed saving is negligible.
I use NoScript, and I'm used to having to temporarily enable the domain that I'm using. But some sites are just awful and you end up going through seemingly endless domains just to load the damned content. There are some sites where virtually everything seems to rely on a remote script to load. Madness.
On NoScript: Options>General tab, you can allow top level domain temporary permission by default. This means you can open website.com and it will temporarily allow scripts from website.com for that session. It saved a bit of wear and tear on my fingertips when I found that out.
By that token, you could say that any website that relies on CSS to look good is not a website.
Your attitude is about ten years out of date.
Besides, here we're not talking about sites which ONLY work with JS. Whether you use JS to save page loading times, or to style elements, or to provide a full app experience, you will be equally hit by this problem.
Grow up, you arse. Years ago people said the same about CSS and images - time and life move on. Many, many commercial developers are driven by strict requirements and guidelines, and JS is necessary. Just because some prick like yourself decides to disable functionality does not mean the site should still work in all its glory.
For fu**s sake...
Just because we were scrabbling around fixing sites doesn't mean those sites were crippled, just that they should have been working better.
The rear-seat reading lights in my car have been out for, oh, seven years or so. Since they're there, they must be there to do a job. Since they're not functioning, the car is not working as intended. I suppose I'd better attend to that before driving it again!
Much software (including most user-facing applications for general-purpose computers) these days is loaded with features that a majority of its users never use and could not care less about. Often users are happier when such "features" are not functioning, in fact. Consider Clippy, for example. Or the recent complaints on the Reg about resizing ads, and accompanying expressions of glee from users who have script- and/or ad-blocking browsers.
Many web sites use scripting to accomplish nothing useful, or provide convenience features such as client-side form pre-validation (which is often done so poorly that it's worse than omitting it would have been) or prompting (which saves, what, a few seconds at best?). Since such sites should fall back gracefully in the event of script blocking, there shouldn't be any need to "scrabble around fixing" them.
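To illustrate the "convenience only" point: pre-validation done properly is a deliberately loose early check whose loss costs nothing, because the server repeats it anyway. A hedged sketch (the function name and regex are invented for the example):

```javascript
// Client-side pre-validation as a convenience, not a gatekeeper:
// if this script is blocked, the user just loses the early warning.
function looksLikeEmail(value) {
  // Deliberately loose: catch obvious typos, don't try to enforce
  // the full RFC address grammar on the client.
  return /^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(value);
}

console.log(looksLikeEmail('user@example.com')); // true
console.log(looksLikeEmail('not-an-address'));   // false
```

A form wired this way degrades gracefully under NoScript: submission still works, and the authoritative check happens server-side.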
For that matter, such simple functionality shouldn't be implemented with bloated, error-ridden scripting frameworks (like jQuery) written by people who can't be bothered to read the spec and throw a hissy fit when confronted with an implementation that conforms to it rather than to their preconceived notions (Resig).
Are there web apps which are fundamentally built on client-side code, and so have to have working scripts in order to do anything useful? Yes (for various values of "useful"). But if those sites are really important to someone, they shouldn't depend on third-party-hosted code, as other people have already pointed out; and if they do, for some reason, then they should already be prepared to handle that failure mode.
...my DNS is provided by my DSL provider, so I haven't noticed any sites falling over owing to their inability to load scripts from googleapis.com. Of course, I have NoScript set to Block Scripts Globally, and manually allow scripts as needed to provide any important functionality; last I checked, I had sites like googleapis and googleanalytics tagged as "untrusted" in NoScript, so I don't load them anyway.