I facepalm in your general direction, Java Script "programmers"
_SO_ bad, it's not even funny (just pathetic and sad) when I snark all over it.
icon, because, FACEPALM
Got, NoScript? NoJS?
If the web seems slow, blame third-party advertising and analytics scripts. Many internet users have already come to that conclusion but Patrick Hulce, founder of Dallas, Texas-based Eris Ventures and a former Google engineer, has assembled data that clarifies the impact of third-party scripts in the hope it prompts more …
And guess what, I get pages that load in one third the time, ads are mostly nuked without JS and all the other advantages of improved security. It's a no-brainer really.
If perchance I ever do need it then I've a large red/green toggle on my browser that turns it on/off. Being a large button, I can't accidentally forget to nuke it.
Amen to that. I choose what to view on my screen, not hucksters. Advertising does not help me in any way, ever, and it never will no matter how "nice" the slick ones pretend to play. It doesn't improve or enhance my life. From my viewpoint, advertisements are completely and utterly useless, and so I choose to never see them anywhere I have control. It's not my problem to figure out how websites get paid and I don't care - I didn't choose to litter the Internet with ads and I'd be happy without them forever, in any context. Sites that play gatekeeper games either have their flimsy security disabled on the spot or get dumped into /dev/null forever. Internet overlord wannabes: you are never going to be able to show me any ads, ever. If your data slurping site dies without ads, that's just fine with me - adios mother fucker, and take all your useless social media 'jobs' down with you.
Go to phys.org -- fantastic site full of the latest science news. They have a reasonable and sad header saying it sees you are using an adblocker, and explaining why this hurts them.
I will not unblock even for this great site, so I use it guiltily. I would be happy to pay them, but they've gone 100% ads and so I face a moral dilemma, and don't go to the site as often as I want.
I can help with that.
Ask them what liability they accept when their ad network serves you malware. (You don't actually need to do this... just run through the mental exercise.)
They will tell you that they have nothing to do with the ads that are served, it really has nothing to do with them, and accept no responsibility let alone liability.
When I go to phys.org I see "Did you know? You can become a Phys.org sponsor and enjoy all our sci-tech content without ads! Simply donate any amount and not only will you experience our site ad-free, but you will be part of the Science X community mission to promote science and technology knowledge."
The internet was generally more open and informative before ads came along.
With ads in the equation, you get more clickbait headings, salacious articles, stupid gossip, and even lies, just to hike up those ad hits. The actual informational value of content is secondary.
Having said that, sure there are some good sites run professionally that need revenue, and I do have sympathy for them - El Reg being one. (I'd actually pay for an El Reg subscription as long as subscriptions *weren't* necessary [ forced subscriptions would stagnate the variety of commentards ])
But anyway, the advertisers, and many of the sites using them, only have themselves to blame: displaying more and more ads, stuffing them with flashing images and popup windows, and generally behaving obnoxiously - that's what started off the ad-blocking movement.
Now that the ads are generally less annoying, they are instead full of tracking software that tracks far more than is necessary for fair analytics.
Of course, the reason cookies were restricted to the same domain was to avoid this sort of tracking - something the ad companies purposely get around by using a single domain for serving ads everywhere. Isn't that against America's DMCA? Deliberately circumventing security measures.
Anyway, for all their huff and puff, the browser vendors aren't helping.
One small change would make all the difference: only allow cookies to be set for the same domain as the URL of the main page. Any cookies from third-party sites (e.g. iframed or inlined content) should be ignored.
Don't give me all this "but it's faster for the user if they already have acme-whizzy-bling-thing cached" crap... If the difference is that big, then the thing is too bloated anyway. As it stands, it's a magnet for hackers, and could also be affected by remote-site downtime and accidental code 'upgrades'.
How on earth anyone thinks it's good to load third-party modules into their web pages, live from the third-party site, is beyond me...
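The same-domain rule being argued for here can be sketched in a few lines. This is a toy classifier, not any browser's actual logic; in particular, the "last two labels" heuristic is a stand-in for the Public Suffix List that real browsers consult:

```python
from urllib.parse import urlparse

def registrable_domain(host: str) -> str:
    # Crude heuristic: take the last two labels. Real browsers use the
    # Public Suffix List (so e.g. "example.co.uk" is handled properly);
    # this is good enough for illustration.
    return ".".join(host.split(".")[-2:])

def is_third_party(page_url: str, cookie_origin_url: str) -> bool:
    # A cookie is "third-party" when the resource that set it lives on a
    # different registrable domain than the page the user is viewing.
    page = registrable_domain(urlparse(page_url).hostname)
    setter = registrable_domain(urlparse(cookie_origin_url).hostname)
    return page != setter

# The policy proposed above: drop any Set-Cookie from a third-party origin.
def accept_cookie(page_url: str, cookie_origin_url: str) -> bool:
    return not is_third_party(page_url, cookie_origin_url)
```

So a cookie set by `cdn.example.com` while viewing `news.example.com` would survive, but one from `ads.tracker.net` would be dropped on the floor.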
"With ads in the equation, you get more clickbait headings, salacious articles, stupid gossip, and even lies, just to hike up those ad hits. The actual informational value of content is secondary."
I was tolerant of ads until now; you managed to persuade me to install ad blockers.
Personally, I don't mind static ads that simply sit there, and possibly provide a link. But your average adster seems to think they have an actual right to waste your screen space and processor cycles. I suspect the bitterest opponents of adblockers are malware authors.
> How on earth anyone thinks it's good to load third-party modules into their web pages, live from the third-party site, is beyond me...
Ugh! Like the sites that rely on Disqus for their comment section. I find those sites' comment sections take several minutes to load... if they load at all.
There was one page I accessed recently where I was having issues, so I downloaded the page and looked at the source. The payload (the data I was interested in and what the page purported to contain) was about 1 KB. The source for the overall page with all of its scripts (not counting referenced, non-inline "third party" items) was 220 KB. With bloat like this it is no wonder pages load slowly.
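A crude way to put a number on that kind of bloat - a hypothetical audit helper that counts only inline `<script>` blocks, not the externally referenced ones (which would make the ratio worse still):

```python
import re

def script_share(html: str) -> float:
    # Fraction of the document's bytes taken up by inline <script> blocks.
    # A real audit would also total the externally referenced scripts.
    scripts = re.findall(r"<script\b[^>]*>.*?</script>", html,
                         flags=re.DOTALL | re.IGNORECASE)
    script_bytes = sum(len(s.encode()) for s in scripts)
    return script_bytes / len(html.encode())

# A toy page: a few bytes of actual payload buried in inline script junk.
page = ("<html><body><p>1 KB of payload</p><script>"
        + "junk();" * 40
        + "</script></body></html>")
print(f"{script_share(page):.0%} of the page is inline script")
```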
If you check the scripts run on The Register's article pages you'll see the following (at least, that's what NoScript shows):
One would like to think that they should take a cue and start cleaning up their own house.
I doubt that the culprits will give a toss. More likely, this information will enlighten more people about what is going on and lead to greater use of script-blocking tools like NoScript.
I think the only thing that will really cause a change in developer behaviour will be a significant increase in people using such tools to avoid the mess that the developers have created.
I'm not even sure faster coding will help: some of the delay is bound to be down to the time it takes to auction and source the ad "in real time".
Basically, the only thing that will encourage media companies to change is if enough people use ad-blockers. The current approach hands all the power to the advertisement brokers to the detriment of the users, but also of the website owners.
And to make things worse, those scripts are usually hosted on inadequate, slow and cheap servers.
(After all, my original reason to use an ad blocker back then was to escape the 5-10 seconds delays on each page load because of "waiting for Google Analytics"...)
As for shaming the ad industry towards better practices - sure, like they even know what the term "shame" might mean... They're greedy, bloodthirsty parasites, and we're the unwilling hosts.
One of the more obvious (at least to me) reasons to block ads/scripts is not to minimize the load time, but to avoid the premiere delivery method for tasty new exploits to the masses since the ad slingers don't actually check what they are serving up.
One news site (not sure which) that I happened to stumble across a while ago did have a very nice opt-out dialog for tracking, advertising cookies etc. pop up when entering. I gave up when I passed the 200 mark and just left. In addition, quite a few of the trackers they served up could not be disabled from that site, but had to be opted out of separately on other sites. Given the volume of crap pushed at us, it's not surprising that things grind to a crawl.
Track off you bastards, I say. After trying many many things out as well as their combinations, I've found that the uMatrix add-on is enough to get a decent browsing experience. Its presets are well thought out and easy to change. I get only what I want and need and nothing else (well, one thing, I get to see the oh so very long list of stuff the website seems to want but I don't). It lets you black- and whitelist just about anything (images, scripts, iframes even). If I want to avoid Google a bit more, I'll toss Decentraleyes in.
Up until now I wouldn't have pushed for FF in particular, as uMatrix worked for Chrome as well. But now that Google has decided it's their call, not yours, what gets loaded (in other words, it's no longer possible for add-ons to inspect your traffic in Chrome), and just about everyone except Firefox uses the Chromium engine (with a custom add-on implementation being damn difficult to nearly impossible), it's just this one browser left that lets you do it.
Yes, I tend to forget that one does need to get the hang of how websites are built in order to use the u-thing to its full potential. Which is not helped by Google behaving like Ridley Scott's Alien - first hugging your face with YouTube, analytics and a fast browser, then bursting out of your chest, because without ajax.googleapis et al every other webpage can't seem to show you a damn thing (which is what bugs me most, since you can block all the other slurpers like Facebook quite easily without functional penalty).
I grew up at the time when every user had to understand how a computer works before attempting to use it. Companies like Apple and Google did great work making computers usable without understanding them - and those who don't understand make for very good slurping targets, sadly.
Not the largest I've seen, but one I visit regularly is astronomy.com
uBlock origin on Firefox ESR reports:
26 elements blocked, 4 out of 15 domains connected
Ironically 'lijit.com' doesn't appear to be!
Make its "programmers" load pages at dial-up (56k) speed. If that were a requirement, things might speed up a bit. Every time a web page loads, you get ALL sorts of stuff loaded in, taking up bandwidth as well as memory.
While I used to use NoScript, I don't now, though I might go back to it soon.
I want every web front-end developer to sit in a sealed glass room (one at a time, please). Just a computer and a 33.6k modem connection to the internet in the room. A freshly booted but logged-in computer with a web browser open. I control the oxygen supply.
The web jockey types in their URL. At the moment they hit enter, oxygen is completely evacuated and replaced with an inert gas. Oxygen is only released when their page finishes loading ... and does NOT jump around the screen to the slightest scroll or mouse movement. (So called "progressive" pages count as "not loaded" and don't release oxygen until loaded).
Extra time penalties before oxygen is restored:
Auto playing video +5m
Auto playing music or other sound +20m
Auto playing multimedia requiring flash +1 hour.
Marketing Managers that request that shit receive double the penalty and are required to sit in the room too.
All penalties are cumulative.
Anyone surviving may continue their craft, er cruft, er um craft.
I am an old skool web developer and it seems every week I have to ask one of my frisky young team "why would we do that?" or "why is doing that better than not doing it?"
They've been taught to do it and pointless bloat or 'but we always do it' is how you do it, so they do it.
I'd like to point out, as one of these "programmers", that this kind of thing is largely outside of our control. We are forced to include things like GTM (Google Tag Manager) for analytics purposes, which is then abused beyond belief to side-load all manner of nonsense, without the actual "programmers" even being aware. Some of us do actually care about load times and are even more upset about this than you are.
'for analytical purposes'
Just wondering, what does this data include - and why is it useful? Other than the referer, the number of hits and how long each visitor is on the page for, what use is it? (Not to the hyper-slurpers, obviously - they want a profile of saleable data for everyone and everything on the planet.) I'm aware that 'social standing' can be approximated from location, browser, OS, referer etc. - does it really matter? There's a phrase I'm fond of using when people have themselves in a knot trying to resolve an issue - 'don't over-analyse things' - and I suspect this is what manglement are doing with the wealth of data being slurped on their behalf. In the end, no one benefits (except Google & friends, obviously!).
>for analytical purposes
That's a very polite name for 'spyware'!
We all know that the reason for all this scripting is to track the user; the data is then used to tune the advert ecosystem. The problem up to now has been that browsers don't directly identify users (though I'm starting to have my suspicions about the new Chrome engines), so programmers have to use algorithmic workarounds to simulate the effect of accurate tracking (sometimes with amusing results).
The fix for the average JS programmer is to learn a real language and get a real job. JS has its place but it must rate as one of the most abused languages ever.
So it's up to the ad industry: stop trying to exploit the user, or you will be blocked. Site owners should start to be aware of that. Otherwise your business model will become unsustainable.
Regardless of whether the ads use JS or not, the page overhead is the offense. The advertisers are using your gear to - theoretically - sell stuff to you. They not only expect the use of your gear for free, but hope you will toss in a tip as well. They could stick to static ads using minimum real estate on the screen, ditch the animations, voice overs and the rest of the television ad wannabe cruft and the number of complaints and contempt directed at the industry would almost vanish. But broadcast TV is dying and the admen want a new home with a minimum learning curve.
Stupid users should upgrade to 128-core processors with 512GB of DDR5 RAM and minimum 20TB PCIe SSDs, and overclock the lot. Minimum 10Gb/s fibre connection required. Here's another pizza advert. $10.50: $8.50 advertising surcharge, $2.00 to make. Please enjoy your improved browsing experience.
I have JS enabled and a blacklist of sites where it's disabled (in $Chrome). The disabled sites are the more or less the newspapers that I read daily. I experimented with globally-disabling JS and having a whitelist instead; it was about the same size so I didn't take that plan further. But it was soooo close, very much on the cusp.
I wonder how long it'll be before disabled JS is popular enough - or the default on enough browsers - that we'll see nagware like the "you appear to have an Ad Blocker..." saying "Please enable JavaScript to continue..." :-(
I wonder how much electricity has been wasted worldwide over the last decade because of browser scripts. I bet it's at least in the terajoule range.
The ad people don't seem to get it. If you make adverts intrusive and annoying, people will start ad-blocking. Once they start blocking, they won't stop. Running a fucktonne of 3rd party JS counts as intrusive and annoying. Advertisers are the ones who have killed the goose etc.
If someone could come up with a decent micropayments service for accessing things like news, that might be better. I pay for news sites I use all the time, but I'm not going to subscribe to sites that I look at 5 times a month.
If someone could come up with a decent micropayments service for accessing things like news, that might be better.
Well, let's do some numbers. Global digital advertising spend is reckoned to be about $350bn for 2019. There are plenty of competing figures; that'll do for me. Now let's guess how much of that digital ad spending ends up in content producers' pockets - Google's cost of sales is 43%. Let's be incredibly and unrealistically generous and assume all of that goes to content owners. So there's around $150bn of global ad revenue supporting content producers. Now let's assume for simplicity that those costs are paid entirely by around 300m homes in developed nations - again, the maths can vary; that's adequate here. That's about $500 a year for each developed-world internet-using household, or about $1.40 a day.
So there you have it: Ignoring the issue of ad-blockers, that's the value of your privacy AND at the same time the implied value of all currently ad-sponsored free-to-web publications. Obviously we'd need steel-jawed legislation to ensure that "pay to view" customers don't have their data scraped by the unscrupulous and untrustworthy big tech corporations (or their smaller accomplice corporations).
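The back-of-envelope sums above, spelled out. All inputs are the comment's own assumptions, not researched numbers:

```python
# Back-of-envelope check of the figures in the comment above.
global_ad_spend = 350e9          # assumed 2019 global digital ad spend, USD
content_owner_share = 0.43       # "Google's cost of sales", generously
                                 # assumed to flow entirely to publishers
households = 300e6               # assumed developed-world internet households

to_publishers = global_ad_spend * content_owner_share    # ~$150bn
per_household_year = to_publishers / households          # ~$500
per_household_day = per_household_year / 365             # ~$1.40

print(round(per_household_year), round(per_household_day, 2))
```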
I'd pay that if it ripped the arse out of Google, Facebook, Amazon and Microsoft's data scooping, where do I sign up?
$1.40 a day sounds OK to me too.
Maybe, instead of wasting time on the internets while pretending to work, I should set about designing a global micropayments service. I reckon you could preserve privacy too by having the service throw anonymous tokens at the sites providing the content and then bundling up all the payments in one.
"Well, let's do some numbers."
For the advertising industry there's an even more terrifying set of numbers, if only the raw data were available: the net value to the actual advertisers. The gross value is easy - the marketing department or consultants who sold them the deal can say how much it cost to place the ads, what percentage clicked through or made a purchase, and what the value of that was. What they can't really get a handle on is the number of people who might otherwise have made a purchase but were so pissed off by the ads that they went elsewhere, and hence they can't add that cost to the cost of their advertising campaign.
I think I have a solution which should please users and advertisers. The advertising industry might have to do a bit of adjustment.
Add a variable that shows the user's current attitude to ads. This variable is used to introduce a weighting into the auction. By default it's zero - the user doesn't care, so everything works as at present, assuming no adblocker: the user gets all sorts of crap shovelled at him on behalf of hopeful advertisers. It's a gamble as to whether it does the advertiser any good, and the actual probability of that is pretty poor. However, if the user sets the value negative, it tells the system that the user really doesn't want to see ads and will likely punish any advertiser whose wares get shoved in his face by not buying from them in future. This will weight the auction against showing any ads without actually using an adblocker (precautions need to be taken to stop the system being gamed by arranging for competitors' ads to be shown). OTOH, if the normally ad-averse user is looking for something, they can set the value positive. At this point the value of showing an ad rises and it becomes worth bidding high.
Under such a scheme the users benefit - they don't get crap shovelled at them all the time, but when they're looking for something to buy they get shown useful ads. The advertisers benefit because they're no longer getting negative value from a lot of their ads and are getting good value from the ones they do show. The advertising industry has to rethink the way it works. At present it only sells one thing: advertising. Not soap, not new shiny, not widgets, just advertising. It needs to change to selling results - not just results which look good when you can't see what damage you're doing, but actual results - and price its services on those, not on ads foisted on the uninterested and unwilling.
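The scheme described above might look something like this in code. Everything here - the reserve price, the linear scaling, the function name - is made up purely for illustration:

```python
def weighted_auction(bids, attitude):
    """Pick the winning ad, if any, from a {advertiser: bid} dict, given
    the user's declared attitude in [-1.0, +1.0]. Sketch of the scheme
    described in the comment above; thresholds are invented."""
    # A negative attitude shrinks every effective bid, so fewer ads clear
    # the reserve; a positive attitude inflates them. reserve is the
    # minimum effective bid needed to show anything at all.
    reserve = 1.0
    scale = 1.0 + attitude          # attitude -1 => effective bids of 0
    scored = {a: b * scale for a, b in bids.items()}
    winner = max(scored, key=scored.get, default=None)
    if winner is None or scored[winner] < reserve:
        return None                 # show no ad at all
    return winner
```

With attitude at -1.0 nothing ever clears the reserve, which is exactly the "no ads, no adblocker needed" case; at +1.0 even modest bids become worth showing.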
The problem with that is that advertisers don't give a toss.
Either they arrogantly think that if they can get you to see an advert - even against your wishes - you'll be persuaded, or they are the sort of third party broker who doesn't care whether the adverts piss you off - they just get paid by the number of impressions.
What you describe would seemingly be in their best interests, but if you remember when spam-blocking started - it was originally something we did at a user-level not a server/corporation level, yet the spammers still tried to get around it.
Similarly with TPS - why phone someone who specifically says "NO UNSOLICITED PHONE CALLS"?
Arrogance and number counting.
I subscribe to two websites, since I value their content, but I leave uBlock Origin enabled. They're major newspapers, one in the US and one in the UK, for which I pay about $7 a month in subscription fees. Both sites allow access to non-subscribers, albeit with prominent notices at the top of articles suggesting you subscribe. It would be interesting to know how much revenue this brings in for them.
"I keep hearing about micropayments, but nothing has come of that so far."
Micropayments has been the mantra since dial-up days at least. Maybe they've been waiting for blockchain to come along to make it practical? I wonder if it will be like fusion power, always some years in the future? Maybe they need some agile devops guys to look into it and maybe kickstart the process with crowdfunding on...erm...kickstarter?
Although this is true for many sites - switching adverts back on is like going back to dial-up on some of them - there is another issue that has started occurring in the past year.
Especially when using a mobile internet connection.
I see this on some webpages that hang for 10+ seconds after all the apparent content has loaded, but you cannot DO anything. Even worse, click on a link, and you are stuck waiting for ANOTHER TLS handshake.
Sometimes this will happen 5-6 times before the websites affected settle down and let you get on with your life.
"I'm convinced that rewarding sites that deliver positive experiences is the path forward,"
I'm sorry, but I'll never see it. Because I browse with an adblocker. The web may slowly, a bit at a time, improve so everyone has the experience someone with an adblocker already has. The only problem I ever run into is a message that says "Disable your adblocker to access this content" which makes me evaluate "Do I wanna read that page if it has ads?" and invariably, that answer is "Nah."
I hope this isn't too far off topic....
I don't usually use an ad blocker, but this is what really tempts me:
A download page surviving on ads, where the ads have been tailored to look like the software's download button.
Yes, I know it's best to just avoid them, but at times it is not practical.
I have a Pi-hole up and running at home - not on a Pi, though. It is good, really good, OK, too good: certain things break, but that is probably my problem, as I've not really looked into tuning it to a happy balance of functionality and blocking the CRAP... Next rainy day!
Once upon a time, web pages were built with simple HTML and CSS. Which is light on bandwidth, and easy for browsers to render.
An example: I have to submit meter readings to an online energy company. The page used to use a straightforward HTML form, and I could fill out all the fields and click Submit. A little while ago the whole site was replaced with one of these damn "responsive" designs, and now that simple form is overloaded with JS that changes it from a single-page fill-and-submit into a three-page fill-one-field-and-click-NEXT time sink. All just to make it look very pretty (probably to serve some ads too, but I don't see those!).
"....has assembled data that clarifies the impact of third-party scripts in the hope it prompts more efficient coding"
As the problem seems to be with pointless (to the user) advertising and tracking scripts, "making them more efficient" seems the wrong solution. Eliminating them entirely seems far more sensible.
Oh really...? So do tell me, does Google still load the invisible little animated thing in the centre of the page every single time - you know, the thing I blocked years ago because it produces a sustained, continuous 50% CPU load on an older machine like mine...? Because I'm basically certain they still do... and after something like that I just don't see them giving any fucks whatsoever to the whole problem.
Just get a 7th gen i7 or a 9th gen i5 then.. Google doesn't give a sh*t about your old ass computer, because the majority upgrade their systems. You should also get new RAM and maybe a new motherboard to support it. Use PCPartPicker if you don't know how, because you can build a part list with compatible parts. Even those who know how computers work (like me) use it as a guide to stay on budget.
I was going to post the same thing.
"Sites need revenue, and the threat of ad-blockers in some cases actually makes the situation worse for the rest of users by triggering convoluted workaround logic and complex disguising of ads that increase script execution time."
Maybe, even for an ex-Google employee, those assumptions just come naturally.
Not blocking ads on the Internet is like unsafe sex in the 80's.
Hulce hopes internet users make the ecosystem better by recognizing good behavior. "I'm convinced that rewarding sites that deliver positive experiences is the path forward," he said. "Let publishers follow the money from there."
I've never understood how the Market worshippers reckon producers automatically work out where the blockage is that's stopping them making more money.
They may just as well decide that slow-loading pages mean consumers need more scripts and heavier images to improve the experience.
I close sites that take too long to load.
A lot of American news sites suffer particularly badly, for instance. The second that tab starts slowing up and affecting the others, I've lost interest in it.
I mean, I kind of get you using a lot of CPU if you're making live dynamic heatmaps, or if I've asked to load a page with a thousand products and images on it, or things like that.
But a simple news story should load as fast as BBC News articles load, whether they have ads or not.
I feel absolutely no guilt about blocking ads. I have run ad-supported websites myself. You only make a pittance - it's really not a viable income stream for 99% of sites at all - and it annoys everyone.
And, as far as I'm concerned, loading third-party code from that third-party's URL and blindly executing it for all your visitors is tantamount to a virus, and certainly an easy avenue for someone to compromise your viewer's or your website in some fashion.
Just today, I went on the TfL site to plan a train journey on a smartphone. Four times I went on, and it worked fine. The fifth time, there was some different ad at the bottom which decided that every time I clicked on the "Starting station" textbox it would try to invoke some kind of popup (which Chrome blocked) and thus stop the keyboard appearing. So I literally couldn't type into the box at all after a dozen or so tries. Reloaded: got the same ad, same thing. Reloaded a couple more times, got a different ad, no more popups, and it started working normally again.
That ad could easily have cost you money, TfL. Or done something even more nefarious.
I'm curious if folks think that the Basic Attention Token idea (from the folks behind the Brave browser) will take off? - https://basicattentiontoken.org (or if you want to reward me with 'BAT' a referral link - https://brave.com/obm729)
I'm happy paying what the ad would have generated in most cases rather than actually seeing it...
How have these idiots skipped Lesson One in user interfaces: when the user does something, show a response ***IMMEDIATELY***, regardless of whether anything has functionally happened yet. Otherwise the user is going to think: oh, nothing's happened, I must have mispressed, I'll try again... s*** I've just ordered 3 million pencils!
A LOT of man-hours went into minimising the impact of ads on load times, but you're tied into contracts with a lot of these advertisers, and if you don't keep up your end of the bargain by including their code where they want, and including their ads exactly where they want then they could and would cut what they paid you. It was a continual balancing act to keep cash flowing in, and trying to minimise the impact of those things on the user experience. At the end of the day we (the developers) were at the mercy of the contracts agreed between our company and the advertising companies, no matter what our opinion was of the negative performance, traffic and privacy implications.
Incidentally we found that during trials, providing subscription/ad-free models was just about as profitable as the advert stuffing route - but only for certain markets. People in the big cities were far more willing to pay a subscription for content, but the rest of the country expected their stuff for 'free'.
I'm guessing that 50% of page load time comes from these shit tracking scripts, but what hasn't been mentioned is the other half (well, 49%). I propose that the other part of the equation is security, mainly the SSL/TLS handshake, certificate chain download, and multiple session encryption renegotiations. This depends on the website, but a lot of mainstream websites need good protection. HTTPS is a good thing, but it comes at a price.
As for 'it's your fault, developers' I don't know about that. If you want a quick and dirty website you'll go to Wix or some canned wordpress theme and ya gets what ya get, no tweaking allowed. I got off the phone with a Google Ads rep yesterday after she stepped me through changing my paid ad campaigns. I'm not using tagged advertising but it's pretty easy to see how someone could have a lot of slop code running on their website without knowing it.
For some time now, I have been wondering about the viability of setting up a premium ISP business with "transparent" ad-blocking, including half-proxying, i.e. fetching the file but never actually delivering it to the browser. Basically I think I'd just need a simple Linux server with some customised Bind zonefiles, and some broadband to re-sell.
I'm sure people would pay extra for no advertisements.
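For what it's worth, the Bind part of such a scheme could be a response-policy zone (RPZ). The sketch below assumes named.conf already references the zone via a response-policy statement, and the ad domains shown are placeholders, not a real blocklist:

```
; adblock.rpz.zone - return NXDOMAIN for known ad hosts (placeholder names)
$TTL 60
@                 IN SOA localhost. admin.localhost. ( 1 3600 300 86400 60 )
                  IN NS  localhost.
ads.example.com   CNAME .   ; NXDOMAIN for this host...
*.ads.example.com CNAME .   ; ...and every subdomain of it
```

The half-proxying part would still need something in the HTTP path; DNS alone just makes the ad hosts unresolvable.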
I think that advanced advert-handling will defeat the kind of ad-blocking you describe, and by "defeat" I mean that if you don't see the ads then you don't see the web page content either. I used to get that at The Register sometimes - not by blocking, but I think when I loaded several pages at once, which either means the ads arrive too late or it's just Not Allowed. I reload the scolding news page and it usually comes up fine.
Another model uses a proxy server on the PC itself, which just refuses to download adverts from known advert sources. Another (I'm not sure this one existed) doesn't pass advert-sized graphics to the browser - though to read the picture size, the graphic file has to be downloaded to the proxy server first, which is your proposition.
Another Opera product is basically a proxy browser: the browser runs in the cloud (not called that at the time, I think) and on your PC or phone (I think the target was phones with low bandwidth) there is something like "remote desktop", so you see what the remote browser is displaying. That could theoretically prevent showing adverts to the user even though they were present in the browser. But I don't know if it did. It is cheating, after all.
But - it can be done.
I think that a browser with a "reading mode" - discarding junk from a web page and showing you the useful stuff - also amounts to doing something like that.
The protocol used to load web pages is a kludge, and a bad one at that. At its root is the practice of carrying out protocol exchanges over a stream protocol, something it inherited from FTP, which is maybe tolerable for the occasional file transfer but is utterly unacceptable for general use, as it's grossly inefficient and none too reliable.
But then, as I've learned over many years of working with this stuff, programmers just don't seem to care. They just grab resources without thinking of their impact on the system (or the network), and when performance comes up short they witter on about "Moore's Law" and grab yet more resources. It's always the user's fault that you need what would once have been a high-performance supercomputer just to open and display a web page.
The rot isn't confined to web pages. All this bloat has spread to the cloud, its management, and the so-called "Internet of Things" (or, as someone aptly put it, the "Internet of Vulnerabilities"). It's a mess, but it's one with so much momentum behind it that I have no idea how we are to unwind it. Maybe we should start by redesigning the web protocols (I believe Mr Berners-Lee has suggested this) - they were a kludge, they're messy, unreliable and insecure, and could very easily be cleaned up and made a whole lot more efficient.
Even the JS fiasco is preventable. Just start with a decent user model and recognize that anything that comes in from 'outside' needs to be sandboxed as its own user with very restricted capabilities. It won't happen, though, because Marketing needs that cross-site capability to snoop on the user... so the arms race continues...
Biting the hand that feeds IT © 1998–2021