It's not just the scripting
The protocol used to load web pages is a kludge, and a bad one at that. At its root is the practice of carrying out protocol exchanges over a stream protocol, something it inherited from FTP, which is maybe tolerable for the occasional file transfer but utterly unacceptable for general use, as it's grossly inefficient and none too reliable.
But then, as I've learned over many years of working with this stuff, programmers just don't seem to care. They just grab resources without thinking of their impact on the system (or the network), and when performance comes up short they witter on about "Moore's Law" and grab yet more resources. It's always the user's fault that you need what would once have been a high-performance supercomputer just to open and display a web page.
The rot isn't confined to web pages. All this bloat has spread to the cloud, its management, and to the so-called "Internet of Things" (or, as someone aptly put it, the "Internet of Vulnerabilities"). It's a mess, but it's one with so much momentum behind it that I have no idea how we are to unwind it. Maybe we should start by redesigning web protocols (I believe that Mr. Berners-Lee has suggested this) -- they're a kludge: messy, unreliable and insecure, and they could very easily be cleaned up and made a whole lot more efficient.
Even the JS fiasco is preventable. Just start with a decent user model, recognize that anything that comes in from 'outside' needs to be sandboxed as its own user with very restricted capabilities. It won't happen, though, because Marketing needs that cross site capability to snoop on the user...so the arms race continues....
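To make the "sandbox it as its own restricted user" idea concrete, here's a minimal sketch in Python. It runs untrusted code in a child process with hard resource caps; a real sandbox would also switch the child to a separate unprivileged user (setuid) and filter its system calls (e.g. seccomp), which this sketch deliberately omits. The function name `run_restricted` and the limits chosen are my own illustration, not anything from the original post.

```python
import resource
import subprocess
import sys

def run_restricted(code: str, timeout: float = 5.0) -> subprocess.CompletedProcess:
    """Run untrusted code in a child process with tight resource limits.

    This is only a partial sandbox: it caps CPU time and address space.
    A proper design would also drop to a dedicated unprivileged UID and
    restrict syscalls, so the 'outside' code truly has minimal capability.
    """
    def restrict():
        # These limits apply only to the child, not the parent process.
        resource.setrlimit(resource.RLIMIT_CPU, (2, 2))          # 2 CPU-seconds
        cap = 512 * 1024 * 1024                                  # 512 MiB
        resource.setrlimit(resource.RLIMIT_AS, (cap, cap))       # address space

    return subprocess.run(
        [sys.executable, "-c", code],
        preexec_fn=restrict,          # POSIX only
        capture_output=True,
        text=True,
        timeout=timeout,
    )

result = run_restricted("print('hello from the sandbox')")
print(result.stdout.strip())
```

The point of the sketch is the shape of the design: the untrusted payload never runs with the invoking user's full capabilities, so a runaway or hostile script hits the limits rather than the system.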