Personally, I think my education was about the right level and pacing as far as computing went - to give it some chronology, I left secondary school in 2000.
Now, for as long as I can remember, we had computers at home - a ZX Spectrum, then an Amiga, then a PC, each for several years. And from an early age I was introduced to programming by my dad, so it's always been something I was familiar with.
When I first entered primary school, they didn't have computers; I don't remember seeing one in school for the first couple of years. It would have been year 3 when I first saw a computer in school, and it was a BBC Micro. It ran a simple enough program where we entered numbers to draw simple graphs (mirroring the stuff we were doing in class as a whole), plus a few other educational programs. Even then, those of us who were more comfortable with the computer were allowed more time on it, and those who weren't as comfortable (and who didn't go into computing later on) didn't have to use it.
Through year 4 and into year 5, more of the same: those who were more familiar with the computers were allowed more time on them, doing more complex things. I can clearly remember writing stuff in BBC Basic and Logo, for example. Around year 5 was when I started to see the Acorn Archimedes being introduced. Kids were given more scope to get comfortable, since it used a mouse - something that felt more natural to interact with. But the scope of what was done was mostly pootling about with the art program and a few educational things again - it was almost like a second wave to see who was comfortable and would progress, and who wouldn't.
Then I joined secondary school, where it all changed. At this point it was pretty much PCs everywhere, and comfort levels increased too, as it wasn't just a single machine in the classroom but one between two students, or even one each. And we were encouraged to experiment and try things out for an hour per week.
Then on top of that, one module in 'Design and Technology' during year 7 was to implement traffic lights. We had a simple I/O box with 8 outputs and 4 inputs, and turning a traffic light to red was as simple as turning on the relevant output with a dedicated instruction (e.g. 'Turn on 1'), so students could see that a simple instruction had consequences. And of course it got more complicated once multiple lights had to interact - while there's little skill in the implementation, it's a classic case of approaching the problem from the design/analysis side, which is what it was teaching, of course.
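For a flavour of what that module involved, here's a minimal sketch in Python (the real thing was a dedicated box with numbered outputs and its own instructions; the set_output function, pin numbers and timings here are hypothetical stand-ins):

    import time

    # Hypothetical pin assignments for one light on the I/O box
    # (the real instructions were along the lines of 'Turn on 1').
    RED, AMBER, GREEN = 1, 2, 3

    def set_output(pin, on):
        # Stand-in for the box's 'Turn on N' / 'Turn off N' instruction.
        print(f"output {pin} {'on' if on else 'off'}")

    def show(lit):
        # Light exactly the given lamps and switch everything else off.
        for pin in (RED, AMBER, GREEN):
            set_output(pin, pin in lit)

    # The standard UK sequence: red -> red+amber -> green -> amber -> red
    sequence = [({RED}, 4), ({RED, AMBER}, 1), ({GREEN}, 4), ({AMBER}, 1)]

    for _ in range(2):  # a couple of full cycles for the demo
        for lit, seconds in sequence:
            show(lit)
            time.sleep(seconds)

Getting two lights at a junction to interleave safely is then the interesting part, and it's a design problem rather than a coding one - which was exactly the point of the module.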
There was a little more of that later on, but most of the compulsory education thereafter was in ICT - i.e. using computers to communicate - rather than computer science. I don't have a problem with that: we live in a world where using MS Office is a staple of the workplace, and those who knew about programming were actively encouraged to make more of it.
Consequently, when I first encountered programming properly at A level (2001-2), it did not surprise me that we spent a lot of time on proper design and analysis - on database design, entities and relations. We also spent a lot of time in Visual Basic, but using it to implement basic constructs like lists, trees and so on, plus a few lessons on assembly language and even a couple on Prolog. In other words, a decent enough grounding (in my opinion) in computer science in general. Oh, and we spent the grand total of one lesson (1 hour) on HTML - and in fact I led the class in explaining it, because I knew it better than the teacher did!
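To give a flavour of those exercises (the originals were in Visual Basic and are long gone, so this is an illustrative Python sketch rather than the actual coursework), implementing a list 'by hand' looked something like:

    class Node:
        # One link in a singly linked list.
        def __init__(self, value):
            self.value = value
            self.next = None

    class LinkedList:
        # A hand-rolled list, built node by node.
        def __init__(self):
            self.head = None

        def append(self, value):
            # Walk to the end of the chain and attach a new node.
            node = Node(value)
            if self.head is None:
                self.head = node
                return
            current = self.head
            while current.next is not None:
                current = current.next
            current.next = node

        def __iter__(self):
            # Yield values front to back by following the links.
            current = self.head
            while current is not None:
                yield current.value
                current = current.next

    items = LinkedList()
    for v in (3, 1, 4):
        items.append(v)
    print(list(items))  # -> [3, 1, 4]

The value wasn't in the code itself but in understanding the structure underneath the conveniences the language gives you for free.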
And today, after a career in financial services (that I sort of fell into), I work in computing. I feel comfortable with the journey I had: those who weren't interested, or didn't really have much aptitude for it, weren't pushed through it - and probably wouldn't have made much of it even if it had been compulsory - while the few of us who showed an aptitude really went all out for it in the end.
The tl;dr version is that I'm all for encouraging those who have aptitude in an area (just as I had no problem consistently coming last in PE while those who were far better were given preference to pursue their aptitudes), and I'm not sure what you could teach out of comp-sci that would be generally useful to most students. But ICT as it currently stands would be worth teaching to all, because most students will end up using ICT generally.
I also think that the one-day-of-HTML is well-meaning but somewhat misguided: it's like showing an avid reader how part of a book is printed, rather than how to write a book - and that's the part currently lacking.