Windows stopped being good...
...at NT 4.0. And by "good", I mean tolerable.
On January 30, 2007, 15 years ago this week, the delayed and delayed again Windows Vista was released. We could have fun talking about how bad Vista was, and how much people hated it. It's one of the most unpopular releases ever, along with Windows ME. But if you run a vaguely modern Windows today, you're running a descendant …
It certainly motivated me to figure out how to interact with their OS the absolute minimum amount when I had to use it.
"Pin to Start" is handy, on the few occasions it works
as is win+R
or right clicking the start button , if visible , gives you some quick access to the important bits
The system requirements claimed by MS were the biggest defect with Vista. So many people bought brand-new machines with only 512MB and Vista, and discovered what a terrible combination that was.
Yes, it would boot up eventually; yes, it would open a document eventually. But "eventually" is not why you buy a new PC.
To be fair, it's not the first time MS have screwed up system requirements. Windows XP calls for a minimum of 64MB RAM, with 128MB recommended. Yet it runs positively *painfully* on such systems. I once saw XP on an old laptop with 64MB and it took over 10 minutes of hard-drive thrashing before it booted. Even the recommended 128MB was pretty painful if you wanted to do anything other than stare at an empty desktop.
I think Vista's main issue (apart from the cluttered UI and overly lean minimum specs) was that it was just too heavy for many mid-range systems of the time, and ran considerably more slowly on systems which had handled XP just fine. Windows 7 avoided this because more time had passed, systems were more capable, and it maintained pretty much the same system requirements as Vista - hence people didn't see a performance drop when they installed it, and most average systems were by then well above the minimum specs.
I agree, but remember that, in hindsight, a lot of that was Intel twisting their arm to make the 915 chipset "Vista Capable".
I remember getting a new Dell laptop which came with Vista Home basic and had 512 MB of RAM and it ran like a dog, barely any spare memory left after just running the OS. So after about a week I formatted and installed XP instead and it actually felt like a new laptop then.
This was one of the main problems with Vista: MS marketed the system requirements way too low, which meant many people's first impression of it was that it was like swimming through treacle. And so they went back to XP and never gave Vista another chance. If you had a dual-core PC with 1GB+ of RAM and a decent GPU, it ran pretty well after SP1. Just as good as Win 7 on the same hardware a few years later.
Indeed, my only exposure to Vista was around 2009 when my daughter bought a new laptop (Dell 1545 IIRC) with 3GB of RAM, running Vista. I had no problem with it, startup was reasonably quick and it was generally a bit faster than XP. No real upside to Vista I could see, but it didn't seem anywhere near as bad as I'd heard.
Makes you wonder if they (i.e. M$) are being a touch too conservative this go 'round with Windows XI? What with the cutoff being at eighth-generation Covfefe Lakes? Not sure how this is in, say, the real world though, as I'm rocking a ninth-gen Covfefe Lake Hackintosh with no further interest at the moment in any of MicroSoft's products, sans perhaps the XBOX Series X. (Still kinda hoping that S0NY can manage to get its marde in order though.)
I had a low-specced Vista machine, and I'm here to tell you I would have loved to see it boot in a mere ten minutes. The average was between two and three times longer than that.
(It was a work machine, I don't recall the actual specs if I ever knew them.)
Most of that is cache used to speed up the system. Free RAM is wasted RAM. If it's just sitting there empty, what's the point of having it? Sure you get a warm fuzzy feeling seeing that big number, but what practical value does it provide? It's like renting a storage locker and then intentionally leaving it empty. Why!? Why not keep all those core DLL files in memory instead of reading them off the drive every time some app makes an API call? If you are constantly opening a particular Word document, why not keep that, and Word, in memory?
Fill every last byte of RAM with data that may be needed. The system can and will dump things to make room as needed.
I'm guessing you have more than 4 GB in your machine. Run on a device that only has 4 GB and it won't cache so much. Run memory-heavy stuff on a machine with 4 GB and it will quickly evict cache to make room. It really can run in 4 GB. It can even run in 2 GB without becoming unusable, though it will get slower.
It's the eviction - and then re-fetching from storage - that massively slow the system down.
It would be fair to say that large amounts of RAM consumed by disk cache are not a big deal, because if the system needs that RAM it can simply drop the cached pages; nothing has to be written out to disk first.
And now with flash-based storage there isn't the mechanical random-access penalty of disk paging.
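To make the free-vs-available distinction concrete, here's a minimal C sketch (illustrative only, not from any shipped tool) using the Win32 GlobalMemoryStatusEx call. The "available" figure deliberately counts standby-list (cache) pages, because the system can repurpose those on demand:

/* Minimal sketch: query Windows memory counters with GlobalMemoryStatusEx.
   ullAvailPhys counts standby-list (cache) pages as well as free pages,
   since they can be repurposed on demand - which is why "low free RAM"
   on its own is not a problem. Build with any Windows C compiler. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    MEMORYSTATUSEX ms;
    ms.dwLength = sizeof(ms);          /* required before the call */

    if (!GlobalMemoryStatusEx(&ms)) {
        fprintf(stderr, "GlobalMemoryStatusEx failed: %lu\n", GetLastError());
        return 1;
    }

    printf("Memory load : %lu%%\n", ms.dwMemoryLoad);
    printf("Total RAM   : %llu MB\n", ms.ullTotalPhys / (1024 * 1024));
    printf("Available   : %llu MB (includes reclaimable cache)\n",
           ms.ullAvailPhys / (1024 * 1024));
    return 0;
}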
I think the other problem was drivers. Despite the fact they had a long alpha and beta program, many manufacturers left it to the last minute - or just took ages over it. I bought a new PC a year after Vista's commercial launch - and so at least 18 months after the public beta, let alone the private stuff shown to Microsoft partners. And yet the drivers for the soundcard weren't released until 3 weeks after I bought it, so it couldn't do 5.1.
Pre-release Vista was the system I was working with when attempting to integrate our game (which was supposed to be - it wasn't; we failed - a Vista launch title) into the new and exciting Games Explorer. It crashed so often and generally failed to do anything right that I took to calling it the Games Exploder. A name which, I have been reliably informed, was used informally at Microsoft for a while.
"Games Exploder"
I tried a $1 subscription to the Microsoft service thing that lets you "start" 1 of ~120 game titles available for PC Windows. You had to be "logged in"; no offline play. After enough crashes of the client/license manager/"Game Exploder", I was simultaneously logged in and logged out. Because I was logged in, every game would "start", but because I was logged out, no game would play.
It's amazing how bad that client/license manager is.
"So, pretty much on par with a lot of Microsoft's crap, then..."
No, I was being kind; it's way worse than that, not joking. If you consider the years of experience Microsoft (or any company) has had developing software, then compare that to their final product, it's hands down the worst Microsoft product ever... ever. Hands down. There's no contest to this, it's horrible.
The whole experience feels like an Electron-based "app". However, even if you're not a fan of Electron (like me), you still have to imagine the worst Electron-based app ever made, except that app probably doesn't crash as much as this one.
It crashes without telling you why. It starts multiple instances without letting you know. It pretends to be doing things when it's really not. It pretends not to be doing things when it really is. The UI isn't really a UI; it feels like an optional options menu. Honestly, I think the only reason it doesn't make any kind of "headline" news is because nobody uses it. While it goes unnoticed and is clearly not an enterprise concern, it is hands down the new black sheep... it's amazingly horrible.
I remember paying extra for the 'Ultimate edition' and being promised all these 'Ultimate extras' which would be exclusively released. In the end there were hardly any of these: a screensaver, I think, and a little robot arcade game. So that was a let-down. However, I always liked the actual system. I had a decent-spec PC and the experience was definitely more polished than before. Although I did love Windows XP, Vista did feel a bit more refined in places and a nicer experience. Not perfect by any means, but the driver changes felt welcome, and whilst UAC was annoying you could see what they were trying to do... and I'd seen plenty of havoc wrought on friends' and family's machines by applications abusing their privileges. So I think, looking back, it was a lot better than its reputation might suggest.
...was that, all of a sudden, lots of connected devices didn't have working device drivers anymore. Arguably this is the fault of the vendors of that equipment for not providing updates, but more could have been done. Lots of perfectly functional equipment was rendered useless overnight. I'm looking at you, HP desktop laser printers!
Microsoft were slated for years for the blue screens on XP (and previous versions), which were mostly caused by badly written drivers having direct access to the kernel. So for Vista they prevented most drivers from having that direct access. But most companies hadn't bothered to spend much time/money writing drivers in the first place (which was why there were so many bad ones), so they didn't bother writing new ones for Vista.
Absolutely this. Vista actually forced peripherals to use secure device drivers, and in doing so revealed that almost no vendors had been writing them properly. The fact that MS got blamed for fixing this, often by people who had spent the previous decade banging on about Windows being a security shit show, has always struck me as rather unfair tbh.
Vista was really the point where Microsoft started taking security seriously, and they should have been praised for it. Instead, everyone crapped all over them for it.
At least Vista was not:
* All 2D FLATASS FLATTY FLATSO like 'Ape', 'Ape point 1', 'Win-10-nic', and 'Win II'
* Embedded with SPYWARE and ADWARE
* A marketing platform for "The Store"
* A strong-arm tactic to make you use a MICROSOFT CLOUDY LOGON
* Plagued with forced updates that are often time bombs waiting to stop you from using your computer for several hours or render it un-bootable
(and so on)
I once did a graphic showing Max Headroom saying "C-c-c-catch the Wave! New Windows!" with a Vista logo.
Thing is, Coca Cola quickly realized THEIR mistake with 'New Coke' and LET CUSTOMERS HAVE WHAT THEY WANTED. Microsoft appeared to release 7 as a sort of 'Windows Classic'. THEN they doubled down on STUPID with everything that followed...
"Modern" - it does NOT mean what THEY think *FEEL* it means.
Not sure the Coke analogy really works as part of that rant. A new fizzy drink recipe is arbitrary; updates to a commercial OS are not. OS updates are - and always will be - necessary.
There's a brand loyalty with coke, and people want a familiar experience when they buy a familiar branded product. With an OS there's brand loyalty too, but part of the customer experience of using an OS is to want new features and functionality - it's just that getting the appropriate balance of familiar comfort and new stuff is difficult, and won't please everyone.
This will always be the way - if it wasn't, I'd still be happy using GEM on my Atari ST.
I'd say that, with a little bit of imagination, the analogy actually works quite well.
Changing the recipe of a familiar beverage if, say, one of your ingredients is found to be somewhat less good for people than had been presumed up to that point, would be a necessary change, and if the result is something that tastes not *quite* the same as it did before, then so be it.
Changing the recipe just because you feel like adding, say, a dash of lime juice for no good reason other than you can, would not be necessary, and if the result is something that enough people think is utterly revolting then you really, REALLY, need to own your mistake, accept that your brilliant idea was actually a pile of fetid dingo's kidneys, apologise, and revert to the original, or at least make the original available again for anyone who wants it.
Applying this same would/would-not principle to Windows, we can quite easily separate out all the changes MS have made into necessary ones (security, stability etc.) and unnecessary ones (frivolous twiddling with UI themes, moving stuff around, removing features, adding telemetry etc.).
Not surprisingly, it's these latter changes which incur the overwhelming majority of the wrath directed towards MS every time they demonstrate their ongoing superiority in the "we know better than anyone else how to royally screw up something used by millions, and we really don't give a rat's ass what any of you think about it" rankings.
... part of the customer experience of using an OS is to want new features and functionality ...
Well, up to a point. Users want new things, but they want the old things to stay as they were -- they don't want to have to learn again to do everything they already knew how to do.
Businesses, even more, want continuity. They don't want to have to spend the time and money it takes to retrain the workforce.
"Thing is, Coca Cola quickly realized THEIR mistake with 'New Coke' and LET CUSTOMERS HAVE WHAT THEY WANTED."
Coke also had Pepsi to worry about. MS just has to worry about people jumping ship to what, Apple? Linux? Not currently realistic for many Windows users.
I was sitting in my usual corner of the pub, back to the wall, minding my own business when I spotted him out of the corner of my eye. Skinny, gaunt features, ill-fitting clothes - you know the type. For a fleeting instant I thought he hadn't noticed me, but no such luck. In an instant he drew a bead on me and made a determined straight line approach, rudely pushing people out of his way.
I sighed heavily. It's not for nothing that I always make sure I have a table in front of me in such places. For some strange reason, all the weirdos of the world seem to be drawn to me.
"You're one of those computer Greeks." He proclaimed.
"How very perceptive," I replied pleasantly, as I partially closed the lid of my laptop. "although I'm actually Spanish."
With a dismissive wave he went on. "Computers are evil," he attempted to intone, but the effect was rather spoiled by the squeakiness of his voice.
"Oh really?" I replied, trying to look like I was interested.
Seeing the usual badge, he jumped back a little. "Ahha!" And with a shaking finger he pointed. "Intel!" Crossing himself, he went on, "Spawn of the devil, straight from Hades."
"Well, I know Intel processors get pretty damn hot," I said lightly, "but surely not that hot."
"You may mock," was the stern pronouncement "but in the fullness of time you, and all your ilk will suffer the wrath of the one true God."
"So how can Intel, a mere hardware maker be this evil." I asked?
"They are Satan's enablers of greater evil." He said darkly.
"Oh, come on!" was my exasperated reply. "How on earth do you make that connection?"
Placing his hands squarely on the table, and leaning forward so we were almost eyeball to eyeball, he gave a quick, furtive sideways glance. Then: "The ultimate evil," he hissed. "Windows Vista."
So...
As a friend, I think I should warn you...
Computers are evil.
Plest,
I disagree. I think MS had to do Vista. XP had shown them that they couldn't just update it, polish it and re-release it. It was too much of a security nightmare. Vista was what they had to release after re-writing so much of their legacy code, but trying to keep it as backwards compatible as they could manage.
Windows 7 was what Vista could have been if they'd had more time, but was also helped by the fact that hardware had continued to improve in the intervening time.
The lesson they seem to have failed to learn is Windows 8. Fucking with the UI for no purpose (except for the tiny minority running it on tablets) and then refusing to listen to everyone's complaints. They corrected that quite well with Windows 10 - but seem to have fallen off the horse again by needlessly fucking up Windows 11.
At least Windows 8 worked, though, even if it was bloody hard to use. You forget how hard until you see a PC from before it was updated to Win 8.1, when you were forced to interact with the full horrors of Metro. But for complete craptasticousity it has to be Windows fucking ME.
I think people forget that the added bloat in Windows ME meant it would blue-screen what felt like 3 to 4 times more often than Windows 98 with the exact same software (the Active Desktop crap and the Explorer shell bloat not helping). I bought my first PC with 128MB of RAM (an Evesham Vale). I experimented a lot, and as my parents had a slightly older Evesham Vale PC, I found that re-using their 98 licence and install media worked fine on my PC. As my dad wasn't happy with this, I reinstalled with Windows ME, upgraded the memory to the maximum I could at some point (I seem to remember 512MB, but I'm not 100% sure) and it was then stable.
I did similar.
My first Windows PC at home, a prebuilt from Gateway, came preinstalled with ME (I'd used Amigas up to that point).
I hated it, crash, restart, crash etc.
I, ehem, got hold of a copy of 98 SE, wiped the drive, and put that on instead. No more issues! (Well, not many anyway.)
I've also managed to miss the worst of the recent Windows versions; I went (at home):
* ME (briefly)
* 98 SE
* XP
* Win 7 (64 bit)
* Win 10 (64 bit)
I plan on going Mint next ;-)
Win ME wasn't NT. Vista was NT.
Win ME was pointless garbage; Windows 98SE was better if you needed stuff that didn't work on NT 4.0.
We didn't upgrade from NT 4.0 till after XP came out. Not sure if we waited for SP1. Win2K was the unfinished XP, as Vista was the unfinished Win 7.
As the article pointed out, Vista and 7 aren't too different under the hood; I'm still impressed that if your system could run Vista, there's a pretty good chance it would run 7, 8, and 10 just as well (and probably 11 in a very unofficial, unsupported way).
Vista had some odd design choices (the too-noisy User Account Control popups, terrible file-access performance pre-SP1, and the last-minute sound rewrite that pretty much broke Creative's stranglehold on premium audio all spring to mind), but with the rough edges smoothed off between NT 6.0 and NT 6.1, the solid design principles could shine through.
The biggest positive change I think was breaking the Windows-land user and developer assumption of having admin rights at all times.
Working in the XP era, the normal workflow (especially for programs with 9x lineage) was:
- Try running a program as a limited user
- It does not work
- Call support for the program
- Get told running as a non-administrator was not supported
- Sigh, curse, and try to trick the program into working in a variety of different ways.
With User Account Control on by default (and viciously rabid in its first iteration), this was no longer a sensible option, forcing many smaller software companies to get their house in order. It made things so much better for security in the long run.
I don't love Vista; I never did. But I appreciate what it did right, and how it allowed its successors to do better.
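(Aside: for anyone wondering what "getting their house in order" looks like in code, here's a hedged little sketch, not from any particular product, of how a post-Vista program can ask whether it's actually running elevated instead of just assuming admin, using the TokenElevation query that UAC introduced:

/* Illustrative sketch: ask whether the current process is running elevated,
   via the TokenElevation information class introduced alongside Vista's UAC.
   A well-behaved app checks this and degrades gracefully, rather than
   assuming XP-era admin rights. */
#include <windows.h>
#include <stdio.h>

static BOOL IsElevated(void)
{
    HANDLE token = NULL;
    TOKEN_ELEVATION elevation;
    DWORD size = sizeof(elevation);
    BOOL result = FALSE;

    if (OpenProcessToken(GetCurrentProcess(), TOKEN_QUERY, &token)) {
        if (GetTokenInformation(token, TokenElevation, &elevation,
                                sizeof(elevation), &size)) {
            result = elevation.TokenIsElevated != 0;
        }
        CloseHandle(token);
    }
    return result;
}

int main(void)
{
    printf("Running elevated: %s\n", IsElevated() ? "yes" : "no");
    return 0;
}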
I once tried to delete some data on a network share from within Vista, and it sat there and sulked, updating the estimate to say "About a millennium remaining", and indeed it sat there and did nothing for a while until I cancelled the operation.
I went to the command line and did the same command there and it took about 5 seconds, so clearly something was badly broken in the OS.
I've always wondered about the eye-candy UI overhead on all sorts of Windows things since then, as it seems this behaviour still remains in Windows: a lot of "being busy" rather than getting on with the job I asked it to do. It's not a performance issue, since the command line can achieve the desired outcome on the same hardware.
and it took about 5 seconds, so clearly something was badly broken in the OS.
The biggest problem was that network file operations communicate with Windows Explorer, so that Explorer shows live updates. Couple that with the enormous latency of encrypted* and authenticated SMB1 over TCP/IP.
In theory, that should only make it slow, but in practice sometimes it didn't work very well at all.
*By default, encrypted and authenticated because the same channel is used for login, AD and group policy. Over TCP/IP because file servers need to use a routeable multi-site protocol, right?
The biggest positive change I think was breaking the Windows-land user and developer assumption of having admin rights at all times.
I used to run Win2k from a limited user account. It was annoying, but perfectly possible. Most things that required more privilege could be run using "Run as..." to access the Administrator account (so long as you knew the appropriate password). I used to create a "Developers" group that had (e.g.) the privileges necessary for debugging.
Methinks we should NOT have had to wait for Vista and UAC to train users and developers that limited accounts were the norm. Windows should NEVER have made new users members of the Administrators group by default. The mistake goes all the way back to NT 3.1.
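For reference, the "Run as..." trick is, as far as I know, a wrapper around the CreateProcessWithLogonW API. A rough sketch (the account name, password and target program here are placeholders; never hard-code real credentials):

/* Hedged sketch of the "Run as..." approach from a limited account: launch a
   single program under different credentials via CreateProcessWithLogonW.
   Account, password and target below are placeholders for illustration only.
   Link with advapi32.lib. */
#include <windows.h>

int main(void)
{
    STARTUPINFOW si = { 0 };
    PROCESS_INFORMATION pi = { 0 };
    /* CreateProcessWithLogonW may rewrite the command line, so it must be
       a writable buffer rather than a string literal. */
    wchar_t cmd[] = L"mmc.exe compmgmt.msc";

    si.cb = sizeof(si);
    if (CreateProcessWithLogonW(L"Administrator", L".", L"placeholder-password",
                                LOGON_WITH_PROFILE, NULL, cmd,
                                0, NULL, NULL, &si, &pi)) {
        CloseHandle(pi.hThread);
        CloseHandle(pi.hProcess);
    }
    return 0;
}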
Vista typically came with too much junk pre-configured: utterly pointless widgets that just drained CPU cycles for no benefit whatsoever, and a really, really inefficient file-system interface (Explorer) that was often just unusable. Add in the metric ton of crap that many vendors shovelled onto systems, combine this with pathetically specified systems, and Vista was something to avoid.
Windows 8 was definitely not a "good operating system" (article): the driver model was half broken, and the user interface was "designed" by a muppet who looked at a book on good user-interface design principles and chose to ignore every single point in the book. From mystery-meat navigation, to incomprehensible icons, through ridiculously inconsistent interfaces, to the utterly blinkered stupidity of trying to make a tablet user interface the only user interface for a desktop PC or laptop. In the end the interface was usable on no system at all.
>the user interface was "designed" by a muppet who looked at a book on good user interface design principles and chose to ignore every single point in the book
As opposed to Windows 8.1, where the same muppet took very careful notice of every single point, and then did the exact opposite.
Or Windows 10/11, where he/she tired of trying to parse the book and simply shoved it up his/her nose instead.
The biggest positive change I think was breaking the Windows-land user and developer assumption of having admin rights at all times.
A principle that the dumb developers within Microsoft often break. Install Windows from scratch: congratulations, you must assign a user who will be given administrator access by default. Install Microsoft SQL Server, and where are the data files stored by default? In the Program Files tree, of course, which by Microsoft's own guidance should only ever be written to by an account with software-installation rights, and definitely never used for data or configuration. It was this last point that often caught out incompetent developers, and why they usually stated "must have administrator rights to the local system": because data files, log files and so on were incorrectly written into an area of the file system that should only ever be read-only. After all, it has been possible since about 1995 to ask the Windows API for a suitable data path (1995-ish, mangled by the horrors of Microsoft insisting that Internet Explorer was part of the operating system to try to justify abusing their monopoly position with it).
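To illustrate the "ask the API for a data path" point, a minimal sketch using the long-standing CSIDL folder lookups (the Vista-era successor is SHGetKnownFolderPath); the printed labels are mine:

/* Sketch: retrieve the per-user and machine-wide application-data
   directories from the shell instead of hard-coding paths. Data and logs
   belong under these, never under Program Files. Link with shell32.lib. */
#include <windows.h>
#include <shlobj.h>
#include <stdio.h>

int main(void)
{
    char appData[MAX_PATH];
    char commonAppData[MAX_PATH];

    if (SUCCEEDED(SHGetFolderPathA(NULL, CSIDL_APPDATA, NULL,
                                   SHGFP_TYPE_CURRENT, appData)))
        printf("Per-user data dir : %s\n", appData);

    if (SUCCEEDED(SHGetFolderPathA(NULL, CSIDL_COMMON_APPDATA, NULL,
                                   SHGFP_TYPE_CURRENT, commonAppData)))
        printf("Machine-wide data : %s\n", commonAppData);

    return 0;
}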
One program the company I worked for used did exactly that. It required those of us who had to use it to have more rights than everyone else. It was pointed out that this broke company rules, but it also wasn't negotiable, as the company had picked that software vendor. This was the software that one of our senior programmers described as "warped". He asked if mind-altering substances were in regular use there, because nobody, but nobody, structures databases like that - for good reason.
I used my extra privileges to (amongst other things) change the awful compulsory corporate desktop wallpaper to something a bit more pleasing on the eye.
That reminds me of some software (I can't recall which, but some were building-management-type ones IIRC) that insisted it needed sa access to Sybase (or in some cases MS SQL Server). I said "Not going to happen." In most cases you could work out what privileges they actually needed, but again, crap developers had developed the stuff as sa instead of as a normal database account.
Of course it's true that Windows 7 was based upon Vista, but to justify Vista's existence on that basis is to miss the point. Vista was simply not finished; it was not ready for release and should never have been released until it was at a level of stability similar to that achieved with Windows 7.
At the time it came out I was starting up a business and, to cut a long story short, I simply needed a computer - any computer with a browser. In my haste I went to the nearest available place - a PC World opposite my premises. Yes, I know I should have known better. But as I said, I was in a hurry.
What I got was an Advent(?) branded desktop with the world's shittest resolution widescreen monitor. Running Vista. I can't remember the spec but I know it struggled to render the UI, which it then tried to display on said monitor. A horrendous experience, made physically tactile via the £8 plastic keyboard and mouse that came in the box.
It did the job I needed it to do for almost 2 and a half years and then thankfully broke beyond repair. It was without question the worst computer and experience of using a computer I ever had. Yes the hardware was shit but the OS made the experience that much worse.
Starting a business is hard and that Vista desktop was somehow poetically appropriate to that period of my life. Everything was a struggle, but then everything moved on...things got a lot better.
(The machine has since been replaced by a Mac with a 5k Retina display).
It had a lot of significant under-the-hood improvements. Even if they weren't as extensive as promised, they still laid a lot of important groundwork for Win7 and beyond. They rewrote the GUI using Direct3D, allowing hardware acceleration from the video card. There were improvements to the process scheduler to take into account these new-fangled multi-core CPUs, which are not the same as SMP from a process-scheduling POV. They overhauled the driver model, hardening it considerably. Yes, that meant that all the cheap low-end computers out there struggled with Vista, and all that cheap Chinese third-party hardware might not always work, but it was all a necessary change for the future of Windows development.
We already had direct hardware acceleration from the video card. I had 3D direct hardware acceleration in Win98. And in XP. They re-wrote the video stack because ... dunno. It was a long time ago. Was it part of the 'driver hardening' you also mention?
In fairness, at the time MS would have been saying "Direct 3D now provides hardware acceleration", but that's just normal MS marketing. They *never* refer to previous technology when listing the characteristics of new releases.
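For what it's worth, the bit that genuinely was new in Vista, desktop composition via the DWM, is queryable from code. A tiny sketch, assuming the Vista-or-later dwmapi library:

/* Minimal sketch: ask whether the Vista-introduced Desktop Window Manager
   (the Direct3D-composited desktop discussed above) is currently active.
   Link with dwmapi.lib; requires Vista or later at runtime. */
#include <windows.h>
#include <dwmapi.h>
#include <stdio.h>

int main(void)
{
    BOOL composited = FALSE;
    if (SUCCEEDED(DwmIsCompositionEnabled(&composited)))
        printf("DWM composition enabled: %s\n", composited ? "yes" : "no");
    return 0;
}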
It used a ton of RAM to render those windows, though: a copy in VRAM and another in system RAM. The system RAM copy was removed in Windows 7, so it became lighter.
Also, it removed all GDI hardware acceleration, which meant that almost every application's drawing became markedly slower. Some things returned in 7, but it's never been the same. XP/2003 are much, much more responsive than every version since when it comes to drawing basic controls, even on far superior hardware.
It's so tragic that every time I hit an old server and log on, I'm blown away by basic things like resizing windows or opening folders.
Vista was definitely the point where Windows got bloated, and it never truly recovered. They did some work on that in 7, but every subsequent release has just poured molasses into the code to a comical degree.
Hardware designed for Vista generally worked well. OK, the UI was a bit unexpected coming from XP and Server 2003, but fine, no real issues.
Hardware designed for XP but officially categorised as "Works with Vista" was a lot more haphazard and not as stable... Actually, for me, a lot less so than Win11 - which you can force-install over Win10 on hardware that will not self-update because of "non-compatible" CPUs...
Given I was running a pretty solid version of Ubuntu (I can't remember if it was Dapper or Edgy) at the time I had Vista foisted on me at work, it should have been the easiest sell ever.
And still they blew it.
Even now, with Win11 creating schisms, where is Mint?
The codenames were derived from the Whistler Mountain ski resort, featuring Blackcomb mountain.
Sure, name your project after big tall mountains, that's an idea.
But the third iteration is named after the Longhorn Saloon, a bar at the foot of Whistler mountain. Someone must have had a very good time there, once upon a time.
It was... better than trying to find drivers for XP64?
IIRC that's the sole reason I installed Vista when I built my Core 2 quad with 8GB RAM; 32-bit XP would have made poor use of the RAM (assuming one could tell it how to use more than 4GB at all), and XP64 had a serious lack of drivers.
That's entirely accurate and scary.
The slightly insane thing is that there was nothing really stopping a 32-bit OS from using more than 4GB of RAM, nor realistically any single application either (the hard limit is how much can be mapped into the 4GB virtual address space at any one time).
It's easier to just support a larger address space without messing around; however, this kind of windowing was very common on early MS/PC systems, with the "fun" of EMS/XMS memory management.
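For the curious, the windowing mechanism 32-bit Windows actually offered for this was AWE (Address Windowing Extensions): allocate physical pages beyond what fits in your virtual space, then bank-switch them through a mapped view, EMS-style. A rough sketch, with privilege setup and error handling omitted:

/* Hedged sketch of AWE on 32-bit Windows: allocate physical pages directly,
   then map subsets of them into a virtual-address "window" on demand -
   conceptually the same bank-switching trick as EMS. The process needs the
   SeLockMemoryPrivilege ("Lock pages in memory"); acquiring it, and all
   error handling, are omitted for brevity. */
#include <windows.h>

int main(void)
{
    SYSTEM_INFO si;
    GetSystemInfo(&si);

    ULONG_PTR pageCount = 256;                       /* 1 MB of 4 KB pages  */
    ULONG_PTR *pfns = (ULONG_PTR *)HeapAlloc(GetProcessHeap(), 0,
                                             pageCount * sizeof(ULONG_PTR));

    /* Grab physical pages; these never count against our virtual space. */
    AllocateUserPhysicalPages(GetCurrentProcess(), &pageCount, pfns);

    /* Reserve a virtual window to view them through. */
    char *window = (char *)VirtualAlloc(NULL, pageCount * si.dwPageSize,
                                        MEM_RESERVE | MEM_PHYSICAL,
                                        PAGE_READWRITE);

    /* Bank the physical pages into the window, use them, then unbank. */
    MapUserPhysicalPages(window, pageCount, pfns);
    window[0] = 42;                                  /* touch the memory    */
    MapUserPhysicalPages(window, pageCount, NULL);   /* NULL array unmaps   */

    FreeUserPhysicalPages(GetCurrentProcess(), &pageCount, pfns);
    VirtualFree(window, 0, MEM_RELEASE);
    HeapFree(GetProcessHeap(), 0, pfns);
    return 0;
}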
The absurd published RAM requirements were the worst thing about Vista (well, one of the worst things). Manufacturers blindly made machines with just 512MB RAM, and those machines were simply too slow to be usable.
And it continues today. Windows 10 on a machine with 4GB RAM and a spinning-rust drive is just not usable for real work by ordinary people. We just took on a new customer where half their machines, despite being only 3 or 4 years old, have mechanical hard drives! And they are SLOW... 45 minutes to boot sometimes!
Microsoft should just say 'minimum requirements 8GB and an SSD' now.
Manufacturers blindly made machines with just 512MB RAM, and those machines were simply too slow to be usable
Someday we will look back in fond nostalgia at systems that only needed 512 GB RAM. Such tight, clean programming and minimalist requirements... It's art, I tell you...
That is as long as you had 2x the recommended RAM, or preferably 4x (which is a general rule of thumb for Windows anyway tbh!).
Then just disable disk indexing and it would speed up no end.
Unfortunately it followed the Microsoft trend of OK/bad/OK/bad versions of Windows, being the bad one in between XP and 7. 11 looks to be continuing the sequence, so maybe we need to wait for 12 :)
I still remember reading "So Beautiful, So Disturbing", a blog entry that imagined Vista as an unexpected new wife.
I started pre-Windows, used Windows 1.0 (the Program Manager or File Manager camps, and Mah-Jong, appeared), then 2.0 (a bit more colourful, and more fonts), then 3.0 (big changes, because you got the SmartDrive cache under DOS 5, which really helped it move), then 3.11, which was more business-focused.
After that I think Bill realised that he had to start taking stuff away at the same time as giving new features, so that something that used to be simple to change required 2 sub-menus and an Advanced button press. He needed to keep giving people a need to upgrade to keep the cash rolling in.
I downloaded Linux at that point in '95 (13 hours over a 33k modem, with X11) and have been running them both side by side ever since.
I liked XP, especially for games, but it was 32-bit-and-a-half, with that annoying 3GB memory limit (why not 2 or 4, signed or unsigned, I don't know).
The other feature of Vista, of course, was the very annoying double-confirmation pop-ups, which were thankfully fixed in 7. I liked 7. 10, I think, was a step backwards, and my jury is still out on 11, apart from, of course, yet more things being unconfigurable in Control Centre.
MS always seem to have a "good release", "bad release" thing going on with Windows. (And other things, TBH).
Windows NT 4 - loved
Windows 2000 - not too many happy with it
Windows XP - everyone liked this
Windows Vista - almost nobody liked this
Windows 7 - again, pretty popular
Windows 8 - I don't know anyone who didn't hate this
Windows 10 - a definite improvement over Windows 8, if not Windows 7
Windows 11 - not seen it yet but I suspect it'll be another Vista...