Linux's moment
^ That's it, really
If you're running a very old PC but have managed to persuade Windows 11 to boot, it looks like the rug could soon be pulled from under you, judging by a post claiming that Microsoft's code will now require an instruction not found on old CPUs. The instruction in question is POPCNT, which first turned up in Intel Core …
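(For context: POPCNT simply counts the set bits in a register. A minimal C sketch of the equivalent operation, using the GCC/Clang builtin; built with -mpopcnt or -msse4.2 it compiles down to the single popcnt instruction, and to a short software routine otherwise. The values here are made up purely for illustration.)

    #include <stdint.h>
    #include <stdio.h>

    /* Count the 1 bits in a 64-bit value, which is all POPCNT does in hardware. */
    static unsigned bits_set(uint64_t v)
    {
        /* GCC/Clang builtin: one popcnt instruction when built with -mpopcnt,
           otherwise the compiler emits a small software fallback. */
        return (unsigned)__builtin_popcountll(v);
    }

    int main(void)
    {
        printf("%u\n", bits_set(0xF0F0F0F0F0F0F0F0ULL)); /* prints 32 */
        return 0;
    }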
If the target were not constantly moving, Linux would ALREADY have a REAL working Wine implementation that was 100% compatible, and SAMBA networking that had all of the Windows features, and an ENTIRE userbase of former Windows users giving the finger to Micros~1.
Unfortunately NOT the case. Instead, Micros~1 gives the FINGER to the CUSTOMERS, tells us we MUST COMPLY, and SETS ALL OF THE RULES.
It depends what VM software you use, but usually it doesn't matter if you're already running on x64. However, I really doubt you're trying to run modern VMs on a computer with an x86 processor old enough to lack this instruction. While there are boxes with older CPUs in production out there, they usually don't run the latest software versions, whether Windows, Linux, or something else, so this doesn't affect you. If you are trying to run the latest OSes on a computer that old, there are emulation options that will supply those instructions, though expect slower performance if you hit them a lot.
Sadly, sometimes that's true, like if most of your day job uses software with no Linux equivalent.
Plenty of my Windows programs don't run on Win10; there will be no new version, and in some cases the developer is dead. That's why Win10 comes with a VM option.
Run the OS that runs most of your programs. I've not used dual boot either since 2016. At worst I only need to run one XP program every 11 days: my weather station program, which saves the actual weather station data to Access.
I agree in theory, but Betty who comes in four days a month to do the accounts isn't going to fire up QEMU on her Linux box so she can get the payroll done in QuickBooks or Sage in a Windows VM.
Anybody who says otherwise has never dealt with the dear semi-retired ladies like Betty who do the books for small businesses. :D
And medium to large businesses simply don't use Linux on the desktop, so there's no discussion to be had there unfortunately.
you are not wrong about the VM. However Micros~1 is also making it HARDER to run windows in a VM as well.
I abandoned my Visual Studio subscription because of Win 11. If I must run that OS in the future I'll buy the cheapest hunk o' crap CPU I can find and wire it to the KVM as needed.
Seems to me that the $800/yr became impractical and unaffordable. $200 for a cheap hunk o' crap makes more sense.
Never experienced any compatibility problems with Samba, at least on FreeBSD with the appropriate zfs acl stuff enabled. Figuring out how to get Windows to recognise zfs snapshots as shadow copies took me a while, but I got there in the end.
Even on Linux with a less capable file system (not all Linux file systems are less capable), not having the full range of extended permissions doesn't really cause a problem.
One thing I've noticed is that Samba's performance is utterly terrible. This is long standing across versions, platforms, Windows versions and so on. If I copy a 20GB file from Windows to a Samba mount I'll get reasonable speed, e.g. 60-70MB/s over gigabit. If that same 20GB is instead 10,000 files I'll be lucky to average 5MB/s.
It was bad enough that I was convinced there must be something wrong with my installation, until I talked to people who use Samba extensively and the response came back "Oh, no, that doesn't happen". Until you demonstrate exactly the same issue on their systems. Windows<->Windows doesn't seem anywhere near as bad, nor does NFS, whether Windows<->Unix or Unix<->Unix.
I'm open to suggestions, but no I really do suspect it is that people accept shit as par for the course.
Have you got a practical replacement for all the infrastructure and application space that all those users *think* they need? Nice cuddly UI that has consistent functionality and everything in one place? No, because of the fragmented effort that defines Linux.
You and I don't need windows, but a behemoth marketing machine with 50 years history and thoroughly embedded in school IT classrooms is hard to push back against.
A young-ish friend of ours bought the bits to build a PC and had a ton of trouble getting it going. Being known as a computer nerd meant I inevitably got dragged in to get it working - nothing difficult to do really - but the deal was I would sort the hardware only. I adamantly refused to install Windows for them - if they want the hassle of the damn thing they can learn how to troubleshoot for themselves.
I did put Mint on the machine along with Steam, and I believe they haven't installed Windows despite their initial objections... Whether that's because they can't work out how to change it - or found out the hard way that it actually did what they wanted is not a question I've had a straight answer to.
I really don't get why people have such a hard time with the ribbon. It was the same basic concept as the toolbar system it replaced. Things that were under "Formatting", for example, were generally under the Formatting tab. It's just instead of a vertical text list you had buttons going horizontally... which makes sense given the shift to widescreen for monitors where you have more horizontal pixels than vertical. Sure, sometimes you have to check 3-4 tabs to find what you're looking for, but replace "tabs" with "dialog boxes" and it's the same scenario as before. Besides, if you work smarter instead of harder, you can create your own custom tab and populate it with all the commands you frequently use, so then you almost never have to change tabs. Maybe 10-15 minutes of effort up front saves you a lot more on the back end over time.
"It was the same basic concept as the toolbar system it replaced."
So why inflict a change on the users? (see below) And drop-down menus disappear when you've finished with them; the ribbon just sits there occupying more vertical space, so from what I can see it's not a good way to make use of screens that are shallower in relation to their width.
"Besides, if you work smarter instead of harder, you can create your own custom tab and populate it with all the commands you frequently use, so then you almost never have to change tabs. Maybe 10-15 minutes of effort up front saves you a lot more on the back end over time."
An IT pro or power user might well. At the other end of the spectrum it'd probably take a day and end up in a worse situation. All for something that was never necessary except for one reason:
Having been forced into a corner where they had to stop breaking old versions of Office every time a user got sent an older file and, being faced with OO & LO who could read an XML as well as anyone else, they had to put new users in a position where they might not understand the old-style interface. Stuff the existing users, as ever with their hostages, they had to like it or lump it.
"So why inflict a change on the users? (see below) And drop-down menus disappear when you've finished with them; the ribbon just sits there occupying more vertical space, so from what I can see it's not a good way to make use of screens that are shallower in relation to their width."
In the before times, people would have like 6-7 layers of toolbars on their screen taking up 2-3X more space than the ribbon, and at lower resolutions to boot. All so they would have access to maybe one button that they actually used. The ribbon does an excellent job of exposing a lot more functionality than the old menu/dialog box combo while being quite efficient in terms of screen real estate. Flipping between tabs on the ribbon is less visually disruptive than dealing with menus and dialog boxes. Plus, you have visual cues in the button icons to help you locate the function you're looking for. This is much better from a UX perspective than the old menu/dialog box system. I get that a lot of people "grew up" on the menubar, I did too, but the ribbon is actually pretty well thought out and improves usability.
"An IT pro or power user might well. At the other end of the spectrum it'd probably take a day and end up in a worse situation. All for something that was never necessary except for one reason: Having been forced into a corner where they had to stop breaking old versions of Office every time a user got sent an older file and, being faced with OO & LO who could read an XML as well as anyone else, they had to put new users in a position where they might not understand the old-style interface. Stuff the existing users, as ever with their hostages, they had to like it or lump it."
What exactly does one have to do with the other? MS deciding to change file formats and the ribbon UI vs menubars and dialog boxes do not have any connection that I'm aware of. Could you maybe dial down the hyperbole and melodrama about hostages and state your argument in a factual manner?
No. With the previous system I rearranged the WORD menus for my staff so that items sat where, in their workflow/world view, they'd want and expect to find them, because to them certain items naturally went together. For example, a few things fitting into Formatting which Microsoft hadn't originally put there.
Once the cursed ribbon arrived it all became much more time-consuming. Too much so to be practical. I couldn't just move things into a different menu any more. Now you have to create a completely new custom version of that menu, putting in all the original items that you wanted to retain and adding copies of the ones you wanted to transfer, and then completely hide the original menu. For example, if some item isn't considered part of the formatting menu options then I can no longer just add it in; I'd have to create a new menu from scratch, if I could even remember what was in the original.
This is because the ribbon embodies Microsoft's belief that everyone has to work the way they think we should.
"No. With the previous system I rearranged the WORD menus for my staff so that items sat where, in their workflow/world view, they'd want and expect to find them, because to them certain items naturally went together. For example, a few things fitting into Formatting which Microsoft hadn't originally put there. Once the cursed ribbon arrived it all became much more time-consuming. Too much so to be practical. I couldn't just move things into a different menu any more. Now you have to create a completely new custom version of that menu, putting in all the original items that you wanted to retain and adding copies of the ones you wanted to transfer, and then completely hide the original menu. For example, if some item isn't considered part of the formatting menu options then I can no longer just add it in; I'd have to create a new menu from scratch, if I could even remember what was in the original."
I read that, and all I see is a curmudgeonly comment that boils down to "change is bad." Not because you have any particular reasons, just that you personally were entrenched in your way of doing things and don't want to have to change. If there's one truth about life aside from death and taxes, it's change. You can change with the times, or you can be left behind. It's why I've always focused more on the why than the how. Which is to say, I don't memorize where every function is in MS Word, or what it's called in MS Word, I learn the things common to all word processors. So, if some day I'm left with only the option of WordPerfect (Corel even still developing that?) I'm not just sitting there staring at the screen going, "Uhhhhhhhhhhhh...." I may not be as efficient as I would be with Word, but I can at least do something besides sit around with my thumbs up my ass.
Some of the more recent versions of Office have gotten even better. You can use the search bar to find functions and don't even need the ribbon at all. Don't remember what some feature is called, just search for what you're trying to do and there's a good chance it'll come up with the correct function. It takes a bit of commitment on our part to unlearn the old way of doing things and adopt the new one, but it's helpful to take a dispassionate view of things. That way you don't miss out on legitimate improvements just because you don't want to have to go through the hassle of learning new habits. Besides, learning new things is good for you. Helps keep the mind sharp. Feel free to do a little light reading on neuroplasticity if you don't want to take my word for it. People who seek out new experiences tend to remain mentally acute well into their senior years, while those who stick with their ruts... don't.
Also, yes, you can "just add it in". The ribbon is quite customizable. Granted the interface for doing so is a bit clunky, and sometimes trying to find the feature you want is a hassle, but you can remove items even from the default tabs and add in others that you want. So, if you're adamant that things like bold/italic/underline belong on a Formatting tab, not the Home tab, you can remove them from one and add them to the other, or just add them to the other.
"This is because the ribbon embodies Microsoft's belief that everyone has to work the way they think we should."
Can't any of you people make an argument that doesn't rely on a bunch of melodramatic hyperbole? Pro tip: It's not about you. The developers at Microsoft do not sit around all day thinking, "How can we screw with Terry 6 from The Register today?" Seriously, take a step back and just look at it from a UX perspective and you'll see it's a lot more consistent and generally better than the old method. It's not perfect, but it's closer to perfection. These kinds of changes are generally run through all kinds of focus groups and case studies. They will bring people into a lab to test it and give feedback. I've actually done one of those before. Not for Microsoft, but another household tech company name. They were testing different ideas for a new version of their mobile app. I was in a room with someone and there was a 1-way pane of glass so a couple of other people could observe my actions. It's not like someone just wakes up one day and says, "I think I'm going to change <Feature> to work a different way." Any developer that did that would likely find themselves out of a job rather quickly and their changes being reverted, and I think you know it.
I read that, and all I see is a curmudgeonly comment that boils down to "change is bad."
Then you aren't seeing anything.
In short, for your benefit: the Ribbon turned a simple task, one that was a useful time and cognitive-effort saver for the 10 or so highly skilled people I managed (who quite rightly begrudged every second spent away from their substantive professional roles to be sat at a computer), into a complex task that ceased to be practical.
"Besides, learning new things is good for you. Helps keep the mind sharp."
Some of us get more than enough of that thanks to the sort of work we do, and would really appreciate not being given even more opportunities to have to learn new crap just because the software tools we use to help us do that job have decided to randomly change their UI designs for reasons that, to those of us who've used them for longer than some of the design team now employed to keep them updated, make absolutely zero sense.
You can be as insulting as you like towards us fuddy-duddy stick in the muds who'd be happier with our UIs continuing to look like something from the 80's, but that doesn't change the fact that there are sometimes some damn good reasons why those older UI concepts genuinely WERE better than what we've got today, and it completely fails to accept that in some cases, what we're forced to use today is based around some truly awful decisions made by those wielding the power - e.g. the way in which later versions of Windows initially adopted, and have then failed to entirely shake off, a touch-centric UI design because everyone at MS thought Windows for phones/tablets/other touch-enabled devices was going to be the next big thing and therefore we all ended up with that style of UI even when running on a bog standard desktop system where all the input continues to be keyboard/mouse based.
There was SO much work done back in those earlier days to research how humans interacted with UIs, and what therefore made good or bad sense to implement in such a UI. Where's that level of research today? Because I genuinely refuse to believe that anyone with any level of scientific legitimacy could have come up with the idea that making the Windows UI a largely monochromatic affair with (next to) no contrast between adjacent elements, making it damn near impossible at times to tell which bit of the screen relates to which bit of the UI, was an even remotely sensible thing to do. Yet here we are, still largely stuck with it after so many years despite the slight improvements made in more recent iterations.
Then I fire up the emulator for one of the older systems I used back in the good old days, and I'm struck by just how immediately useable those systems were. Not because of how familiar I am with them - in some cases my memories of how to use them are so eroded that I'm as good as being a beginner again - but simply because of how obvious they made it to the user as to what they could do. Things you could interact with looked like things you could interact with, options were all laid out for you either on a giant toolbar or within the non-dynamic menu layouts, and there was next to no effort placed on making the UI *look* nice vs making it *work* well. Some people might consider those old UIs to be rather unpleasant/unpolished to look at, people like me see them and think "there's a UI designed to actually be used, not merely admired from afar".
Yup. I spent 40 years of my life working in specialised areas of education. Part of and managing teams of incredibly specialised teachers, with long waiting lists for our work.
Not one single member would have wanted to spend an extra second on working out how to do something on the sodding computer.*
*My IT skills preceded my teaching, from when I thought of making that my career, and went into the background as a relaxing hobby once I became a specialist teacher.
What about the ones you need once or twice a year? A good menu system is important.
Then the frequently used buttons go on one or two tool windows.
The ribbon, hiding active scroll bars, removing least-used commands from menus, flat design, text that might be a button or a browser link or open a window or a setting or just be text, slide switches instead of checkboxes, etc. are all retrograde UI/UX decisions. Vista "aero" was one kind of stupid. Win8's "phone GUI for a desktop" was another, and Win1x, with only a totally flat dark or totally flat light theme and a single accent colour, is ghastly. Like Windows 2.0 on a Hercules mono card or CGA mono. Sack all the so-called "graphic designers" who think they are designing corporate laser-printed stationery and know nothing about GUI design.
The problem I had with the move from menus to ribbon, and which is STILL an issue in the current Office suite and other things that have adopted that UI style rather than continuing to provide a legacy/old fashioned/user friendly (delete as per your personal opinion) style of UI, is that the way the UI is now intentionally designed to self-modify based on the operating mode of the software means that it's damn near impossible to simply learn what the software can do by randomly looking through all the available menus and seeing what options are listed. It's not simply a matter of clicking on a particular ribbon tab, because the contents of said tab may well still then differ depending on what you're actually doing at the time.
In contrast, with few exceptions, drop-down menus tended not to rewrite themselves based on the state of the software, other than to perhaps grey out some entries to let you know they weren't presently useable, so a) the menus actually WERE a menu of all the options provided by the software, and b) you always knew which part of the menu tree to go to for a given option. Consistency of UI design, enabling users to start committing commonly used options to muscle memory, what a radical concept...
It's now getting to ridiculous levels where, even when you DO know the software provides an option, because it's one you don't use all that often you've now forgotten which sequence of events is needed to expose that option in the dynamic UI, and so you end up having to ask your favourite websearch engine to tell you how to get to the damn option again.
In an age when we have what feels like infinitely more processing power at our fingertips than we had back in the good old days, it simply beggars belief how much harder it now is to get computers to do even basic stuff simply because of the trend for modern UIs to focus on style over substance - they're rapidly approaching the point at which they'll become the desktop equivalent of a Hotblack Desiato-approved spaceship control panel, and unless you know which seemingly blank part of the screen to tap/click on, you might as well just dig out the abacus and notepad, because that'll be the only way you'll get any work done...
I don't recall hearing anyone say they had "a hard time with the ribbon". I recall plenty of people, including myself, saying they hated it: a significant UI change that broke a lot of keyboard accelerators, wasted screen real estate, and offered no real advantage in return.
Personally, I never use toolbars — using the mouse means taking my fingers off the home row, unless I have a proper (pointing-stick) device, and my work machines haven't had those in a long time (because IT insist on purchasing rubbish). The ribbon was just a way to make menus worse.
It’s an efficiency issue. Remember, the Ribbon was introduced many years ago, before modern widescreen monitors became popular. It takes up space at the top of the screen, along the shorter dimension of the monitor. And in addition, most text documents are created in a portrait, not a landscape layout. So the Ribbon reduces the amount of your document you can see at once, while wasting space at the side of the screen.
This is why the Sidebar in LibreOffice is a more efficient UI. It sits to the side, taking up space where there is more space available, letting you see more of your document.
It's more consistent than generally anything you find in *nix land, but I agree it really needs that top-to-bottom treatment it got with Windows 95. Windows 11 was probably the single biggest push towards unifying the Windows UI since Windows 95, though it still fell well short of the goal. For years I've been saying MS needs to lock some developers in a room and not let them out until they've revamped the entire UI to whatever design spec they want to use. I'd prefer it not be the flat drab monochromatic fluid look, but that would at least be better than the mishmash of different styles from over the decades.
I'm happy to use any OS as long as it runs the app I REALLY need.
Linux refuses to even think about running the installer; WINE refuses to touch it.
The developer has retired through ill health, so NO chance of a "native" version, though that might have been difficult because it relies on an MS database system.
If you can drive Windows 7 you can drive 10 or 11. 8 was quickly backtracked on after the scale of the clusterfuck that it was became apparent. The changes that there are, are generally minor and annoying. It can certainly be described as consistently annoying!
MS Office is the killer application, let's be honest, and the new paradigm of search-for-everything largely "solves" the complaints about the ribbon interface. Many other applications are now web based, so that problem goes away. For games, the only other holdout fortress of justified Windows usage, Proton is getting very, very good. I won't say I can run everything, but it is not far off.
Picking up Linux to use basic desktop functionality, launching programs, moving icons around etc. has not been a problem for well over a decade now. However, the moment you suggest adding a hard drive to use as a data dump, with everything that involves (setting up mount points etc.), that's a problem which, if you've never encountered it before in Linux, will almost certainly require a google to figure out. It's not *hard* but it is mildly inconvenient, especially when there are 200 different ways to accomplish the same task and they vary by distro. You also need to know what to look for to solve that particular problem.
In Windows there are more or less only two ways to accomplish the task: either the disk management console or a couple of command line entries. That is, love it or hate it, consistency.
Windows for home use died for me when MS got incredibly heavy handed about upgrade paths. I am a hardware tinkerer, somewhat routinely buying and flipping components. This does not sit well with their activation model. I should say I was very surprised that the thing that chucked it over the edge was, of all things, a gfx card swap that the activation algorithm had decided counted as a new PC. I still keep a copy of Server 2019 around for running certain programs, but beyond that I've had it with them.
Unlike Android, which only sort of uses the Linux kernel, Chromebooks are basically a Linux distro.
Some schools entirely use iPads instead of text books.
Other schools use Chromebooks.
School IT teaching of MS stuff was always pretty useless and wasn't universal. Mainstream education isn't dominated by Apple Macs in the US and MS PCs in the UK and Ireland today.
Home use of PCs/laptops has collapsed due to phones, tablets and consoles, esp. the Nintendo Switch, Xbox and PlayStation. PC gaming is a niche now. A lot of the corporate stuff is now practically using the laptop as a terminal.
Quote: "cuddly UI that has consistent functionality and everything in one place"
Who are you kidding!!!
Windows 7/8/10/11, along with the various versions of Office and IE in the same period, all seem to have completely different user interfaces that are not just inconsistent between applications but almost schizophrenic - the high point for a consistent Windows/application UI was probably the Windows 2000/Office 2000 era and it has been downhill ever since...
If I had a nickel for every time someone has said that this is the year of Linux on the desktop or something similar, only for usage rates to remain basically flat... If you do your homework ahead of time, so you know exactly what to expect, Linux is great. If you just say, "Screw you, Microsoft," and slap on a copy of Linux in a fit of rage, it's going to blow up in your face.
It's not laptop vs desktop, it's the fact that ChromeOS is not able to run arbitrary apps, it's basically just the bare minimum needed to get Chrome to run. That's not really what people mean when they say "Linux distribution." Android technically meets the definition, but since it's Linux with a custom windowing environment, it also doesn't really qualify for what people mean when they say "Linux distribution."
Android and ChromeOS are operating systems with Linux underpinnings, but they are not Linux distributions, in a similar way to how MacOS is an OS with FreeBSD underpinnings. If you dig really deep, you can get *some* of the functionality of Linux or FreeBSD, but the OS as a whole is not intended to be used in that way, and even Android/ChromeOS/MacOS don't function like they're based on Linux or FreeBSD on the whole. You can't make any of them act like an arbitrary Linux/FreeBSD distribution.
"Chromebooks are Linux."
ChromeOS is based on the Linux kernel, so yeah kinda.
Can you install any Linux application? No.
You can install *some* Linux applications if you use the "Linux (Beta)" feature, but support wasn't great the last time I checked.
ChromeOS also lacks the userland you typically see in a traditional Linux distribution.
The only reasons that ChromeOS machines are attractive to organisations like schools are because the devices are cheap to buy and easy to manage. That's pretty much it.
>> Linux on the desktop or something similar, only for usage rates to remain basically flat...
Ahh, but Linux usage rates being flat is largely due to Microsoft having put an arm-lock on PC manufacturers to pre-install Windows on their user facing products.
What about the server world, where machines do not normally have a pre-installed OS? I understand servers mostly run an enterprise grade Linux.
There is no single Linux distribution even nearly big enough to combat the MS pre-install lock-in on consumer machines, and few retailers offer an OS-free product range, probably even fewer offer a Linux pre-install.
Hmmm, I wouldn't count on Linux being a long term alternative to Windows for this reason.
If Microsoft has finally started making SSE4.2 a core requirement on Windows, one reason to have done so will be that performance of the OS is faster as a result. And, that makes sense; SSE4.2 has some nice instructions in it, and Intel (and AMD) went to some effort to design it so that could be used to make your software faster.
Linux cares about performance too. Linux has also been dropping older CPUs, albeit only getting as far as chopping i386 off its supported list. At some point, they too are going to want to start making use of SSE4.2 instructions. Whilst the Linux kernel project does not directly care about competing with Windows, a lot of its customers do care about speed / efficiency. If Windows started looking substantially faster, well, it would be embarrassing. Linux is potentially quite vulnerable to such pressure, as its network stack is in the kernel, and there is therefore quite a lot of pressure for cryptography to happen in the kernel; WireGuard already is. And if the kernel doesn't go that way, libraries might easily do so.
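As a rough illustration of the sort of thing SSE4.2 offers (not a claim about what Windows or the kernel actually does with it), here's a sketch in C using the SSE4.2 CRC32c instruction via intrinsics, with a runtime CPUID check so older CPUs fall back to plain C. The function names are made up for the example.

    #include <stdint.h>
    #include <stddef.h>
    #include <nmmintrin.h>   /* SSE4.2 intrinsics (_mm_crc32_u8 etc.) */

    /* Hardware CRC32c: the target attribute lets this one function use SSE4.2
       without compiling the whole program for it. */
    __attribute__((target("sse4.2")))
    static uint32_t crc32c_hw(const unsigned char *p, size_t n)
    {
        uint32_t crc = ~0u;
        for (size_t i = 0; i < n; i++)
            crc = _mm_crc32_u8(crc, p[i]);
        return ~crc;
    }

    /* Plain C fallback for CPUs without SSE4.2: bit-by-bit, slow but portable. */
    static uint32_t crc32c_sw(const unsigned char *p, size_t n)
    {
        uint32_t crc = ~0u;
        for (size_t i = 0; i < n; i++) {
            crc ^= p[i];
            for (int k = 0; k < 8; k++)
                crc = (crc >> 1) ^ (0x82F63B78u & (0u - (crc & 1u)));
        }
        return ~crc;
    }

    uint32_t crc32c(const unsigned char *p, size_t n)
    {
        /* GCC/Clang builtin that checks the CPU's feature bits at run time. */
        if (__builtin_cpu_supports("sse4.2"))
            return crc32c_hw(p, n);
        return crc32c_sw(p, n);
    }

That dispatch-at-runtime dance is exactly what an OS can skip if it simply requires the instruction, which is presumably part of the attraction for Microsoft.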
Blame Intel
If you want the real reason why this is a problem, look no further than Intel's haphazard approach to SSE over the decades. They did MMX, SSE, SSE2, SSE3, SSSE3, SSE4.1, SSE4.2, AVX, AVX2, FMA3, etc. This was lunacy.
Whilst they tentatively wobbled towards all of those up to FMA3 (kind of the bare minimum standard that's actually useful to software developers using this kind of thing), other architectures went straight there far earlier. For example, PowerPC got AltiVec, which went from zero to pretty much everything you need in one mighty leap. I can't speak for ARM, but I'd hazard a guess that they've not cocked about as much with what's in their SIMD extensions.
Had Intel actually sat down, thought it through properly and gone straight there, none of this would now be a problem (or an excuse). One of the reasons FMA3 took so long to introduce on x64 was that Itanium had it, and Intel were using its presence there and absence on x64 to try and drive up demand for Itanium in the HPC community.
Their haphazard approach to this has pretty much meant no one has used SSEx in any software ever. And, if one considers how old SSE4.2 now actually is and how much older Altivec is (another 10 years or so), you can see how badly Intel has managed all this.
I think the general portability of Linux would mean that there is a decent chance that older processors will be supported for quite a while. With compiled code one could support most of the various SSE, AVX CPUs at the cost of extra binary bloat. The hand rolled assembly would likely be the biggest obstacle.
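GCC can even do that per function these days with its multi-versioning support: list the CPU levels you care about, and the compiler emits one clone per level plus a resolver that picks the right one on the running machine. A minimal sketch (the routine is hypothetical, just to show the mechanism):

    #include <stddef.h>
    #include <stdint.h>

    /* GCC/Clang emit one copy of this function per listed target plus an
       ifunc resolver that selects among them at load time based on the CPU's
       feature bits. The extra copies are exactly the binary bloat mentioned
       above. */
    __attribute__((target_clones("default", "popcnt", "avx2")))
    uint64_t count_bits(const uint64_t *v, size_t n)
    {
        uint64_t total = 0;
        for (size_t i = 0; i < n; i++)
            total += (uint64_t)__builtin_popcountll(v[i]);
        return total;
    }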
Those wanting the most performant Linux have tended to compile at least their own kernel for quite some time. Not that I noticed a massive improvement when I last did it.
Windows Vista drivers powering printers, webcams, game controllers, scanners and many other peripherals still in fact work on Windows 11 without any modifications to the drivers themselves. You do in some cases have to enable compatibility options and disable Credential Guard, HVCI and VSM, and that will degrade system security back to the inferior models provided by those older releases, but that is only a small price to pay. For those who want more security, Microsoft also offers replacement generic drivers providing core functionality for a large number of old peripherals using standardised interfaces (e.g. UVC for webcams, IPP or PS class for printers). In the case of printers, Windows maintained official support for old fashioned NT 4.0 kernel-mode drivers all the way through to Windows 10 (provided you were working with a 32-bit install) which is an unparalleled level of backwards compatibility.
They even kept component driver compatibility between releases if you've got working Windows 7 drivers for old network and sound cards. For graphics cards, a WDDM 2.0 driver designed for Windows 10 will "just work" on Windows 11, keeping very old NVIDIA cards working long beyond their supported lifecycles, even if there's no officially supported motherboard/CPU combo for it. For example: An NVIDIA GTX 480 card from 2010 still technically works on Windows 11 even though you're unlikely to be able to get it to function properly on modern UEFI-only hardware due to the old-fashioned BIOS requirement needed to initialise it.
I think what Microsoft does when it comes to hardware compatibility is more than fair, with decades-old stuff still working just fine.
You only think it's "fair" because you don't know anything better. We enjoy better compatibility for older hardware.
We have userspace driver frameworks for those kinds of peripherals too...
Windows Vista? They made perfectly good sound cards go in the garbage because they banned hardware processing because they couldn't control that with their DRM (at that point there were some models you could force to use a software driver if you edited .inf files but that was shitty compared to what you had). Windows 7 removed gameport support from sound cards.
Nvidia knows how to eat a bag of dicks, but they provide legacy drivers for older cards on Linux. Further, there are open source drivers that will accommodate them. Other, more open graphics hardware has even better support for old models (e.g. old Radeon cards). The kernel can still use old, non-mode setting drivers too with a few parameters set. Old Matrox cards and an AGP bus? No problem.
There are windows-only printers and devices that just can't work, but you'd likely be having trouble with those on newer windows as well because they didn't stick to the API frameworks.
"You only think it's "fair" because you don't know anything better. We enjoy better compatibility for older hardware."
You left out the bit where the hardware supported is a very small subset of what Windows supports. What Linux supports, it tends to support well, but what it supports is pretty limited.
"Windows Vista? They made perfectly good sound cards go in the garbage because they banned hardware processing because they couldn't control that with their DRM (at that point there were some models you could force to use a software driver if you edited .inf files but that was shitty compared to what you had). Windows 7 removed gameport support from sound cards."
Just... no. Vista introduced a new driver model that included a lot of security improvements. There's no reason SoundBlaster or anyone else couldn't have written Vista drivers for their hardware, but they chose not to. Probably because they saw the writing on the wall and the AC97 chips were more than good enough for the vast majority of users, so why throw good money after bad trying to support legacy products?
"Nvidia knows how to eat a bag of dicks, but they provide legacy drivers for older cards on Linux. Further, there are open source drivers that will accommodate them. Other, more open graphics hardware has even better support for old models (e.g. old Radeon cards). The kernel can still use old, non-mode setting drivers too with a few parameters set. Old Matrox cards and an AGP bus? No problem."
The open source drivers, last I looked, tended to be rather shite. They were software rendering only and didn't support DRI, things like that. They also tend to lag well behind in their support of newer cards. If you want to use that shiny new RTX 40XX card, you have to use nVidia's kernel modules. Unless you're content with using it like video cards from the 386 days, when they were little more than relays for the signal to the monitor.
"There are windows-only printers and devices that just can't work, but you'd likely be having trouble with those on newer windows as well because they didn't stick to the API frameworks."
There's nothing inherent about these devices that makes them Windows-only. "Winprinters", "Winmodems" and the like just offload some of the processing to the computer instead of using dedicated circuitry on the device. There's absolutely no reason why someone couldn't write the necessary control logic into a Linux kernel mod or CUPS. However, when Linux use remains a pretty static 3% (give or take) it just doesn't make a lot of financial sense. It's the old 80/20 rule of business. About 80% of your revenue comes from about 20% of your customers. You may be perfectly willing to sell your wares/services to the other 80%, but you don't spend a lot of resources trying to attract them because it's not a good use of those resources.
It is true that by moving a lot of the processing to the PC instead of the device, it makes for more work updating the drivers, and especially porting them if the driver model changes, but a lot of the time the mfr has absolutely no intention of ever providing any updates to the drivers. They just make X number of units, sell them, and then move on to the next model. If people are concerned about e-waste, this is where their efforts should be focused, not on Microsoft forcing people to get a new PC once a decade or so.
My i7 3770 with 32GB RAM is a totally usable computer unless you want to play the latest games. Sure my newest computer is about 10 times faster, but most of the time that is the difference between waiting 10ms and waiting 100ms, not really noticeable in real-world situations outside of animation.
To make sure applications developed a long time ago still install and run on later versions, including Insider preview builds. Not everyone likes to put their perfectly fine PC in the landfill and be forced to spend money just to run Windows. Not like every developer has buckets of space and money to replace hardware when Microsoft dictates it.
To be fair, a computer that old with 32 GB of RAM is extremely uncommon. Last I checked, Microsoft was still lying to people, saying that Windows 10 & 11 will run on a system with 4GB of memory. Yeah, it will boot, but if you so much as launch a browser, it becomes unusably slow. I always tell people 8GB is the bare minimum, (ok for browsing, streaming video, e-mail, etc...) while anyone who uses their PC for work will benefit from 16 GB or more.
I was wondering who these supposed Windows 11 enthusiasts were, and what sort of chemicals were in their drinking water.
I do have a Thinkpad that's more than a decade old. Don't remember what CPU it has, though, and it's at the other house so it's not convenient to check. It has Win7 on it, as I haven't gotten around to transferring everything off it onto my newer Thinkpad before wiping it and putting Linux on.
Newer Thinkpad has Win10 because I needed it for TurboTax, and while I don't mind running Windows under a VM, the Windows came preinstalled so I don't have installation media, etc; switching it to Linux and putting Windows under a VM would take some time and care, and I just have better things to do. So Linux (SUSE or Kali, depending on what I'm doing) runs in a VirtualBox VM on it.
Slightly off topic, but I remembered that the Alpha processor came in versions with differing instruction sets. Tru64 Unix would emulate the missing opcodes in software when running on the less capable processors. I came across this, as the emulation caused some software compiled on machines with the more capable processors to run incredibly slowly when sent to clients with older kit.
Not at all. You can implement POPCNT using some shifting and adding. It doesn't rely on any peculiar aspect of the processor. Yet, if you do implement it in software, it will turn one instruction into about seventy, so if you find yourself wanting to do that frequently, you might benefit from the CPU doing it for you.
I'm not sure what this was meant to tell me. Its number is similar to mine if we let them use the generalizations that don't fit the situation: their algorithm assumes one iteration per bit, and their loop will need more instructions. If the loop is not unrolled, there will be jumps as well. Even if it is unrolled, they need two instructions per iteration: one to shift and one to add. That comes to 128 instructions for a 64-bit register or 64 for a 32-bit one, which is close to the 70 I estimated. I'll stick to 64-bit ones for the other approaches.
Their alternatives aren't necessarily better. The lookup table cuts it to eight cycles containing three instructions (assuming my mental compiler isn't as rusty as it probably is) but it uses 256 bytes of memory which will need to be cached and originally calculated.
The third method loops through each set bit and performs three operations (subtract, and, add). So for a value that's mostly zeros, it's great, but for all 1s, it's 192 instructions.
All of these are also destructive to v (the value being checked), so budget in time to replace the original value from cache. Of course, actually telling how fast these are will require figuring out how fast the instructions are and how fast POPCNT in hardware is. I only estimated instruction count, not running time, but I'm guessing the hardware one is faster or they wouldn't have added it.
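For reference, the three approaches being compared look roughly like this in C (the usual textbook forms; my instruction counts above are estimates against something like these):

    #include <stdint.h>

    /* 1. Shift-and-add: one pass per bit, roughly two operations per pass
          plus loop overhead, which is where the ~70-128 instruction figures
          come from. */
    static unsigned popcount_shift(uint64_t v)
    {
        unsigned c = 0;
        while (v) {
            c += (unsigned)(v & 1u);
            v >>= 1;
        }
        return c;
    }

    /* 2. Byte-wise lookup table: eight passes for a 64-bit value, at the cost
          of a 256-byte table that has to be built once and kept in cache.
          (A 16-entry nybble table trades more passes for less cache.) */
    static unsigned char pc_table[256];

    static void pc_table_init(void)
    {
        for (int i = 0; i < 256; i++)
            pc_table[i] = (unsigned char)((i & 1) + pc_table[i >> 1]);
    }

    static unsigned popcount_table(uint64_t v)
    {
        unsigned c = 0;
        for (int i = 0; i < 8; i++) {
            c += pc_table[v & 0xFF];
            v >>= 8;
        }
        return c;
    }

    /* 3. Kernighan's trick: clear the lowest set bit each pass, so the cost
          scales with the number of set bits - great for sparse values, worst
          for all ones, and not constant-time. */
    static unsigned popcount_kernighan(uint64_t v)
    {
        unsigned c = 0;
        while (v) {
            v &= v - 1;
            c++;
        }
        return c;
    }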
"it uses 256 bytes of memory which will need to be cached and originally calculated"
The initial calculation is amortized over the lifetime of the program, and if you're doing enough population-counting to matter, then it's negligible. But the caching cost could definitely be significant, as could the indirection. Of course you can trade off between caching and indirection plus addition, by using a table of 16 entries and operating on nybbles.
"The third method ... for a value that's mostly zeros, it's great, but for all 1s, it's 192 instructions"
Yes, the Kernighan/Wegner/Lehmer algorithm has input-sensitive performance. That also means it's not constant-time, which will make it unsuitable for some cryptographic applications.
Doing it in hardware should always be faster, if you have enough gates available. That's why population-count machine instructions have been around since the 1960s.
It does seem like an opcode that would be invented, and used, specifically to "unpatchably obsoleteify" older hardware.
Population count is a fairly widely used primitive. Its history goes back decades.
Given the age of the computers involved, I doubt this will have much impact.
It's easy to get W11 to install on anything with Secure Boot and at least TPM 1.2. Anything without that is more challenging, so you'd have to be very determined to have got it to work so far. Computers with CPUs as old as those mentioned probably aren't going to have Secure Boot or any TPM.
Windows 11 has been out for a few years now, and they did publish a list of supported hardware. You can complain about $CPU not being on the list for $REASONS, you can even say MS did a piss poor job of explaining the rationale behind its supported hardware list, but you can't say MS wasn't very up front and clear about what hardware is considered supported by Win 11. The fact that it could be used on other hardware was just sort of a bonus, but you had to know you were living on borrowed time. We're also talking about chips that are over a decade old. I'm all for using older hardware that's still good, but you have to balance that against a developer wanting to be able to use more recent functionality. Where the fulcrum lies I don't know, but roughly a decade seems well on the conservative side. It used to be the norm that every time a new version of Windows came out you more or less had to buy a new computer. We've gotten spoiled by the fact that the requirements haven't changed a whole lot since Vista. Something that deserves a little recognition really, but not that anyone will give it.
I'm sure plenty will try Linux and then quickly go out and buy a new computer instead once they realize what they've gotten themselves into. Linux is a great OS... IF you're prepared going in. If you know you can't just plug in any random bit of hardware and expect it to work, that you may not be able to run the latest game on release day, that your favorite bit of software may not have a Linux version or even a very good analog, and all the other little pitfalls, you'll be fine. You just slap a copy of Linux onto your computer expecting it to be Windows by another name, you're going to very quickly find out the error in that assumption. Which, IMO, is probably why despite all the proclamations about how this time Microsoft has gone too far and Linux will finally have its day, usage rates have remained largely static for decades. If people defect from Windows, it's usually to macOS, not Linux.
"We're also talking about chips that are over a decade old. I'm all for using older hardware that's still good, but you have to balance that against a developer wanting to be able to use more recent functionality."
I have relatives running much older hardware than you'd expect. One is a laptop still running on W7. The other is a tower box probably from the same era if not older.
That is on Linux largely because she got hit with early ransomware. Bless the innocent little lambs, they just wrote out the encrypted files and deleted the real ones, which were still there to be recovered, but we took no risks: I partitioned the disk and installed Zorin. The only problem was the vast number of jpegs recovered, all the little bullets and buttons from the browser cache. I think she's in her late 80s now; she wasn't going to bother getting a new computer then and certainly isn't now.
To be fair, that's a pretty edge case and not really representative of most people out there. Even if we take out all the business computers from the equation, and only consider individual users, scenarios like yours would probably need to more than quadruple to even be considered a rounding error. It's great you were able to get that person set up with something that they can use, but they have you to call if they need support. How many random individual users do you suppose have someone like that they can call up whenever they need it?
..."that's a pretty edge case and not really representative of most people out there"...
It occurred to me the other day that the phrase above could actually apply to anyone choosing to use a laptop/PC as their main computing device.
Most people I work with tend to use their phones for most things now. An actual PC sitting on a desk with an external monitor on top of the case is seen as a tool provided by their employer and mainly used to access the MS 365 account via a Web browser.
Interesting times.
That's probably true. At home we have a nice, fairly powerful family PC with a bunch of SSDs and HDDs and partitions, for admin tasks (like my wife's Brown Owl and school governor roles) or stuff like editing and storing photos and videos and anything creative.
Then I've a convertible Lenovo Yoga mostly used to review stuff I've done previously and light editing.
When the main PC is being used, or if I just want to work at the table downstairs instead of in the computer room, or just want to use Linux, I might use an older laptop, which I find is better for typing on than the Yoga.
Because we seem to be doing a lot of stuff that demands a proper PC.
But for everyday life stuff we just use smartphones. I don't need a PC for email, listening to BBC iPlayer and so on.
I guess they don't use CAD or do video editing then. There are quite a number of creative and technical tasks that demand a high resolution display and precise pointer control. For example I do a lot of PCB design and couldn't manage at all with less than a 24" screen.
"I have relatives running much older hardware than you'd expect."
And they are far from being alone. There are a LOT of ancient PCs out there in domestic use. They are doing all that their users want / need them to do, and likely will continue to be used until they fail completely.
Windows 7 is very common, Vista much less so, 8 is practically non existent and XP is still hanging in there in a few cases - just a few weeks ago, I nursed an Athlon XP back to health so its owner could continue checking their email on Outlook Express. Sure, there are other more modern laptops, tablets and phones on the premises, but they're happy with Outlook Express and Office 2003.
I haven't seen Windows 95 / 98 / ME for a long time, but I have no doubt that they are out there, quietly doing some routine tasks that haven't changed for decades and probably never will.
Yes, I know, there are a million reasons to upgrade / replace these dinosaurs. But the people who actually use them see no need to spend money replacing something that does what they need.
So, in twenty years' time or later we will still see W10 or older being used in ATMs and announcement board systems. I still fire up XP now and then just to remind myself how fast a slow machine can run with a small OS. I also have DOS that runs serial ports on old machines. W10 or W11 won't do that.
POPCNT is not supported on PCs around 15 years old, but back then the majority of PCs sold were only 32-bit anyway, and even Apple and Android no longer support computers of that age and genre. At least Microsoft have only just ditched these old machines; Apple began removing support for older PCs and iPads back in 2005... Times move on. 64-bit systems had to become the norm because people expect better quality and faster apps and games, which need more processor power and more capable multitasking CPUs, which the older 32-bit systems would struggle to provide. So it was really inevitable.
Android doesn't support PCs period, and the expectations for mobile devices are different from PCs. And Apple is an entirely irrelevant comparison due to it being a closed ecosystem where you have limited choice of hardware and OS, and they can arbitrarily cut off support anytime they like with no promise of backward compatibility. Backward compatibility is an expectation in the Windows world. The only difference between a 32-bit processor and a 64-bit processor that is inherent is how much memory they can address. It has nothing to do with processing power or multi-tasking capability (other than needing more memory for the bloated applications and OSes). When comparing 32-bit and 64-bit versions of OSes and applications which are capable of running within the sub-4GB limits, there is no performance difference. All of the architecture improvements and performance increases could have been done with 32-bit-only processors, if RAM limits weren't a factor. Virtually everything moved to 64-bit as a practical requirement several years ago, with 32-bit just there for the edge cases, and many others made 64-bit a hard requirement in that time. Microsoft made the change because there was no longer enough need for 32-bit among customers.
Easy!!
(1) "The Ribbon": Julie Larson-Green (born 1962) just knows what you need SO MUCH BETTER than you do!!
(2) "Windows 8.0": Steven Sinofsky (born 1965) just knows what you need SO MUCH BETTER than you do!!
(3) .....plus the corporate aim of Microsoft to change things TO LOCK CUSTOMERS IN TO THE NEXT BIG THING!!! $$$$$$
Except if it's seven years old, it's got POPCNT. It's going to have to be about fifteen if it lacks it. I'd prefer if they hadn't done it as well, but while the restriction on 7th-gen Intel parts is a problem, restrictions affecting Core 2s are less concerning to me. In my experience, people with hardware that old have to be dragged to update their operating system anyway, and they're going to need a new computer when something breaks in their old one and it isn't economical to repair.
So you can have a problem with them not supporting things as far back as 7th-Gen, but other people can't have a problem with not supporting 6th-Gen, 5th-Gen, or older? Other people can't have a problem with them completely disabling installation on older hardware? All that matters is what affects you?
If it's 7 years old, you're not able to simply install Windows 11 and expect that it will be supported and that updates/upgrades will continue to work. Having POPCNT isn't the only issue. You can only install on a 7-year-old CPU by using unsupported workarounds, which may stop working or disable updates at any time.
Plenty of people do still want to upgrade their OS without replacing perfectly capable older hardware that can run the newer OS if there weren't arbitrary requirements for non-essential features. Plenty of people have older hardware that they can't afford to upgrade until they absolutely have to, but still would like to have a current and supported OS that gets security updates. Plenty of people can't afford to get a new system with a processor that is less than 4-years-old (the requirement at the time of Windows 11's release; now 6.5 years). Microsoft didn't need to disable installation on older hardware to begin with, they could have just made it "unsupported" but still functional since there was nothing in the actual code that prevented older CPUs from working for the actual OS, and if something didn't work, too bad. That would make this new change more palatable, to me; even more so if we really knew what the POPCNT instruction was being used for and whether it was actually necessary and useful. My guess is it's related to all the unnecessary graphical features and acceleration.
I just noticed that the release dates for both Windows 11 and the first 8th-Gen Intel processors were October 5. They chose to support EXACTLY 4-year-old hardware on release.
A CPU register instruction that has lurked in CPUs for 15 years is suddenly required for Windows 11, which conveniently disables unsupported hardware. Seems like Microsoft went hunting for a requirement specifically to eliminate unsupported hardware. Is it actually used in Windows 11 24H2 and what function does it serve in this update that it must be present? It is as arbitrary a requirement as all of the others Microsoft has demanded for Windows 11.
All that being said, I also believe that hardware should be upgraded after 8 - 10 years. Expecting support for 15+ year old hardware is frankly ridiculous; backwards compatibility for 8 - 10 years is plenty of time for people and companies to upgrade and migrate to newer platforms. I agree with the argument that old does not mean useless, that there are other functions and purposes these older systems can be used for, and that the world needs to develop infrastructure to reclaim and recycle this e-waste, if we truly want to be "green" and protect the environment rather than just shipping e-waste off to a third-world country to dump it. The only reason reclamation and recycling are not booming industries in this era of "being green" is that there is very little profit to be made. That is the only "green" behind this movement to save the planet.
Even switching to Linux is a stop gap and even Linux will inevitably drop older hardware compatibility. SMBv1 is 40 years old and still lurking in Windows 11 because people won't move away from it.
You contradict yourself. Backward compatibility mostly just means not adding instructions that aren't available on older hardware (or that have known bugs or performance issues on older hardware), but you complain that they added arbitrary and unnecessary instruction requirements that eliminate older hardware support.
While this particular change likely won't be a large direct benefit to most folks due to the age of the CPUs involved, more generally it seems like Windows 11 has been a bit of back door help to the hobbyists and people running home labs and such, since every new iteration of Windows tends to kick a generation of hardware out of the Microsoft ecosystem.
I'm thinking of getting a newer small form factor or NUC or ITX or something for some light duty, and it seems like there's a newer hand-me-down generation of CPUs and whatnot these days, available at somewhat better prices than I used to see.