
Alarmist?
I saw no lies in KDE's statements.
As for installing, if you can install Windows, Linux is even easier these days.
Linux desktop darling KDE is weighing in on the controversy around the impending demise of Windows 10 support with a lurid "KDE for Windows 10 Exiles" campaign. KDE's alarming "Exiles" page opens with the text "Your computer is toast" followed by a warning that Microsoft wants to turn computers running Windows 10 into junk …
This.
The REAL problem for people transitioning to Linux was never usability. People always go on about "Linux Desktop has come a long way" ... yeah, newsflash: So has Windows. I used Windows 3.1 and 95, early NT and 2000, and guess what: Non-IT people were about as comfortable in those interfaces as they would have been in early versions of Linux Desktop Environments.
There is a reason Apple became popular among creative types who aren't in IT, long before the hardware itself became a lifestyle brand: Apple's interfaces were, to many people, more intuitive and easier to navigate than the alternatives.
Something lots of people forget when they (justifiably) fondly remember these older Windows interfaces, is that they remember them *as IT people*. To us, these uncluttered, no-nonsense, no-frills interfaces of early Windows versions, where things were where you expected them, and information density was high, were good, they were solid. We like all these things.
But to the average user, they weren't. They were scary. Anything beyond the "Start"-Menu, and even half the stuff in there, and the 2-3 Folders IT pre-set for people to appear as Links on their Desktops, was a no-go-area for most non-IT users.
So no, what limited Linux Adoption was, in my opinion, never how user-friendly the desktop was.
What limits adoption boils down almost entirely to the fact that *MOST PEOPLE NEVER SETUP THEIR OWN OS FROM SCRATCH ON BLANK HARDWARE*, and Windows comes pre-installed on almost any desktop machine under the sun.
There has never been a technical reason for creatives to prefer Apple. Creatives prefer Apple kit for the same reason arty types use cigarette holders and wear berets.
At best it's familiarity, because they used Apple kit when they were taught and just starting out. Not one professional designer can look you in the eye and give you a straight-up technical reason for choosing a Mac over anything else.
Before I was a techie, I did a work placement at school with Yellow Pages where I designed ads using a Mac. The experience was so shit (I got fucking fed up of seeing those two fucking arrows) that I decided to become a network engineer.
To be fair I was already a massive nerd and I built a LAN at home but I just couldn't get past how the network at Yellow Pages was so much worse than my own LAN at home.
People seem to forget that Macs were absolutely awful for decades when it came to networking. Nobody seems to remember AppleTalk and how shit it was, the garbage implementation of SAMBA and the absolutely useless printer and scanner support.
They were a living nightmare for designers for a long time.
> "the absolutely useless printer and scanner support."
Provided your needs could be satisfied with an Apple LaserWriter and an Apple Scanner, you could just plug in and go; remember back then the IBM/MS PC world was still on MS DOS et al…
Remember Microsoft have effectively followed Apple, just that they repeatedly make a botched job of it. Microsoft’s early attempts at networking were just as proprietary, just that with an open platform third-parties could deliver the Real McCoy. Microsoft’s TCP/IP implementation wasn’t and isn’t the best, it’s just bundled with Windows.
I connected a LaserWriter II to the serial port of a WfWG 3.11 machine networked off a NetWare 3.11 server using only the DIP switches and a manual written in French. It was horrifically slow, worked on French time (4 hours out of 8), but the print quality was great.
This remains my greatest achievement in the world of IT.
> There has never been a technical reason for creatives to prefer Apple. Creatives prefer Apple kit for the same reason arty types use cigarette holders and wear berets.
You obviously never tried colour-calibrating Windows 3.1-era PCs — it'd be via the card-specific video driver if it was available at all — or typing much on them that wasn't directly printed on a key.
E.g. Mac emdash: option+shift+-. Windows: hold down alt and type, in sequence on the number pad, 0, 1, 5, 1. Mac, anything with an umlaut or a diacritic, e.g. ö: option+u to pick that particular diacritic, then o to type the character. Like a not-as-good version of the compose key on other systems. Windows: alt plus a four-digit code again, memorised per character (answers.microsoft.com lists some of them).
Windows 95 didn't exactly solve these problems, but — believe it or not! — it crashed a lot less than contemporaneous Mac OS so in terms of getting your day done, there's a solid technical reason. In the six-or-so-year period after Windows 95 but before OS X the most likely cause for Mac needing to be rebooted was a single bad application whereas the most likely cause for a Windows PC was a bad driver.
I'm not saying Windows is any good either...I'm not suggesting a *better* solution, I'm just trying to highlight that there is no good reason to pick a Mac in pretty much any circumstance. Back in the early "skinny" iMac days, the OG Intel iMacs...most companies that chose to buy Macs bought them for the look. It was the white desk, minimalist office and iMac phase...virtually nobody picked a Mac because they thought it was better...anyone claiming they did just looked like a clown...this applies to the whole "Intel Era" because while Macs were Intel based, it was clear as day that you were getting inferior specs for larger sums of money. Now that Apple has moved to ARM, it's ambiguous again...like the PowerPC days...because you can't really drag race ARM hardware vs x86 hardware. It's not...pardon the pun...an apples to apples comparison.
Going back to the dark ages you mention. The main difference for me was the *way* they crashed. At least with Windows, if it crashed, you knew about it...it was pretty explicit. Macs of that era though, they could crash and you wouldn't necessarily know it...stuff would just stop working for no apparent reason. No errors, no messages...just that fucking annoying "DUNK" noise.
The lack of useful feedback is what made MacOS a fucking nightmare to use, troubleshoot and support. Fuck Error 0...and the other one, Error -24 or whatever it was.
100% style over substance...it's the lack of useful feedback when problems occur that probably led to the myth that MacOS is more stable than Windows. You can't meme blue screens of death if there are no blue screens of death...know what I mean?
> There has never been a technical reason for creatives to prefer Apple. Creatives prefer Apple kit for the same reason arty types use cigarette holders and wear berets.
The early Macintoshes were pretty standout (particularly the 512k). Jobs was obsessed with design and typography, and this showed in the Mac's support for fonts and type. Before Windows was a thing, MS released Excel with the Macintosh as the launch platform, and then Photoshop was Mac-only from 1990-1993. So in that era of 1985-1995, there were strong and compelling reasons for visual-creative businesses to run Apple for both business and colouring in (in the audio world of course, some key applications like Sibelius were Acorn-only). Apple had a growing momentum in the design sector, plus credible business alternatives to Lotus 1-2-3 (the "killer app" for the IBM PC and compatibles).
There have always been alternatives, but there have indisputably been periods where Apple was the technical leader in some sectors. The OS stability helped as well - fewer BSODs. For music, Pro Tools didn't come to Windows until 1997.
These days of course it's much of a muchness - Adobe/CO/Serif/BM/Avid. Cross-platform, all acceptably stable. Pick your poison, unless you've wedded yourself to FCP/Motion or Logic Pro. Which is a valid choice - but as you say, not a technical choice as such. You're not getting inherently better output from FCP than cross-platform alternatives.
"MOST PEOPLE NEVER SETUP THEIR OWN OS FROM SCRATCH ON BLANK HARDWARE"
That might be right for new systems, but it's not the general experience of *most* users for most of their "technical" lives.
Depends on how old they are. I'd say with anyone with no experience prior to Windows 7/Vista (though on Vista, recovery partitions weren't the default yet, some machines had them, but most didn't, because Microsoft hadn't fully figured it out yet), you're probably right. Before that, recovery partitions were not a given and it was likely in the lifetime of your computer, you'd probably do at least one reinstall using the factory CD/DVD that came with the machine (with drivers pre-merged) that was essentially a full OS install.
Anyone born after around 2003 is unlikely to have ever installed an OS.
For me, born in the 80s, it was perfectly normal to see people installing an OS, especially after a new version came out. The only way to install anything XP and older was a clean install. You might have been lucky with XP if you had a sysprepped install disc, but most of the time it was just an OEM disc, with a key stuck to the bag with a warranty booklet inside. They were pretty generic for the most part (except for Dell, weirdly; they had their own OEM discs that were occasionally machine-specific).
For my Dad, for example, installing an OS from scratch is very much something he's done quite a few times in his lifetime, and he's not even a techie. Windows 3.1, Windows 95, Windows 98, Windows ME and Windows XP all required manual installs at some stage or another...so at one point, before I became a professional techie, he probably had more experience installing operating systems than I did.
I moved him to Linux a long time ago now, back when Ubuntu 12.04 was the LTS du jour from Canonical. He recently upgraded to 24.04 from 20.04...well I say upgrade, it was a clean install...and after a decade or so of using Linux, he's perfectly comfortable with installing it, I just toss him a USB with the installer on when a new release comes out (creating bootable USB drives is probably more complicated for most than installing an OS).
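For what it's worth, writing the installer to a USB stick is only a couple of commands on an existing Linux box. A minimal sketch with plain dd; here a scratch file stands in for the stick so nothing real gets overwritten, and /dev/sdX in the comments is an assumed device name, not a real one:

```shell
# Sketch: writing an installer image to a USB stick with dd.
# On real hardware you'd point of= at the stick, e.g. of=/dev/sdX
# (double-check the device with lsblk first -- dd will happily
# overwrite the wrong disk). Here a scratch file stands in for it.
ISO=/tmp/fake-installer.iso
STICK=/tmp/fake-usb-stick          # stand-in for /dev/sdX

head -c 1M /dev/urandom > "$ISO"   # pretend this is the downloaded ISO

# conv=fsync flushes writes before dd exits, so it's safe to pull the stick
dd if="$ISO" of="$STICK" bs=4M conv=fsync 2>/dev/null

# verify the copy byte-for-byte
cmp "$ISO" "$STICK" && echo "write verified"
```

Tools like Ventoy or the distro's own USB writer do the same job with more hand-holding, which is probably what most punters should use anyway.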
Moreover, he's been using the same Xeon E3-1230v5 setup with 32GB RAM since 2015. Well, it started out as an i5, but it's had a CPU upgrade, a GPU upgrade (an AMD Radeon Pro WX3100 or something; it's a low-profile, bus-powered GPU with GDDR5) and a RAM upgrade since, so he has ECC RAM now. It's an HP Z240, which is an absolute banger of a machine. Fine, it's the Trigger's broom of PCs, but the point is, it's lasted, it's still going, and he installs his own OS.
Which brings me to another point. The hardware. Linux will run on pretty much anything without issue these days, but some manufacturers go out of their way to build machines that are heavily locked down or contain weird components that only ever have one driver version released...ever...that will only work on Windows. Something dumb like a fan controller, RGB controller etc...and it will have some locked down custom BIOS that has only a few options exposed.
For older relatives, people that don't have massive demanding requirements etc...always, *always* buy a 1-2 year old used Enterprise machine if you want to give them a consistent experience for a decent stretch of time. Either a Dell Precision, Lenovo ThinkCentre or HP Z series workstation; even Fujitsu workstations can be pretty solid...you can usually pick them up cheap and there are generally parts available for years...it actually gets cheaper to upgrade them over time, and for the vast majority of non-specialised workloads, you can get a good decade or so out of one. They are never locked down or restricted in any way, and you don't find "gimmick" hardware in them that won't ever have drivers beyond one version of Windows.
I myself, when I was going through a "skint" patch ran a Dell T1600 for ages...it was an absolute banger of a machine. You can usually get "workstation" machines with entry level components second hand for buttons.
There is a listing right now on eBay for a Dell T5820 with an i7-7800X in it, 32GB RAM etc etc which is currently at sub £200...it will probably finish around £150-£200.
https://www.ebay.co.uk/itm/277154511256
That's an absolute banger for most folks. Can you run AAA games on it? Sort of...does it need to? No!
It's the perfect entry point for Linux though for an average punter...absolutely everything in that will work with Linux no problem.
The trouble with Linux creeps in with weird consumer PCs that come with strange custom designed components and weird motherboards. That's where Linux gets a bit tricky for some. Mostly because of something daft like not having the ability to boot from USB, or having a restricted BIOS or something...
That Dell T5820...without even seeing it in the flesh, I could give you a 100% guarantee that it will run Linux tickety boo with no issues. Some random 8 year old special edition Acer PC bought from PC World on a special promotion...not so much.
> There is a reason Apple became popular among creative types who aren't in IT, long before the hardware itself became a lifestyle brand: Apple's interfaces were, to many people, more intuitive and easier to navigate than the alternatives.
To quote my younger niblings, "This is the way".
When the iPhone came out, I compared it to my Nokia 5800, and the iPhone was inferior in every way. It cost more, it had terrible call quality, battery life was terrible, and it was unreliable. The Nokia had every feature the iPhone had, and was often far more advanced.
And it didn't matter in the slightest, because only geeks could use the thing.
A female friend bought an iPhone and was talking about how awesome it was. One of the features was the weather. Sure, the Nokia had that, too. All you had to do was go into this app, look up your city code and GPS co-ordinates, remember them, exit that app, go back three screens, then down two menus to the weather app, start it up, then enter the city code or GPS co-ordinates, then query the weather.
On the iPhone, she clicked the weather icon, it asked her if she wanted to enter the city or just use the local weather, she clicked local, and she got the weather. No digging through layers of menus, no GPS co-ordinates, no codes, just instant gratification.
Sure, you only had to do it once on the Nokia, but you had to do it. But every app was like that. Settings were sprayed all over the place. The calendar and the alarm clock were completely different apps. The iPhone had a common design philosophy that end users could use without training.
It doesn't matter if Nokia, OS/2, Linux or other technologies "are better" on technical merit if the users can't figure them out. It's like point-and-click cameras. Professional photographers despise them, and will talk about how horrible they are in terms of picture quality. But an unskilled person could pick one up, point it, and take a picture with it without having to configure the f-stop and aperture of the DSLR, which was simply too much effort for them.
Agree, though DSLR analogy a bit wrong - I have a couple of DSLR cameras: Both have the ability to tweak things massively, however both can be used without adjusting anything in default mode - so essentially point & click.
Auto mode is far from perfect, but usually gives an OK image. I often have limited time to grab a shot, as I mainly use cameras to record wildlife sightings, so the subject is often quite mobile and may be concealed by vegetation; in those cases I quite often just use auto*, because if I tweaked things to give a better image, there's a strong chance the target of the shot would be long gone by the time I had adjusted the settings.
* You can normally tell by creature behaviour if you will only get a fleeting chance. If it's obvious there is the luxury of a decent amount of time to grab shots, then I either flick to one of my "presets"** (if a suitable one exists) or, if no suitable preset, manually adjust settings for best results. Obviously my use case is a bit unusual, but I have been surprised on a few occasions by how good a job automatic mode managed.
** Most DSLRs have the ability for you to set up a few commonly used "favourite" settings as "presets" of some sort (how quick and easy these are to select (or not!) depends on the camera make and model).
I don't think "alarmist" means dishonest. You can tell only the truth and still be alarmist if you describe things as more serious than they are. As long as the things you're describing are all real, your view on their seriousness is only your opinion. I don't have much of a problem with KDE's phrasing of the situation. There will be vulnerabilities in an unpatched Windows 10 system, Microsoft would make similar claims if anyone was reading them to encourage people to update, and so I accept them. I do find the phrasing a bit more strident than I'd have written it, but that isn't something they need to do anything about.
This post has been deleted by its author
This is a misconception. Vista was actually very stable, it was just really, really fucking unbelievably slow...which gave the impression of a full system crash, but it wasn't actually crashed.
It didn't help that Vista was also initially launched on super underpowered kit. This was the time of Centrino and Atom. Two garbage mobile CPU solutions.
If Microsoft hadn't come bursting into the room, chewing its tongue, dribbling its dinner down its front and trying to force Windows Starter Edition onto Netbooks like the lumbering demented oaf that it is, I suspect laptops would look very different today.
Asus: We've designed a Netbook called the EEE PC, it runs a stripped back lightweight version of Linux to save battery power, make it lighter and create a new class of laptop for light work on the go.
Users: Derp it'd be nice if we could run Outlook on it.
Microsoft: Neeerrrrrrr!!!! NNNGGGGH!!! Let's just put Windows on it.
Users: Netbooks are slow and shit, what a waste of time.
Asus: :'(
I doubt it. Netbooks running Linux may have been a little faster, but they wouldn't be if people tried to do Linuxy things with them. For the nontechnical user, the Netbook was essentially the forerunner of the Chromebook, a computer designed mostly for remote applications with specs to match. The 2-8 GB storage on the first EEE PC, for example, wouldn't have worked very well if people tried to run much local software. It was snappy as an SSD though, so as long as you didn't get close to filling it up, what it did would go quickly.
The situation is very different for those of us who were at home with Linux, though I knew many people who were big fans of Linux and hated netbooks nonetheless because they considered them underpowered and lacking important features. Still, our technical knowledge means we're better at fitting things into constrained areas and knowing when we can't accomplish that. Nontechnical users are less aware when the hardware they have is going to be incapable of doing something well enough and, when it is but you need to work at it to succeed, they don't have all the tools we have.
"It didn't help that Vista was also initially launched on super under powered kit."
It wasn't that the kit was underpowered; it's that the OS used too many resources. Same with every version since.
Every laptop I've ever owned has been given to me for free, typically because the owner found it "too slow" and bought a new one. They run just fine with Ubuntu on them, but definitely are slow under Windows. Hmm...
Well, maybe. Here's an interesting feature of Linux Mint installation which I discovered recently ...
* Boot from a Live CD
* Start the installer
* Choose to install multimedia codecs
* Cancel the installation before committing (in my case to check how big a UEFI partition I needed, but there could be other reasons)
* Attempt the install again
* Discover that you can't, because when you chose to install multimedia codecs the installer told UEFI that they are needed, but they are not on the Live CD version and so cannot be installed when UEFI demands them
And that's it. You now have a computer on which Linux Mint's installer simply will not work, ever again. I worked round it by using Ventoy, by the way.
I reported it as a bug to the Linux Mint people - installers should not be committing anything to disk until the final go-ahead is given, and certainly not during information gathering. I got a sympathetic reply agreeing with me but saying that they didn't write the installer so couldn't do anything about it.
And that exemplifies what is, to me, the biggest problem with Linux: the diffuse nature of responsibility and support and the need to find the random person in Nebraska who could sort a problem.
Yes I love my Linux MX boxes.
I have three running now,
as one won't play DVD's,
another boasts only 800x600 screen resolution available on my BenQ 16:9 HD monitor, so squares are rectangles and circles appear - well, oval....
The third box is fine, er, but my external 2TB hard drive recently lost functionality whilst using it with MX Linux.
All the Linux tools under the sun could not read the disk or rescue the hard drive, or the data on it. The OSes on each Linux box say "drive is MOUNTABLE", but none were able to mount it AGAIN.....
Eventually, I got the damnable thing working again using the Windose 10 disk management tool. At last I could reformat the drive ext4 again and bring it back into use........
Still no idea what happened to it whilst it was attached to that Linux MX box !!!! The Linux OS is fine
Oh, nearly forgot - this box refuses to print anything even though it can see the Canon Pixma printer and has had the drivers installed, several times now......
Oh no, this Linux box is unhappy. It won't Boot Up anymore without my interaction. What's that ? Enter device password, or Press Ctrl D to continue ..........I haven't set a device password, I checked to make sure - so I GUESS Ctrl D will have to do.......
F00kin' SHEESH, only a nutter like me would put up with this rather than go out and buy a working for now Windose nightmare.
ALF
I'm excited for this. I have a ThinkPad that I need to get over to Mint when I get the SSD upgraded; I'd settled into Win10 under duress but its time has come. I'm expecting to like one of the desktops Mint provides more than anything else, but I used Kubuntu as my daily driver back when Windows 8 was trying to break everything (I do love configurability and a bit of translucency). If they play this right they could turn the tables — I got onto KDE and Linux as a desktop through word of mouth back then; if they can get this manifesto out well then I have a reasonable chance of getting off Win11 at work!
After all these years, I am still forced to read Penguinistas propaganda about how easy it is to install and run Linux........
What utter BOLLOX.....
It really depends upon your hardware, as much as it does for Windose 11.
No, the hardware doesn't need to be as up to date; however, the drivers for things like sound cards, graphics cards and monitors need to be in your chosen Linux distribution's library, otherwise your 16:9 monitor will only run in 800x600.......
etc, etc, etc........One of my Linux MX boxes will not play a DVD, another has only 800x600 available for the monitor's resolution, another refused to record and play through in Audacity........and I have never had this issue previously using Audacity since year 1.
So no, peeps, Linux is not plain sailing no matter what the Penguins think.
A good idea changing from Windose to Linux, YES, HOWEVER just to prepare you, it might not be as simple as some Penguins make out.
ALF
Your problem isn't that Linux doesn't work. Your problem is that you don't know enough to get it to work. But as a Reg reader you must be far better informed than the average Windows punter. That's why I don't see a mass movement to any form of Linux after Windows 10 becomes unsupported.
...then you just pay the price and get a new computer. One of those small boxes that you can attach to the back of a monitor with double-sided tape would do.
Meanwhile, if you don't pay The Man every month to use your system you're probably 90% of the way there anyway. If you need to keep the Windows system for a specific application then just move all your daily driver stuff -- mail, web, word processing etc. -- to the Linux box. If you find you just have to use the Windows system all the time then an investment in a new system will be justified. If it's just a security blanket, you need it "just in case", you'll eventually be able to retire the old system.
Then there's also dual boot.....I have no idea why this sort of article never mentions this; it's almost as if it's written by someone from the Microsoft marketing department.
> Then there's also dual boot.....I have no idea why this sort of article never mentions this; it's almost as if it's written by someone from the Microsoft marketing department.
The article doesn't mention dual boot because dipping your feet in the Linux world isn't its point. The point being, for most people Win10 needs to go away. KDE is offering a solution that doesn't require buying new hardware.
And really, how many people 'dual boot' any more? My grub rarely has anything beyond one OS install and the recovery tools. If I have Windows and want to play with Linux, there's WSL, Docker, or a VM. And if Linux is my base, QEMU, etc. That way there is no need to shut down everything and boot to another OS, only to find out you need something on the OS you just left.
"My grub rarely has anything beyond one OS install and the recovery tools."
That's you. Linux is your world like many of us here. But this isn't aimed at people like us. It's aimed, as I said above, at people who don't necessarily differentiate between the computer and the Windows OS running on it. Short of a friend's Mac running something named after a big cat or whatever, the very idea that a computer wouldn't have Windows is next to incomprehensible to them. Martin's quite right: dual booting will be reassurance - a way back if they feel they need it, and a way of running what they need until they find replacements. On a practical note, leaving the Windows partitions in place lets them be mounted on Linux, say as /c-drive, so they can access their old data until they move it across to their home directory. Computing isn't just about H/W and S/W - it's about data first and foremost.
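Mounting the old Windows partition that way is a one-line fstab entry. A sketch only: /dev/sda3 is an assumed device name (find the real Windows partition with lsblk -f or blkid), and the ntfs3 driver assumes a reasonably recent kernel; older ones can use the ntfs-3g FUSE driver instead.

```
# Hypothetical /etc/fstab entry -- /dev/sda3 is an assumption,
# check your real Windows partition with: lsblk -f
# ro = read-only, so nothing on the old C: drive can be damaged;
# nofail = don't hang the boot if the partition is ever removed.
/dev/sda3  /c-drive  ntfs3  ro,nofail  0  0
```

After a `sudo mkdir /c-drive`, `sudo mount /c-drive` (or the next reboot) makes the old data browsable without risking it.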
I think dual booting is too complex for most people.
If they need to keep Windows, why bother with Linux?
"as I said above, at people who don't necessarily differentiate between the computer and the Windows OS running on it"
I doubt those people will be able to understand having two OSes on a computer.
I too dual boot Windows 10 and Linux. After October my Linux setup will replace my Windows one as I switch more of my activities, such as they are, to Linux.
I am not a fan of KDE; Deepin is my distro of choice. I've used it for ages as it runs Microsoft Edge and Thunderbird perfectly, as well as many other Windows applications. That said, Win10 will remain on my box running games as I am bone idle and cannot be bothered to reinvent the wheel; I won't let it near the web though.
I have W10 and Linux Mint dual boot on my laptop. It came with W10 and I kept it for very occasional work use. I could get rid of it now, but the process for doing so is so horribly intimidating that I just let it take up 40GB of SSD. Haven't booted it for years and have no intention of ever doing so again.
I have a 10+ year-old laptop with a Mint/W7 dual boot. The W7 boot was botched during a Mint update many moons ago and I never invested enough time in managing to repair it, so it's been sitting unused since. I rarely use it now anyway but keep it as it's the only device left capable of reading DVDs. Like you, I haven't reclaimed the space yet but keep promising myself one day I will!
I used to shrink and preserve the Windows partition whenever I installed Linux. Dual boot was possible, but obviously a PITA switching wholesale between two different environments. Being able to cross mount files from Windows to Linux was useful, being able to run a VM in Linux, on top of the Windows install was even better. Until you tried to boot the Windows partition in native mode, and then back again to VM mode. Bad things. Last 4 laptops I have been given were never booted into Windows start up, before plugging a Kubuntu install USB stick.
Some of the extreme reticence to switch from Windows 10 to 11 surprises me. Forced obsolescence I accept is awful. But aside for that, from an admin point of view there’s not much to learn, and if you don’t like the start button in the middle just move it to the corner. They’re both Windows. They’re both dull. But in terms of the difference between the two it’s quite minor really. I’ve been using both for years now. Feels like old news!
I’ve switched gui in Linux so many times lately I keep forgetting which one is which. I loved KDE until I didn’t, Cinnamon was ok, XFCE was great but a bit glitchy rendering video on my system. I didn’t like Gnome but once it was installed I forgot to keep trying others. And after a reinstall to KDE the week before last I miss it. So maybe Gnome is my favourite? Didn’t see that one coming, but then again I thought Metro was quite good as well…
I don't really care whether W11 looks like W10 or not. We have three main computers in the house, 2 running only W10 and 1 dual-booting between W10 and Linux. Not one of them can run W11, whatever it may look like, because they don't have the required TPM, and I am not going to pay to replace 3 perfectly good, working, computers just to satisfy some Microsoft salesman.
TPM is an expensive issue for sure. My issue is more the push for Microsoft to control the login security on your systems. They continue to make it harder and harder to use local accounts, expecting your systems to authenticate through them. Both on the Home version and the AzureAD (Entra ID) push for businesses. I don't want my logins controlled by SaaS. I don't want my logins dependent on Internet access. I don't want Microsoft determining who can log in. I don't trust Microsoft's security.
It's not that Win11 is "bad", it's completely "fine" as far as versions of Windows go. It's the arbitrary "retirement" of Win10 that's running on hardware that's still perfectly functional. It just seems that in previous "out of service" times, the hardware has been basically obsolete by the time the OS was. Win XP machines would struggle to run Win 10 and modern software etc. Now I'm collecting a pile of perfectly good Intel 6th-8th gen PCs, with SSDs and 8GB of RAM. But MS says they are now e-waste. Anyone that's used Linux knows those are perfectly good desktop PC specs for your average home user. Heck, drop a GPU in them and they can play games.
So yeah, any friends / family that ask about the Win 10 EOL for their home PC, I just offer them a newer Linux box, usually for a case of beer. From the user point of view, switching to Linux Mint from Win10 is no more of a step than switching to Win11. They want web browser / e-mail / YouTube / FB Messenger and maybe something that looks a lot like MS Word / Excel. Heck, my 70-something MIL has been using an e-waste Linux laptop for years, and hasn't had a problem (except for forgetting how to do updates, and getting warnings from Chrome).
The "Linux industry" needs to seriously get its act together and start appealing to ordinary users who wouldn't know a distro or a command line if it bit them. 1. Recognise the potential. 2. Stop having so many damned distros, each crap (or highly specialised) one reflects on all the rest. 3. So people can leave the Win ecosystem but keep their Win applications, make a Windows emulator part of Linux, don't rely on WINE or VMs, make it easy, and make it darned well work properly.
Remember, an OS should simply let people use their programs to Do Work with, then get out of the way.
But yes, I hear your comment about the "Linux industry" not being a single coherent entity. But what really is needed is someone (Linus?) to recognise the potential and work to make that the obvious option for ex-Windoze folk. Doh, that would be another distro. Aaarghhh.
It has been tried, and in my experience unfortunately failed. I was able to buy a netbook (remember them?) with Linux preinstalled, and also more recently an HP all-in-one machine running Linux. It was overall a disappointing experience in both cases. The distros chosen were nothing mainstream, had some unwelcome hackery to make them work on the hardware, and were generally unsuitable as a Windows alternative. I only chose the hardware as it was cheaper for not having a Windows license, and was already certified to run Linux (however awful the distro). After installing something more usable, both were absolutely fine for everyday computing.
It is definitely the "just install Linux" part of the process that is the big hurdle for many.
I'm not so sure about that. Chromebooks sorta worked, but MS failed miserably with the two entirely different versions of Surface, although mostly because they made two totally incompatible OSes look the same at the desktop and called them both Windows. As others have mentioned above, the majority of home users see a "PC" as running Windows and that's where their experience stops. Apples are NOT PCs to most people, they are "Apple" and so of course are different. Apple built that impression with their "are you a PC or a Mac" advertising back when Apples actually were a quite different beast to a "PC"[*]
* IBM co-opted the term "PC" to mean their take on a personal computer. The term PC or Personal Computer existed long before IBM joined the club.
I don't think too many distros is a problem. Really, there are just a handful that most people would be looking at.
The problem is too many have issues with installs. If you are new to Linux the last thing you want to do is play whack-a-mole with things that don't work after installing. Few people install Windows; all those annoying little problems are resolved by the hardware vendor. They know what components are most compatible with the OS they are installing, and choose accordingly. If you are repurposing a Windows machine, you're using the hardware you have and hoping it's all supported. Then there are the problems that crop up from the options that you choose.

I've done a lot of Linux installs lately as I searched for a new distribution, and there are a lot of little things that need fixing. Example: the package manager worked fine during install, but once booted into the OS it had intermittent problems contacting its servers. Turns out the IPv4 DNS servers were never making it from DHCP to the network config. And the DNS servers are no longer in resolv.conf, which gets wiped at every boot. And then KDE Discover has trouble talking to the package manager because of some library version issue... Somehow it knows that there are updates pending, but it can't pull a list of installed applications.
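For anyone hitting the same symptom, a quick way to confirm the DHCP-to-resolver handoff is broken is to look at what actually landed in the resolver config. A minimal sketch (the path is the conventional one; on your distro systemd-resolved or resolvconf may be the thing rewriting the file):

```shell
#!/bin/sh
# Count the nameserver entries that actually made it into a resolver
# config file; prints 0 if the file is missing or has none.
count_nameservers() {
    c=$(grep -c '^nameserver' "$1" 2>/dev/null)
    echo "${c:-0}"
}

n=$(count_nameservers /etc/resolv.conf)
echo "nameserver entries in /etc/resolv.conf: $n"
if [ "$n" -eq 0 ]; then
    echo "no resolvers configured - check what rewrites resolv.conf at boot"
fi
```

On systemd-resolved systems, `resolvectl status` shows the per-link DNS servers DHCP actually delivered, which is the other half of the comparison.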
Lots of distributions does dilute the developer pool some, but I suspect that quite a few that work on niche distros would simply not work on Linux at all if their only option was contributing to RedHat or Ubuntu.
"And then KDE Discover has trouble talking to the package manager because of some library version issue... Somehow it knows that there are updates pending, but it can't pull a list of installed applications."
This is something KDE really needs to get a handle on. I can only imagine that this is a serious case of "works for me". The fact that that site says "trust your package manager" or the like means they really have their eyes closed to all the queries online asking how to make Discover work, along with the occasional post saying that this is how somebody (thought) they'd got it to work, and it still doesn't work for others trying the same fix. No, we can't trust it, because the only thing it can be trusted for is to not work, other than running updates.
A bit of thinking helps here. The OP mentioned that Discover (it's KDE so why not Diskover?) somehow knew when there were new updates. How would that happen?
1. The system tray raises a notification when apt update, run manually, finds upgrades available, so it's just watching where the results are being listed. It also raises notifications without apt update being run manually. That's easy to understand because
2. Cron or at are periodically running apt update, either set up by Discover itself or by the apt system - it doesn't matter which. So that
3. When the user runs the upgrade in the Discover UI all it needs to do is run apt upgrade.
In essence Discover isn't really being a package manager to do this, it's a wrapper for the system package manager.
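The wrapper idea is easy to sketch. Assuming a Debian-style system, the whole "knows updates are pending" trick needs nothing more than apt's simulate mode (my own sketch of the idea, not how Discover is actually implemented - it really goes through PackageKit):

```shell
#!/bin/sh
# Count upgrades apt would perform, without changing anything.
# "-s" simulates; each "Inst " line is one package that would upgrade.
pending_upgrades() {
    apt-get -s upgrade 2>/dev/null | grep -c '^Inst '
}

# Applying them is then just the real command (root/polkit needed).
run_upgrades() {
    apt-get -y upgrade
}

echo "pending upgrades: $(pending_upgrades || true)"
```

A cron job running the counting half and raising a notification when the count is non-zero gets you the tray behaviour described in point 1.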
Turning to the other side of the UI, where it's supposed to list applications available to install, perhaps it depends on something built in which isn't implemented properly.
But hang on - Discover isn't yet another .deb package manager, it's a KDE package manager for distros running KDE but which might have .deb, .rpm, Flatpak or something else as their packaging format. So either the distro maintainers have to set it up specifically for the distro (in which case we have to look to Debian) or, more likely, it determines at run time which package manager(s) it can find and uses the tools provided. As KDE produces the Neon distro (an Ubuntu spin) I'm pretty sure that if I run up a Neon instance Discover will be working - they'd notice if it wasn't - which means that it can work properly with Ubuntu's implementation of apt (or it could be dpkg) but not Debian's. The error it reports looks like a network error but the reality is likely to be either:
a. A missing or differently spelled utility
b. A difference in options
c. A difference in formatting of the tool's output
If it's (a), it can't be reported at install time as a missing dependency. Discover depends on one or more package management systems, but they can't be declared as formal dependencies of the Discover package, because then it would fail to install on any distro whose repository doesn't carry the package managers inappropriate to it. It really is - unavoidably - a "works for me" situation for the original developer, because it needs to be tested and debugged on each distro in which it's included. What it should be doing is reporting the actual errors it encounters.
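The run-time probing described above boils down to something like this (again a sketch of the approach, not Discover's code; the front ends named are just the common ones):

```shell
#!/bin/sh
# Probe for known package-manager front ends and report the first hit.
# Everything distro-specific after this point - option spellings,
# output formats - is exactly where "works for me" creeps in.
detect_pm() {
    for pm in apt dnf zypper pacman apk; do
        if command -v "$pm" >/dev/null 2>&1; then
            echo "$pm"
            return 0
        fi
    done
    echo "none"
    return 1
}

echo "package manager backend: $(detect_pm || true)"
```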
I now have a few ideas as to what to go hunting for.
But hang on - Discover isn't yet another .deb package manager, it's a KDE package manager for distros running KDE but which might have .deb, .rpm, Flatpak or something else as their packaging format.
Discover uses packagekit-qt6 to talk to the various package managers. I have seen some threads about issues with Arch and pacman from last year. The system that is causing problems is Devuan with apt. I haven't had time to track it down. I just wanted it to browse to see what apps are in the repository. I was sure there'd be a JDK that wasn't 3 years old.... I'll drop to a command prompt for updates anyway. I've got another system to build next week on a completely different hardware spec, I'll see if it has the same problems.
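Since Discover talks to the PackageKit daemon, PackageKit's own CLI, pkcon, exercises the same path without the Qt layer - useful for telling a Discover bug apart from a backend one. A guarded sketch:

```shell
#!/bin/sh
# Query pending updates through PackageKit itself, if it's installed.
# If this fails the same way Discover does, the fault is below Discover
# (PackageKit or its apt backend), not in the KDE front end.
pk_updates() {
    if command -v pkcon >/dev/null 2>&1; then
        pkcon get-updates 2>&1
    else
        echo "pkcon not installed"
    fi
}

pk_updates
```

`pkcon refresh` and `pkcon search name <term>` cover the metadata-refresh and repo-browsing paths in the same way.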
>> If you are repurposing a Windows machine, you're using the hardware you have and hoping it's all supported <<
I have Linux Mint Mate installed on four different laptops:
1. A moderately recent Lenovo ThinkPad.
2. A somewhat older Lenovo ThinkPad.
3. A much older Dell Latitude.
4. A positively ancient Unibody Macbook.
It installed and worked on all four, without any faffing around.
Granted, installing Mint Mate on my Asus Chromebook wasn't an unqualified success. It works if you have plenty of time to spare and don't need sound. But then, I'd categorize the Chromebook as an "edge case". And good luck installing Windows on it...
I have Linux Mint XFCE installed on two identical Lenovo ThinkCentres, attached to two identical Brother wifi-equipped printers by USB. One will only print via USB and the other will only print via wi-fi. I have three ThinkPads with Linux Mint XFCE as well. Two of them print fine, the X220 can't even see the printer, whether as wifi or as shared.
I think there's two things at play here. Mint seems to be one of the better distros for out of the box hardware support and Lenovo explicitly support Linux. As I've mentioned here a number of times, their own stand-alone bootable USB diagnostics tool is Linux based and so by definition needs to support all of their hardware on Linux.
"If you are repurposing a Windows machine, you're using the hardware you have and hoping it's all supported."
If you have a Windows machine with only 802.11b/g/n and you want 802.11a you can plug in an external USB/WiFi adapter - they mostly work OOTB. Now install Linux. External USB/Wifi adapter will probably not work without finding the driver source, compiling either directly or via DKMS - but first you need to install compiler and associated cruft - not part of the initial install (except Slackware?).
It's not always that easy to repurpose a machine.
I think you mean 802.11ac or 802.11ax not the ancient 'a'.
Many USB WiFi adaptors will work with Linux out of the box, no driver / kernel module or compiler needed.
How do you find one? Easy, just search for Raspberry Pi compatibility, if it works with that, it will work with any Linux distro.
"2. Stop having so many damned distros, each crap (or highly specialised) one reflects on all the rest."
Could you provide a car analogy?
"3. So people can leave the Win ecosystem but keep their Win applications, make a Windows emulator part of Linux, don't rely on WINE or VMs, make it easy, and make it darned well work properly."
This emulator; what would it look like? Might it look a lot like WINE? How would it differ, apart from calling itself not an emulator?
This emulator; what would it look like? Might it look a lot like WINE? How would it differ, apart from calling itself not an emulator?
Simple, it would actually work! It would provide the same functionality as, say, W10 and even allow the use of W10 drivers and FULL access to the hardware.
Could you provide a car analogy?
Sure. There are, for all practical purposes, only two interfaces in use for cars: manual and automatic. Sure there are a lot of different engines (processors), bodies (laptop vs desktop) and styling cues (desktop images) but the car industry realised a hundred years ago that all cars had to be basically the same, with minor decorative variations.
Oh, and nobody gets all pissy about people saying "VW Golf" instead of "Bosch/VW Golf".
There are, for all practical purposes, only two interfaces in use for cars: manual and automatic.
Really? On an automatic is it a centre console or column mounted shifter? Is it a handbrake lever or foot parking brake? Are the indicator stalks on the left or the right? Windscreen wiper controls - different on every car.
There are a lot more differences between cars than you realise, and people can generally cope with them, the same with differing desktops.
Win applications, make a Windows emulator part of Linux, don't rely on WINE or VMs
Totally AGREE 100% the only reason I have not moved to linux is it will NOT run the applications or use the hardware (printer and colour calibration) that I have. Wine is a failure and I tried a VM, but it will not allow me to use the full features of my screen hardware.
For me linux is not yet fit for purpose, no matter what the linux lovers say.
"3. So people can leave the Win ecosystem but keep their Win applications, make a Windows emulator part of Linux, don't rely on WINE or VMs,"
What do you mean by "a part of Linux"? Build it into the kernel?
There have been multiple distros over the years that have come with WINE integrated, but none have been successful.
The only real solution is to sell machines with Linux preinstalled. It can be done with marketing (Chromebooks actually did it!) but it needs that marketing budget and big company backing to sell anything to the larger markets.
"Remember, an OS should simply let people use their programs to Do Work with, then get out of the way."
Does that mean using the same programs as on Windows? You can already use most of the same web browsers. For other common software, are people really that reluctant to use Libre Office instead of MS Office?
What part of Linux? I really don't care, as long as it works to run Windows programs from the menus. Asking whether it would be in the kernel is *exactly* the sort of question that puts people off Linux.
And as for Libre Office, a decent sized complex Word document such as I used to build pre-retirement is *not* exchangeable with LO in a corporate environment without glitches, format problems and others.
>” The "Linux industry" needs to seriously get its act together and start appealing to ordinary users…”
That is effectively what Canonical has been doing since 2004…
To really move things forward needs Linux systems available on the high street. Which means breaking the MS stranglehold on PC manufacturers. That Dell and Lenovo now do sell a few systems with Linux pre-installed is both testament to Canonical's efforts and a measure of just how well embedded MS is in the PC supply chain.
(This was intended to be a reply to Nematode)
It's a tricky one. I know most/many Linux users welcome the choice, and I do now, after using Linux exclusively for a decade. But I remember the first step away from Windows (8.1) and it was tricky; I took the advice of a Linux-using friend who suggested KDE Plasma. It was OK, but I eventually switched to Linux Mint Cinnamon. On first ditching Windows I did find the range of Linux alternatives quite bewildering and somewhat off-putting. All I wanted was something that was easy to install, use and maintain and didn't require learning arcane skills. The learning curve for Mint was minimal and the software I used the most was included in the installation: Firefox, Thunderbird, LibreOffice, VLC media player etc. The one thing I missed the most was Visual Studio; having earned a crust as a software developer I was in the habit of knocking up software for my own use. The closest replacement was Qt but I never really got into it.
At least KDE understands that M$ is handing Linux a better window of opportunity than it's likely to ever get again. And this window of opportunity won't remain open indefinitely. Depending on individual needs and personal preferences, some good Linux-based distro suggestions for Linux beginners are as follows:
ChromeOS (Flex or Chromebook),
Fedora (Bazzite, Aurora, Budgie Atomic, Kinoite, Nobara),
LMDE, MX Linux, Kubuntu, Zorin OS, Vanilla OS.
If you think this window of opportunity is unique, it’s possible you weren’t around the last time. I recall very similar discussions about the widely-hated Vista & Windows 8, and I confidently predict the outcome will be the same this time.
Plus ça change, plus c'est la même chose.
Well done to the KDE folks for driving a bit of engagement though.
I disagree on most of that, but agree about the likely results. I think this window is much better for Linux's chances than any previous one set up by Microsoft. When older versions of Windows went out of support, you could try to update any hardware you had to something later. Most of the time, that hardware was sufficiently out of date that it wouldn't matter. For instance, if you tried to update something that came with XP when XP went out of support in 2014, chances are it wouldn't run it well, but you could still do it and find out why 2002-era computers were considered obsolete. Some other times didn't even have that; Windows 10 was often more efficient than Windows 7 and so upgrading those generally went just fine. Either way, you had the choice to run something patched on your old hardware without having to work too hard to bypass requirements. Windows 11 won't install by default without meeting the arbitrary requirements, meaning that there will be people who would have updated but will be unable. That wasn't present before, so I think more people will be in the situation of trying to find a way to make use of the hardware they still find useful. Linux has the ability to meet that need, so I think it is an opportunity.
And, like the previous opportunities, I expect this one will have such a tiny effect that nobody notices. Many will run unpatched. Many others will see the warnings and decide that, regrettably, they have to buy new hardware. Many of those who have heard of Linux will Google it, read some web pages that talk about things they don't understand and provide CLI instructions they don't know how to use, and give up. For those who want Linux to be more successful, there are, unfortunately, no massive problems in the way. Massive problems could at least be focused on and tackled by a concerted effort. There are, instead, lots of little stumbling blocks. One person can often guide a few users around those to a successful result, but it takes manual effort, and the people who can remove one block often don't bother because it's such a small problem they don't think it will matter.
TPM chips nearly always are a non-optional standard fixture on recently mass manufactured PCs and Laptops. Encryption related matters served by these chips are not intrinsically threatening to device operators, however the chips' benefits are not universally acknowledged.
Microsoft's Windows OS, by virtue of market dominance, gives Microsoft a strong hand in persuading manufacturers to adopt particular configurations. Motivations underlying the initial suggestion for TPM chip use, ostensibly, were virtuous.
The de facto almost compulsory fitting of these chips has enabled Microsoft to oblige their activation by people choosing to use Windows. This, as with the chips' introduction, need not raise alarm over nefarious motivation; yet, requiring an active TPM chip as a condition for software use, does appear to be heavy-handed.
From the perspective of authoritarian governments, and of powerful lobby groups, Microsoft has set the scene for a clever, yet not obviously intrusive, means to monitor/control use of devices. It is timely because some legislatures are pulling their hair out in anxiety over the wrong people (e.g. children and unwary adults) accessing 'harmful' data, or over anybody being influenced by 'content' on sites beyond jurisdiction where 'unacceptable' facts and opinions are promulgated. If it is feasible to adapt onboard TPM software and key use, this level of control would have immense benefit to financialised economies reliant upon rentier-defined pseudo-markets. That is, publishing and entertainment industries alleging harm from copyright infringement would have a powerful tool; the handful of not-Windows-users can be forgotten about because their behaviour has little overall impact. One assumes that Android devices, Apple phones, and Macs could be caught up in this.
Perhaps, in its present form, TPM lacks sufficient versatility to expand its functionality to cover the aforementioned extensions of remit. Nevertheless, the general feasibility of the approach, and the ease of rolling it out, is clearly established.
Just for some clarification, TPM has been around for a long time and many if not most devices in use will have a TPM chip. The problem is MS deciding that TPM 2.0 is the minimum acceptable, so all those TPM 1.2 devices which can't be upgraded[*] to 2.0 are now obsolete in MS's eyes.
* I remember when the TPM chip was a plug-in module, and a failed system board could be replaced and the TPM module moved over, so the encryption keys etc also moved over and a whole OS rebuild could be avoided in those situations where the TPM module was actually being used :-) We still do that on some current HP printers if the formatter board needs to be replaced.
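For anyone curious which TPM their Linux box actually exposes, recent kernels publish the major version under sysfs (the path is the standard TPM class device; the version attribute only appears on newer kernels, hence the fallback):

```shell
#!/bin/sh
# Report the major TPM version (1 or 2) from sysfs, or "none" when no
# TPM device (or an older kernel without the attribute) is visible.
tpm_major_version() {
    cat /sys/class/tpm/tpm0/tpm_version_major 2>/dev/null || echo "none"
}

echo "TPM major version: $(tpm_major_version)"
```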
The first thing I do when I install Linux off the live USB is check whether or not I can print on the household's network printer. I recently installed Mint with MATE and while the printer was found, it was a no-go. Thankfully, I was configuring the computer for a family member who doesn't have a printer. I don't have time to read through cryptic posts on support boards from folks who don't really know what the issue is to troubleshoot this problem. So I'm hoping that the people behind this website will take situations like this into consideration and not set overly optimistic expectations.
100%.
Though I'm a happy Mac user, I'm in the Raspberry Pi ecosystem as well, and would always have trouble getting my Pis to print to my wifi printer (without some command line interventions).
Pi OS - nope.
MX Linux - nope.
Pi OS Lite + XFCE - nope.
Until KDE. I made a standard Pi OS install, installed KDE Plasma desktop, bingo: instant printing.
And it's a much better desktop than all the others, UX-wise.
This is one of the reasons my printers are always PS or PCL compatible and on a wired network. It's far, far less hassle 99% of the time. I would imagine a "WiFi" printer that just appears on the network would also work, but "direct to WiFi" seems to be a bit of a black art that can vary from one device to another. MFPs are a different kettle of fish :-)
TBF I had exactly the same issue with my partner's new Mac, so printer issues are not just Linux - it would not print to an old printer connected via USB.
Put the blame where it lies: printer manufacturers being lazy & only bothering to support Windows (an old printer & they did not provide any drivers for recent macOS versions).
..Ironically I did get success by setting the printer up as a network printer* (as I could get drivers & software for Windows) - so it sits next to the Mac but not USB connected!
* a sub optimal solution, as network printer obviously makes network less secure & only partner does any printing!
Recently migrated most of my day to day computers to Kubuntu. Been in IT for years. Even though something like Ubuntu has been around for ages and made Linux far more accessible, it's still a lot for the average user. I've been compiling from source since the 90s, it's easy enough if you're used to it, but therein lies the rub.
I'll give a super quick example: I used Discover to install docker. Rebooted the machine, and it refused to boot. Turns out docker from the Ubuntu repo isn't correct, and I still needed to go to the docker docs and find the commands myself. I'm not suggesting that the average user would be setting up docker, but this is still more the norm than the exception.
Snap and flatpak have helped, but even non repo snaps require command line to install.
The problem in the 90s is the same as now: the technical expertise required is a massive barrier for entry. Most people don't want to fiddle with command lines or resort to documentation to install software, ESPECIALLY when Microsoft offers functionally one-click installs.
Linux, to be successful on the desktop, needs to break away from the Linux paradigm. Most people who use Windows don't actually care about the "evils" of Microsoft, they just want something simple that works. Linux STILL isn't there.
Don't take this as me bashing Linux. 5 out of the 7 computers I use regularly are Linux. My two Windows machines are exclusively for gaming (another sore spot for Linux, though massive gains there thanks to Steam/Proton). I'm very much pro Linux, I'm just pragmatic.
I realize I haven't put forth much here aside from blanket statements and generalizations. But that doesn't make it any less tangible.
I do love me some KDE though. Been a fan since RH 6! It really does help the transition, despite all I've said.
> The problem in the 90s is the same as now: the technical expertise required is a massive barrier for entry. Most people don't want to fiddle with command lines or resort to documentation to install software, ESPECIALLY when Microsoft offers functionally one-click installs.
Context: I am a Linux/KDE user, but I have used Windows for many many years (since Windows 3.0).
You seem to forget the vast number of problems that a Windows user must face over the years, problems that they cannot solve by themselves, problems that require technical expertise:
- Driver problems
- BSOD
- Infinite reboots
- Registry problems
- Update problems
- Installation problems (e.g. DLL hell)
- Configuration problems
- Printer problems
- Virus/Malware
- Login problems
- Filesystem problems
- Application problems
- Etc, etc
How many Windows users have the knowledge to install the OS or to solve the above problems? IMO, nearly zero. I solved/solve all of them for my family/friends/customers (on many occasions using a terminal with batch or PowerShell commands), so it's unfair to say that for Linux "the technical expertise required is a massive barrier for entry". The only Windows advantage over Linux is that it comes preinstalled.
PS: C'mon, with all respect, your anecdote about installing docker on Linux is ridiculous... docker is for advanced technical users (*), not for the average ones, they cannot install Docker Desktop on Windows either.
(*) you, like it or not, must read the documentation about the installation process in the docker website.
PS: C'mon, with all respect, your anecdote about installing docker on Linux is ridiculous... docker is for advanced technical users (*), not for the average ones, they cannot install Docker Desktop on Windows either.
Installing docker on Linux shouldn't be difficult though. Using it, maybe. But it should be as simple as telling your package manager to install it and start the daemon.
> Installing docker on Linux shouldn't be difficult though. Using it, maybe. But it should be as simple as telling your package manager to install it and start the daemon.
Well, ideally yes, but depending on your distro, the installation has it gotchas, that's why this long documentation exists: https://docs.docker.com/engine/install/
Come on. A lot of those are overblown or so generic that they could apply to anything. For example, what "filesystem problems" are you referring to? Because I don't see filesystems failing routinely on any OS, but when I have broken them, I have done it on Linux. But that doesn't work as ammunition against Linux either, because almost universally, when I have broken them, it was because I chose to monkey around with something I didn't fully understand, something the average user wouldn't do, and so it's my fault.
Many of the other categories you have there ("driver problems" and "configuration problems", for example) apply to Linux as much as they do to Windows. Others ("BSODs") apply to Windows, but far less than they did to Windows 95, and if I wanted to be anti-Linux, I'd just replace it with "kernel panics" and pretend that the average user actually sees those routinely even though most people probably haven't. Others ("infinite reboots") pretty much don't apply to anything, although you could break either thing enough to have that happen. The rest ("application problems") are so vague that they might as well say nothing. I've seen similar scare lists of why Linux is awful and you should just not try it, most of them completely unconvincing for similar reasons, such as mentioning the need to manually recompile kernels which almost no normal user would ever need to do, but if you tell someone who doesn't know that, it looks terrible.
I haven't used Kubuntu for years, but did find that while the CLI package manager was pretty solid, the GUI one left much to be desired. That is definitely a quality-of-life issue, particularly for less technical users, but not inherent to Linux. SUSE has long had a "one click install" option that will automagically add repos and integrate with its package management system, and *is* the sort of thing other vendors could do.[1] It's little different for the end user than running an installer .exe on Windows or a .pkg on Mac. Personally, I haven't been using the CLI for anything but convenience's sake for quite some time. I read here that they're getting rid of YaST, but there will be other GUI configuration tools to make life easier as well, which makes a lot of that stuff more accessible. And in many ways, the reason I did initially switch distros was that, at least however many years back, Kubuntu specifically had a few things that bugged me.[2]
I think that your point regarding "something simple that works" is much more of (and please excuse the pun) a root issue with using Linux daily. There often seems to be a lack of cohesiveness in the overall user experience; the KDE folks do a *lot* of work to attempt that, and provide an integrated DE and set of apps, but not quite 100%. In some ways, it's an "uncanny valley" situation. It's most jarring for me when I open up and run any GTK stuff. I don't know how much Windows users would notice, however. My primary machine is a Mac,[3] so I am rather accustomed to, say, the menus of nearly every application following a consistent pattern, and other small touches.
[1] Some might have something similar. I'm not much of a distro-hopper and am content to not fuss with things that I don't have to.
[2] Although that PC was my first Linux box, I was already fairly familiar with KDE, having built/run it with MacPorts on the G5 that it had replaced.
[3] Where oddly enough, I'm forced to use the command-line for more things, because the GUI can be rather limited.
Oh nice. I haven't used it in years, even using KDE almost daily. I'm glad that things are getting much nicer for users, not just non-technical people, but even overall convenience and comfort. I'm always eyeing certain developments in the hope that eventually I'll at least have the option to ditch the Cupertino cult without sacrificing too much.
… are these windows users supposed to find KDE and read this marketing if they aren’t aware of what Linux is or even what version of windows they are using?
I’m not sure this was thought through!
Perhaps the target is curious geeks looking at moving to Linux, in which case that'll be an absolutely tiny number.
Microsoft in the meantime has direct access to the eyeballs via Windows Update messages.
The vast majority of computer users are totally unaware of Linux and will remain so.
I founded and run a small group of technology companies. We do some Linux and some Windows because our clients do. I don't tell them what tools or versions or OSs to use, I meet them where they are. Many of them use relatively ancient or highly customized versions because the machines and the computers that run them were built to run those versions and only those versions - full stop.
I'm not an evangelist, I'm not the all-singing, all-dancing, do-it-my-way pitchman. I'm here to help folks do the work they need to do with the tools that work for them. They appreciate it, and they show their appreciation both financially and personally - they keep inviting us back.
Separately, somewhere along the line my father and his wife got old. They both have some cancer scars, and a high probability of more to come. They aren't in a position to relearn Windows in its latest, distractingly awful version, and learning Linux is right out, too. They just want to pay their bills, chat with friends and family, share photos of the grandkids, and keep up with their medical appointments.
Operating system providers routinely fail them and always will because unnecessary novelty and exploitive manipulation only solve problems for the exploitive manipulators.
My older rig is not W11 compatible and I am unimpressed by both W11's functionality (or lack of it in many cases) and MS's continual faffing with the UX and its data grab.
I've worked with Linux before and maybe more importantly I quite enjoy using Linux vs MS.
I picked Pop!_OS as a distro that was flagged as being particularly good with Steam and the GeForce card in the box.
Installation was easy... until it wasn't. It didn't detect the GeForce card properly or pull down drivers. I had to work with a GPT to work out how to get the right driver installed, and then that process caused a complete failure to initialise the desktop. It turned out the driver install had corrupted the user profile, and I had to drop to a recovery mode to create a new profile and delete the old one; then, hey presto, all was well.
But that process is utterly beyond most non-IT folk. Describing it to them will cause them to run out and buy a new W11 laptop and throw the old kit straight into the trash.
Unless the OS is preinstalled I don't know how to get round that, even a one button conversion is likely to induce too much fear.
What is sad is that after all that, I like Pop!_OS and think most Windows/Mac/phone users would be very comfortable with it. There is a shop/app store of stuff which handles your apt-get needs, and updates are automatic. The UX to me is very pleasant; I like it a lot.
My Mrs is a charity trustee, and they are going through some trauma after MS just withdrew the 365 licenses they were using, to force them onto Basic with web apps only. I described how I could give the charity some free computers by putting Linux with LibreOffice onto some older kit, how it was easy to use, free for life, very reliable... Nope, not interested. Got to be Windows or they'll run away.
Until Linux is a brand of sorts, it's got no chance. MS knows this, so it's quite happy to do the bait and switch on charities for an extra few coppers, knowing they'll likely cough up.
What is needed is a set of instructions in one place for people on how to migrate all the key apps (and Google services) to keep them going in the same way:
M365
OneDrive (inc auto offline)
Teams
I - things (tunes / cloud)
How to import pictures from an iPhone to a local folder, etc.
Most people these days are probably mainly browser users, with the odd app here and there.
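For the OneDrive item on that list, one building block worth knowing about is rclone (my suggestion, not something mentioned above). A OneDrive remote in its config file looks roughly like this; the remote name "onedrive" is arbitrary, and the real credentials are written by the interactive `rclone config` wizard rather than typed by hand:

```ini
# ~/.config/rclone/rclone.conf -- sketch only; "onedrive" is an arbitrary remote name
[onedrive]
type = onedrive
# token, drive_id and drive_type are filled in by the `rclone config` wizard,
# which walks you through the Microsoft sign-in; don't hand-edit those.
```

After that, something like `rclone sync onedrive: ~/OneDrive` mirrors the cloud copy into a local folder, and `rclone mount` gets you part of the way towards the auto-offline behaviour. Still nowhere near a one-page migration guide, which rather proves the point.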
All my friends make fun of me because I like KDE. They're all on LXDE, XFCE, or GNOME 3 -- and I just like it. It feels familiar and easy. My first computing was on DOS 4.1, so I'm quite happy with a command line. My second was Win 95, so I like the whole "Start" menu at the bottom left and the little system tray at the bottom right: the clock and date, little icons to say what's going on.
I've been on KDE since v2, and I've been very happy with it. Is it perfect? No. Will your grandma like it? Probably not. Maybe more than Windows 11 with its weird (hey, let's make a smartphone desktop) interface, though. But if you're looking for a replacement for Windows 95-XP, it's as close as you'll get.
I have always been a fan of KDE, and CDE. On Debian, they are what I expect in a PC. However, they have gotten a bit crazy in the past few years. Some of that seems to bring new devs/users, but isn't really my thing.
KDE is really what the folks want, IMHO, and there's a positive way to say all this without calling anyone's machine obsolete. TBH, the machines are not obsolete: "KDE+Linux will show you that your machine remains quite usable, and even faster than it was under Windows, as you concentrate the resources on what you want to do." Easy-peasy wordsmithing. Besides, they are competing with new machines, not M$.
Have you tried keeping a 32-bit netbook up to date? It's bloody hard work finding a decent 32-bit Linux these days, and KDE strongly recommends not even trying to run KDE on 32 bits any more. I have perfectly good 32-bit hardware that these Linux Nazis are trying to force me to upgrade to newer, modern kit. Bastards are all part of the hardware/software industrial complex!!!
I've switched the family's PCs over to Linux Mint. I thought my decades of using Linux in a server environment would come in handy. That experience wasn't needed at all. Download a distribution to a memory stick (or CD), boot from the media, and install it. Even my wife's desire to use Office rather than LibreOffice was not a problem: she is perfectly happy using the web-hosted apps in the Edge browser for Linux, which Microsoft has thoughtfully provided. The UI is very close to Windows 10.
What about all the software I bought over the years, and the licenses? Rhino, Adobe Creative Suite, SolidWorks, and Altium, to name a few. They don't run on Linux (some may run on Wine, albeit with a lot of difficulty and severe limitations).
There's the rub for people. Switching to other tools is not an option. I have invested decades of time and money in those tools and make a living from them. Not interested in "equivalents".
So it's Win11 then.
The problem is that it's no longer 2001.
Linux users (of which I am one) are right when they say that installing Linux is easier now than it's ever been, and it is indeed easier to install than Windows was in 2001, or 1995, or 1990, or 1980. And it is definitely easier to install than OS/2 ever was. I say that as an OS/2 developer who worked at IBM on OS/2 projects.
The point they are missing is that it doesn't matter. It's no longer 2001. "Linux is easier to install than Windows used to be" doesn't mean anything to users who have never installed an operating system in their life.
The days when PC users knew their memory map, configured their UMB and LIM settings in config.sys manually, and set the DIP switches for the LUNs on their SCSI devices are long gone. The modal PC user of today would likely not even understand the previous sentence.
Most Windows machines today come with Windows preinstalled by the vendor. Oh, they configure it, certainly. On first bootup, they're asked their location, language, and other things, most notoriously their Outlook network account, so in that sense, they're installing the OS. But they're installing the version that's already customized for their PC. It already has all of the correct drivers in the image.
Linux installs great for a lot of users, and a lot of machines. But if you have an Nvidia card, you'd better know what you're doing when you run the install.
And then there are applications. There are lots of people who absolutely can use Linux for everything they do. I've helped dozens of people who just use their PC for email, web browsing, online banking, Netflix, and things like that. There's nothing there that Linux can't handle, and in many cases better than Windows. But let's not kid ourselves that everyone is like that.
In 1992, IBM tried to sell OS/2 with the slogan that it was "A better Windows than Windows", because it had a more advanced kernel and better memory protection, and could therefore run Windows apps better than Windows. It couldn't. It did for a lot of things, but many Windows apps didn't behave correctly under OS/2, or didn't run at all. Long-winded explanations that it was the fault of the app for not following proper coding standards and the like didn't matter to end users; what mattered was that the apps didn't run.
Wine is a great technology, and it runs a lot of Windows apps. But it's not a cure-all. It won't run Adobe software. And forget getting LibreOffice to connect to Sharepoint.
I think Linux is fantastic (most of my machines are Mint or LMDE, with one switching between Kubuntu and TuxedoOS at the moment), and it does everything I need it to do. But let's not oversell it and pretend it's flawless and that it's a plug compatible replacement for Windows, because it's not.
Linux installs still have a long way to go as far as I am concerned. I recently (as in, during the last month) tried upgrading one of my laptops from Win10 to a Linux flavour (Zorin, in this case).
I will admit that the actual installation was painless. Setting up Zorin to be able to use Windows software was also fairly painless. And then came the first of my requirements, and it all fell apart.
I have a few NAS boxes around the house. They run AFP, NFS and (yes) SMB protocols. On Windows, I can go to File Explorer, find a NAS box, and then put a link to that NAS box on my desktop. I repeat: the NAS box itself, not the various volumes, filesystems and shares it contains. I click on the NAS box icon, and I get to see said contents. Great.
Not so under Linux. Even though the various file explorers I tried allow me to see the NAS boxes (and top-level listings of their contents), there was no way for me to create a desktop icon to do the same. Instead, everyone I asked kept pointing out how easy it is to automount each volume/share and put a link to those on the desktop. Great. So instead of just 5 icons on my desktop (one for each NAS), I would have a plethora of icons, one for each individual share.
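For what it's worth, the automount-each-share advice usually boils down to one /etc/fstab line per share, something like this (the host, share, and credentials-file names here are made up for illustration):

```
# /etc/fstab -- hypothetical NAS "nas1" with two SMB shares; one line
# (and one desktop link) per share, which is exactly the objection
//nas1/media   /mnt/nas1/media   cifs  credentials=/etc/nas1.cred,uid=1000,_netdev,x-systemd.automount  0  0
//nas1/backup  /mnt/nas1/backup  cifs  credentials=/etc/nas1.cred,uid=1000,_netdev,x-systemd.automount  0  0
```

Multiply that by every share on every box and the desktop fills up fast; there is no single entry for the NAS itself.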
Yeah, failure on the first of my requirements. So I restored a clone of my Win10 installation (I sure wasn't going to test Linux without making sure I could undo everything) and I am now looking at other solutions. Which is a shame, 'cause I have several home servers, automation servers, and (yes) home-built NAS boxes happily running various flavours of Linux. But the desktop still isn't there.
Linux is not ready for *my* purposes. Of course, YMMV.