The main problem with making a mobile OS...
Is that smartphones only last a few years and the hardware keeps changing, while you can take a computer from ten years ago, revive it using Linux, and then use it for almost anything.
Patent assertion entities: do not pick a fight with open source. It won't end well for you. This is the message from GNOME Foundation executive director Neil McGovern, who will speak on the subject at the Open Source Summit Europe next week. McGovern talked to The Register ahead of the event on patents, Microsoft, and more. …
Google's Project Treble helped a lot, but the basic dilemma remains that the SoC and other hardware bits require binary-blob drivers, and companies like Qualcomm end-of-life their support for an SoC around 18 months after release. That's pretty insane, and also one of the underlying reasons why it becomes really hard for anyone higher up the technology stack to promise much better than that. It's pretty much the current and next major Android release and then you're done.
Yes, but: I have phones more than five years old that still work fine, hardware-wise. I would say the problem is that there are so many different phone hardware combinations, and that even within the same range of phones (e.g. Moto G's) you cannot rely on the next iteration being even vaguely similar (as far as I can tell from info on e.g. gsmarena). So all the people who know how to compile Linux-for-phones end up switching platforms all over the shop, possibly depending on whatever they happen to buy - because buying their previous vendor's Thing+1 phone is not necessarily any help - and so it's hard to build up useful momentum.
I'd happily buy nothing but Samsung, or Moto, or any one vendor's phones if only I knew I could - after the vendor has given up on updates - get Lineage/whatever installs for a long(er) time. Last year I resurrected a J1 using a not terribly formal internet-sourced ROM... but there are no more updates. I assume this is because why should that one guy keep wrangling a customised Lineage onto an old phone they no longer use, when their current phone is presumably not (even remotely) the same at all?
Sony has been a very good Open Source citizen. Well, as much as they can within the confines of Project Treble, the hardware blobs and Google's licence requirements. You can find a list of devices, kernels and development guides here.
I'd love to keep using my Motorola Photon Q phone, released 8 years ago. Main reason I can't is that Android 4.x is the equivalent of Windows XP, so modern secure connections fail with most software, and most applications updated in the past several years won't run on it, either.
If I could install a plain Linux with X11 on it, and just compile open source apps, I'd use it until the hardware failed. Lots of people did just that with their Nokia N900s. I don't even need "real" voice support, just cellular data. Linphone (VoIP) will do nicely.
Only secondarily, I'd also like it to have more 4G band coverage today, but I'd still be happy to live with it for many more years.
I have been running Red Hat or CentOS since about 1995, currently on CentOS 6 - which is dropping out of support soon. CentOS 8 would have been the natural upgrade but it comes with Gnome 3 - which is, as far as I am concerned, unusable. Mate (AKA Gnome 2) is not available; XFCE doesn't quite make it; Fedora - I don't want to have to upgrade every year; so I am installing Debian - Mate is an option.
Agreed except that Mint seriously messes with the default mate layout to make it look and feel much more like Windows than GNOME 2... There's a fair bit of work to put it back to rights whereas Ubuntu MATE edition delivers it exactly as its developers intended and you can optionally theme it or make it more "windowsy" (or like several other platforms) using MATE Tweak.
I do wish that Mint wouldn't twat about with it.
If you're after a stable Linux distro that includes non-Gnome things, then OpenSUSE Leap is worth trying out. :)
Its lifetime is measured in multiple years, it receives timely updates, and it has several desktop environments available.
Personally, I use KDE, which CentOS 8 doesn't have (hence my switching). If you've not tried KDE out then it's worth a go as well. It's very much a full-featured "traditional" desktop environment that's very configurable if that's desired.
"Fedora - I don't want to have to upgrade every year"
Out of curiosity: why not? I mean, I'm obviously biased (I'm the QA lead for Fedora) but these days a typical Fedora desktop version upgrade barely takes longer or is any more disruptive than a regular system update. Have you had bad experiences with upgrades?
I want a stable system. I put a lot of work into getting it just as I want, then get on with other things.
I will have an instance of Fedora running as a virtual machine.
@MacroRodent I did consider Mint (which I run on a laptop) but its installer had problems with me wanting LVM over RAID-1.
Time waste: with 2 x 4TB disks I should have used GPT partitioning & EFI - but naff firmware refused to boot. I know that I got it right as it worked under qemu :-(
What does the Fedora upgrade process involve? OpenSUSE's "zypper dup" can be run while productively using the computer (even though on workstations I do quit the desktop, and on both workstations and servers I give it a reboot as soon as convenient to pick up the new kernel). That's full version upgrades, patches are applied online transparently to me.
You can do that on Fedora, but it's not a very good idea. (On SUSE *or* Fedora). You should at least run the command from a screen or tmux instance running *outside* of your desktop environment, so the chances of the environment the upgrade is running in being crashed by the upgrade process are lower.
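To sketch what that looks like in practice (the release number and session name here are just examples, not prescriptions):

```shell
# Switch to a text console first (e.g. Ctrl+Alt+F3), then run the
# upgrade inside tmux so it survives if the session drops.
tmux new -s upgrade

# Inside the tmux session:
sudo dnf upgrade --refresh
sudo dnf system-upgrade download --releasever=33
sudo dnf system-upgrade reboot

# If you get disconnected before the reboot step, reattach with:
tmux attach -t upgrade
```

The point is simply that the process doing the upgrade should not be a child of the desktop session it might disturb.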
Still, it seems the concern people have is not really about the upgrade process itself but about the impact of any changes introduced by the upgrade. Which is a perfectly reasonable concern. If you prefer an OS with a slower pace of change, that's a good reason not to use Fedora (or one of the faster-moving SUSE editions).
Like alain, I don't want to spend significant amounts of time upgrading a computer.
So, the canonical anti-pattern for me is Windows 10: twice a year it demands your attention, it requires shepherding through multiple reboots, and then makes each user sit through a faux-jolly passive-aggressive "Hi, sit there and wait, because it's not your computer: be grateful we're doing this for you".
And Ubuntu's LTS + HWE (but skipping every other one) is what currently works for me: we're on 18.04.5, and I don't have to bother myself until 2022.
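For anyone curious, opting into the HWE (hardware enablement) stack on an 18.04 install is a single documented command (package name follows Ubuntu's `-hwe-<release>` convention; adjust for your release):

```shell
# Install the hardware-enablement kernel and graphics stack on 18.04:
sudo apt install --install-recommends linux-generic-hwe-18.04

# After a reboot, confirm which kernel you're actually running:
uname -r
```

That's what gets you the newer kernels on point releases like 18.04.5 without leaving the LTS.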
Gnome 3 is indeed shite. What? You don't think so?
* You think those stupid slide bars on all the windows are GOOD? When the output is 30,000 lines long?
* What the fuck is that stupid thing across the top? And how do I get rid of it?
* And that thing on the side? It's like you looked at Windows and Apple and said, 'we must make it look like that', forgetting that 'like that' is actually shite.
* And then the hot corner. FUCK OFF.
* And auto-maximising is just god awful.
gedit -- how do I use the keyboard to do things? I used to save with alt-f-s, but now I am forced to use the fucking mouse, when I think the mouse is probably the worst invention in the world. I work from terminal sessions and think bash rocks, and you seem to be trying to make that world as distant as possible.
and you've changed the way the file manager behaves. Why? It was good in Gnome 2, why change it?
Then there is...
Oh, never mind. I've found a way to turn off the silly sounds you seem to get everywhere.
I use it because I'm forced. But that doesn't stop it being utter pants.
BRING BACK GNOME 2!
But all the other Alt- mnemonics don't work any more in those applications that have lost the menu bar.
I like the Firefox compromise (the menu bar is still there but hidden by default. It becomes visible when you press the Alt key).
Icon: for those involved in the patent case defence
I disable scrollbars entirely and use a mouse wheel. Can't relate to your scrollbar woes.
That's the top bar. It can be disabled or hidden easily.
Automaximising? I've never experienced that...Windows open up as windows on every Gnome3 install I've tried.
I'm a massive shell fan too. I'm a ZSH guy. Gnome3 in no way makes it difficult to live the shell life.
ALT-F-S isn't a proper keyboard shortcut, it's a menu navigation shortcut.
CTRL-S is, was and always will be the default keyboard shortcut for saving.
Living in a shell and using menu shortcuts to navigate an application is a weird contradiction. That said, you can re-enable those old-school menus... I personally find them to be a massive waste of space.
Actually, CTRL-S is the XOFF terminal control character for VT100 terminals and the like.
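Which is why, in a terminal, Ctrl-S freezes output rather than saving anything. If you want terminal applications to receive Ctrl-S as a normal keystroke, you can turn that flow control off:

```shell
# Disable XON/XOFF flow control for the current terminal, so Ctrl-S
# reaches the application instead of pausing output:
stty -ixon

# (If you've already hit Ctrl-S and the terminal appears frozen,
# Ctrl-Q resumes it.)
```

Adding `stty -ixon` to your shell startup file makes it stick across sessions.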
Yes, I remember when such things were new and exciting with instant (nearly) responses to your inputs.
(Usually syntax errors.)
The alternative was punch cards and batch processing.
(And they almost invariably included syntax errors too. It just took a lot longer to find out how badly you had stuffed up.)
What's the problem with XFce? I customize the layout so it looks like Win95 and put it in front of some of the least computer literate users anywhere, and they manage to pick up on it with a bare minimum of training. You can continue to run any GNOME applications you're accustomed to.
I've been running Ubuntu MATE (after starting with Mint MATE) for several years, and it has been trouble-free. You can opt for the LTS (Long Term Support) releases, which only come out every 2 years, and thus provide a more stable platform. In reality, I've found that MATE provides its own stability, so don't find the 6-month releases that disruptive. But I've adopted LTS on all my systems except for one VM I use expressly to see what's currently going on, simply to cut down on the update volume.
I'm curious. How is Gnome3 unusable?
I've been using Linux for over 30 years and have hopped to more distros than I can remember and in that time I can't remember a desktop experience better than Gnome3 (as it currently stands).
Sure it was weird when it first came out, but right now it is easily the smoothest, cleanest DE out there.
Yeah you need to tweak it a bit, but that's no different to any DE on Linux...nobody uses a stock config as far as I'm aware.
Gnome3 stays out of the way, rarely breaks, and can be configured to be completely out of the way.
I personally prefer a minimalist desktop. I'm a heavy keyboard shortcut user and between Gnome2 and Gnome3 my keyboard shortcuts remained largely the same.
I've experimented with pretty much every DE out there including some of the mind bending tiling desktops and nothing comes close to the fluidity of Gnome3.
It is glassy smooth these days.
Literally my only gripe is the weird refresh bug where, if you have a 60Hz screen paired with something that has a higher refresh rate, both are forced to run at 60Hz. From what I can tell, though, that bug is pretty much universal across the board... I've not found a DE yet that doesn't do the same thing; I suspect X11 is likely the culprit. That or NVIDIA.
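If anyone wants to check whether they're hitting the same thing under X11, `xrandr` will show what rates each output is actually running at (output names and modes below are examples; yours will differ):

```shell
# List outputs and their modes; the '*' marks the mode in use,
# so you can see if both displays got pinned to 60Hz:
xrandr --query

# Explicitly request a higher rate on the fast display:
xrandr --output DP-1 --mode 2560x1440 --rate 144
```

Whether the compositor honours mixed rates after that is another matter, but at least it tells you what X thinks is going on.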
So yeah, no show stopping bugs, extremely easy to use, fluid, smooth and rock solid stable.
I'm using Gnome3 on top of Arch (btw).
As for your OS choice, if you don't like upgrading annually, why not try Arch? It's a rolling release that never goes out of support, the wiki is excellent and it's arguably the most flexible distro out there.
Every time I use CentOS, it feels so old and clunky compared to Arch.
Debian rolling is fine, but I always find I'm fighting it if I want newer versions of tools on it.
I find CentOS and Debian are way better for servers...where things shouldn't change too often.
That is what I tend to do. It's why I have a homelab (which is surprisingly cheap to do these days)...all my stability needs are met by a server with 9TB of disk space and an open source hypervisor. The stability of my desktop is largely inconsequential. My configs are backed up to my local GitLab server and I keep a local cache of frequently used packages...
But on the desktop... a Linux rebuild takes maybe 30 minutes if you're set up right, so it's hardly a chore these days if you need to refresh things a bit. You can do it while you have a shower in the morning and be ready to go before you're dry.
"Sure it was weird when it first came out, but right now it is easily the smoothest, cleanest DE out there."
Cleanest? Oh, removing *everything* from the screen and leaving only some Windows 8 style coloured blocks (which just sit there; they aren't applications or anything *leading* anywhere, just pretty pictures) - that is *a good idea* to you?
No windows, no menus, no nothing? The prime function of a window manager is to *help the user*, not to prevent everything he or she might do.
The *only* desktop that scales worth a poop is KDE/Plasma. I use a 9" 2560x1600 laptop display scaled 1.4-1.6 undocked, switching to 1.0 is annoying docked, but gnome is 1X or 2X. Useless. I'm actually ok with plasma. I'm not a luddite, and I keep trying gnome thinking I'll get the hang of it... Then I move back to mate, or kde on the HiDPI stuff. KDE scaling just kinda sold me. Multi display support is also tolerable: it behaves and remembers configs for the home dock display (2560x1600, lt display off), and work dock display, (1920x1080, lt display on). YMMV, but mate/cinnamon/gnome/lxde/xfce all fell flat & useless for scaling.
Quote: "....Fedora - I don't want to have to upgrade every year...."
$ sudo dnf upgrade --refresh
$ sudo dnf system-upgrade download --refresh --releasever=33
$ sudo dnf system-upgrade reboot
$ sudo shutdown -r 0
So.......four commands and maybe an hour of your time.....what's the problem if this happens once "every year"? In my experience (F24 up to F32 so far), this is painless and mostly without any aggravation at all. Much, much easier than a bare metal install and data restore for each fedora release.
Speaking for myself, it's not so much the ease or otherwise of upgrading. It's that I have a stable system I'm using to do useful work and then a major system upgrade breaks something - trivial things like a change of a version of a component (eg mysql) breaking my config, a program losing a feature I've been relying on, a piece of hardware no longer working. I've experienced all of these things, on Linux, macOS, and Windows. I used to do all the updates all the time but was spending so much time fixing the things the updates broke that now once I've got a something stable and I'm relying on it for doing things it gets nothing but security updates. (This is why I don't use Windows and my mac is still on High Sierra).
"...it's not so much the ease or otherwise of upgrading. It's that I have a stable system I'm using to do useful work and then a major system upgrade breaks something...."
In all the iterations of Fedora I've used (as one of several distros on my various desktops), Fedora either makes everything work or nothing works when you update it. I've never had an issue where I got to the login screen and then something after that point didn't function properly; it's more of a thermonuclear situation where all of a sudden it fails to boot and I need to fiddle for a bit. It keeps a copy of the old version anyway, so it's never a real hassle.
As far as my main system (Pop!_OS/Ubuntu) goes, I find that as long as you stay on the LTS releases that basically never happens.
My most recent upgrade from Ubuntu 18.04 LTS to 20.04 LTS on a standard ThinkPad with a single SSD resulted in an unbootable system because it failed to update Grub properly.
That was extremely disappointing. Not a big problem for me, just an inconvenience but I know loads of people now who are less technical or even not technical at all, who would be totally screwed by this.
It's a real shame that a GUI initiated upgrade resulted in a borked machine. More testing needed..
I had a similar issue. After several evenings of messing about in busybox trying and failing to fix things: Boot-repair, with a redo of the EFI key sorted my troubles.
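For anyone stuck in the same spot, the manual equivalent of what Boot-repair does is roughly this (from a live USB; the device names are examples - check yours with `lsblk` first):

```shell
# Mount the installed system (example partitions: sda2 = root,
# sda1 = EFI system partition):
sudo mount /dev/sda2 /mnt
sudo mount /dev/sda1 /mnt/boot/efi

# Bind-mount the virtual filesystems the bootloader tools expect:
for d in dev proc sys; do sudo mount --bind /$d /mnt/$d; done

# Reinstall GRUB and regenerate its config from inside the system:
sudo chroot /mnt grub-install /dev/sda
sudo chroot /mnt update-grub
```

Not fun to have to know, but it beats several evenings in busybox.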
However, this incident caught me out so much because I was used to easy and painless upgrades.
I've done a dozen ubuntu upgrades, and two have not been boring.
It's not the effort or even lack thereof. It's the niggling worry of what will be broken when it comes up again.
In my case it was the upgraded release refusing to recognise the camera when I plugged it in. It might have been fine with a thousand other camera models, but I only had one and it didn't work. Subsequently I read something that suggested it was just a typo in a config file. By that time I was long gone, put off, ultimately, not just by the minor typo that should never have been there (the file was working, don't fix it) but also by the process that allowed it to happen undetected.
For all I know Fedora may have a far more rigorous release process now, but I've no great reason to go there (do they even have a systemd-free version?) so I'm never going to find out.
"For all I know Fedora may have a far more rigorous release process now"
I mean, we've had a fairly rigorous release process for years, but I'm not gonna lie, we do not have a giant warehouse full of every webcam model known to man which we rigorously test on every release, no. I cannot undertake to guarantee your random webcam will work on every upgrade. Sorry :)
There you have it then.
An aggressive release schedule such as Fedora's would not allow for testing even if there were a big warehouse for all hardware.
There isn't even enough time to test for all software combinations that people have put into their machines.
Are you still wondering why such a release cadence makes people nervous?
As mentioned by another fellow, Fedora is in a perpetual beta. There are other, more stable distributions out there.
Well, I mean, it's kind of a spectrum. Or a set of overlapping spectrums.
It's true we couldn't test a room full of hardware on our current release schedule. But then we also probably couldn't test it if we released every two years, either. There's a *lot* of hardware. We have like a dozen paid QA folks and maybe the equivalent in volunteer person-hours. We'd probably need till the heat death of the universe to test all combinations of hardware.
Ditto for "software combinations". There are thousands of SRPMs that make up Fedora. Just the possible combinations of those packages are effectively infinite - never mind the confounding factors of configuration and what you actually do with them.
If you spend much time thinking about this, like QA people do, the conclusion you tend to come to is that it's a miracle that anything works as well as it does. And there's kind of less difference than you might think between a six month release cycle and a two year one, in terms of what it's possible to test as a percentage of all the things that possibly *could* be tested: the answer is "infinitesimally small" either way.
Realistically, the only way you can really 'comprehensively' test software is to have that piece of software be very small and be required to do only a small and very clearly defined set of operations. This is how you do things when you *really really really need* the software to be reliable. Like when it's running a plane or a dam or a spacecraft, for instance. But you can't do this for general purpose operating systems, because they're...well...general purpose.
So yeah, stuff breaks. Fedora sure does. And so does RHEL, and macOS, and Windows, and Ubuntu, and SUSE. They *all* break. They're all in a perpetual state of some stuff being broken. It's not practically possible to achieve any other state, really.
I don't honestly think any given Fedora release would be substantially more "stable", in the sense you mean, if we went to a one year or two year release cycle. We don't run out of time to test the stuff we test, these days, and we haven't for some time. What would make Fedora more "stable" would be to slow down the pace of changing bits of it and adopting new things, but then it wouldn't be Fedora any more. And indeed if that's not what you want, you shouldn't run it; there are, as you note, other choices.
"So.......four commands and maybe an hour of your time.....what's the problem if this happens once "every year""
Have you ever actually tried to do that?
Despite claiming so, I mean. Even Red Hat says Fedora is and will be a beta version of RedHat and every upgrade/update *will break things*.
New updates are a weekly occasion, not yearly. Fedora is not to be used in a production system.
"Even Red Hat says Fedora is and will be a beta version of RedHat"
We do not say that, and it is not that.
"and every upgrade/update *will break things*."
We don't say that either. Of course *sometimes* *something* will break on an update or upgrade. This is true of all general purpose operating systems. We do not say all updates or upgrades "will break things" because it is not true, they do not.
The problem though is that “to fully defend a patent infringement case it usually costs about $10,000,000,” he said.
It's the reason why the only winners in these cases are the lawyers. The patent system is just broken in the US, and unless it's fixed it's just going to mean that independent developers will be driven out of business for fear of patent trolls coming after them, and they will not have the funds to defend even when there is legitimate prior art.
A very quick fix would be to make the legal costs for an invalidated patent recoverable from the USPTO. With a good case the defendant would know their costs would be covered, the plaintiffs would be aware that the cases would be fully defended, there would be far fewer patents granted and the USPTO would have a great incentive to go through the back catalogues, checking each one even if it meant handing back fees. In the meantime the USPTO would probably keep popping up with amicus curiae briefs to stop the expense getting out of hand.
I was going to add "get damages from the patent troll". A great idea, but it would mean that a large, well funded corporate with expensive lawyers would use the threat of damages as a way of blasting small patent holders into giving them free use of what is patented. It is hard enough as it is for the small guy to get the corporates to play by the rules without giving them another cannon.
It is not just the patent system. Pretty much any civil case is about whether you can afford the legal fees to fight it, and not about who is right and who is wrong. I have read of so many acrimonious divorce cases where a fortune of millions was almost entirely dissipated in legal fees, so neither party actually benefited. I have no idea what you would spend ten million dollars on, for what is essentially just paperwork. I could do a hell of a lot of engineering with a ten million dollar budget.
“I'm certainly very pleased to see that Microsoft are moving more towards encouraging open source development... “
You don't say.
I'm certainly very alarmed at your thinking that this is what is really happening.
Very much so, to the extent of thinking that maybe you've been living in a bubble for the last 20+ years.
From a post here at ElReg by AC (20180614) which sums it up nicely:
"Linux, Linux Foundation, R, Git, Atom/Electron, MariaDB, Python, Mozilla, RedHat, Debian, Gnome, KDE, ... are all being "disrupted" by M$. Their trojan horses infiltrate all important open source free software foundations and companies (EEE Nokia style). (M$ is a sponsor to all of these foundations! And Linux Foundation congratulated M$ for buying Github! WTF) ..."
I find it tragic that you do not or maybe just choose not to see it and then spew this utter BS.
I'm no fan of MS, but you sound like a Japanese soldier still fighting WW2 in 1960. Linux, Git, Python, Debian - disrupted and infiltrated by MS simply because they've taken their money?
I'm inclined to simply say grow up, but instead I'll point out that IBM, Oracle etc. had business models that competed with Linux and open-source. Now they work with them, and - as a relative UNIX newbie of only 25 years - I think it's been a boon for Linux and open-source projects as a whole. I don't see why MS should be any different - they're a business, and - frankly - the business case for open-source at some levels is pretty undeniable.
... sound like a Japanese soldier still fighting WW2 in 1960 ...
MS declared war on Linux many years ago.
Steve Ballmer made it quite clear in an "interview" with the Chicago Sun-Times, 2001-06-01.
"Linux is a cancer that attaches itself in an intellectual property sense to everything it touches ..."
That war is still going on.
Their style is the same as it has always been: EEE / Embrace, Extend and Extinguish.
The approach has changed: it is friendlier, so for quite a few it is harder to see.
As they go around throwing money about with a smile, everyone thinks MS Reborn, having seen the light, is now their long lost bro.
Bullshit for the great unwashed and gullible IT journalists.
As to growing up, I've already grown up.
At least enough to clearly understand just what is going on.
Instead of living in a fantasy.
"... point out that IBM, Oracle... "
Upvoted you for the first part, but I think the second part is essentially wrong.
If IBM is a Crocodile and Oracle is a Great White Shark, which living dinosaur is Microsoft and what is their feeding ground? I think they're going extinct. Scrambling into the grounds of the other apex predators, other predators that in at least some ways have understood how to cohabitate. Microsoft just stood alone too long on their own little island, much like the Komodo dragon. I just won't be surprised at all if, rightfully and sourly, there isn't some destructive plan to survive. I'm not sure what happened to the T-Rex, but I'm sure they weren't quietly reduced down to iguanas.
On a side note: As a MS "hater" myself, I'm not too sure if the overly hateful of Microsoft have thought things through, I can't help to wonder about what massive disarray would happen if they magically disappeared... just vanished. Sure, the bigger they are the harder they fall, but the crater they leave behind still has to be cleaned up. But, don't get me wrong, fuck Microsoft :-P
" I'm not too sure if the overly hateful of Microsoft have thought things through"
The older people like me have been following MS since it was established in 1975, and the judgement is very clear: the whole company is a professional criminal, on a *very* large scale.
No more, no less.
A role Google also has adopted lately, no doubt because MS was showing the way.
"Linux, Git, Python, Debian - disrupted and infiltrated by MS simply because they've taken their money?"
Yes. That's literally the *definition of infiltration*: Money buys power and power is used for disruption.
I see you don't even know the definitions of the words you are using.
"I'm inclined to simply say grow up"
No, you grow up. You are talking to people with 40 years of personal experience of dealing with MS, and you are naive enough to believe they've changed. That's a dangerous level of stupidity, sorry.
"I don't see why MS should be any different "
I see you have no idea how MS operates either, since the beginning: "Embrace, extend, extinguish."
This is the Embrace-phase. Literally.
If you don't see it, it's because you have zero experience of how MS operates.
I remember the excitement of installing DR DOS 6 from Digital Research - a brilliant OS. M$ were evil and still are, and for those who haven't the background to what M$ got up to in those early days, I reckon the following link provides some enlightenment.
"Androgynous Cupboard" spouted:
I'm familiar with the MS playbook, but they're no longer the all-conquering behemoth they were - there's no M in FAANG.
* You can't be conquering after you've already conquered everything.
* The smartest thing a huge predator can do is to keep a low profile; then the more gullible of the prey won't realise what a threat one is.
HTH, kiddo. Maybe still try a few more rodeos, mmkay?
Do I even have to write anything more? Trigger warning: Ranty text ahead
When Ubuntu went on their Unity walkabout, GNOME Flashback provided me with a stable and happy desktop environment while GNOME fought their trench warfare to stabilise GNOME/GTK 3. With Ubuntu 18.04 LTS and a new laptop with a HiDPI display, Flashback (based on GTK2) became unusable. Ubuntu had ditched Unity for GNOME Shell and I thought "Ahh, they must've sorted out the issues".
Well, yeah, maybe, no. It's so spartan by default, you have to resort to extensions to recover some of the old-style functionality. Sadly, it's become a crap shoot to find a minimal set of (quality) extensions which a) do what you want, b) still work on the version of GNOME I'm actually using and, c) which don't cause GNOME Shell to randomly lock up for minutes on end or just restart itself (sometimes multiple times daily). The desktop has become a Greasemonkey-esque playground (sandpit? swamp?).
I appreciate the GNOME project is much more than just Shell, but boy, it casts a pretty long shadow.
"... you have to resort to extensions to recover... "
At least you can... I'm in the KDE camp, the camp that looked better 10 years ago. As far as displays go (especially multiple displays), I think everyone is F'd there (Linux == Mac == Windows == F'd... I live through that every week... week by week).
Mac OS is possibly the OS with the most solid implementation: different dpis per screen in multiscreen mode? No issue.
Only one screen, hidpi? No issue. Detach low dpi screen and use hidpi retina screen? No issue.
All smooth and painless.
Now, Linux (x11) or windows? Good luck: apps that become blurry and need restarting, inconsistencies in the gui, artifacts of all kinds. I could go on.
It's certainly the most solid I've ever used. We have a few hundred of the things at work and they've been rock solid. That said, when people took them all home at the beginning of lockdown and we suddenly got the opportunity to test them with a massive variety of random display hardware, not so much. I now have at least a dozen cases of Macs simply refusing to acknowledge the existence of various external displays, or in some cases setting obviously incorrect resolutions / refresh rates and then refusing to even display the controls you would use to change them.
Don't get me wrong, it's still by _far_ the best OS I've used for handling multi-headed setups, but it's not as flawless as I first thought when I'd only ever used it in a controlled environment.
The fact that a lot of supercomputers run Linux is certainly indicative of the usefulness of the kernel, but really it's not that big a deal. All supercomputers are based on one CPU architecture or other that Linux supports, but it's not inconceivable that those same architectures could run Windows (in the unlikely scenario that MS thought it worthwhile spending the effort to do a boot loader / cut-back OS stack to fit). For an X64 based supercomputer, the nodes probably can boot Windows, and the ARM ones aren't far away from doing that too...
Linux gets used because it's free, well understood, takes a performance-first view of things like memory allocation and scheduling, and is pretty efficient with modern multi-core CPUs. Using, say, VxWorks instead would probably result in a slightly faster computer (faster context-switch times, thinner driver stack) but would cost a fortune.
The thing that makes these machines "super" is really the fabric used to join up a load of nodes, the mix of vector / GPU units on or adjacent to the CPU cores, and the drivers and libraries that the manufacturers create to allow developers to access those things. Some of this is pretty exotic, and as far as I know not a lot of it is maintained in the kernel mainstream.
For example, I'm fairly sure that the driver modules for Fujitsu's Tofu are not subject to Linus's beady eye. Though it's just possible he might relish having a cabinet or two of the requisite hardware to allow him to adopt that code and test it out. I know I would.
The AC has no idea how much time and money Microsoft has spent trying to compete on supercomputers too, but to no avail.
The Windows kernel does not scale well for super computing, and never will, and MS knows it very well too.
A good example was when the NY exchange had to switch to Linux (as has every major exchange in the world) because MS was unable to deliver the speed and capacity they needed.
Most people won't know a thing about matters like that because there is no PR or marketing department behind Linux.
In his book, The Road Ahead, Bill Gates wrote about how he expected Windows to end up on every device, like televisions and whatnot. But the embedded market has all gone to Linux; to find that out, however, you have to dig for the information. Sometimes you find a few lines about it in the printed manual, on modems and routers and such.
But I doubt there is a single device in this world with the word Linux on it. It doesn't really matter, but if you are in computing I think you should be aware of it.
"The road ahead, Bill Gates wrote about how he expected Windows to end up on every device, like televisions and what not. "
Yup, but he is and was an idiot. *Every one* of the attempts to shrink Windows to embedded size has failed and will fail; it has way too many cross-dependencies from everything to everything else to leave anything out.
Embedding the browser into the OS? Brilliant: one browser hole grants root access to everything.
Point-of-sale machines are *still* using XP (a special, very minimized version to boot) as anything newer than that won't run on them at all.
You know I had the same experience as Gnome users when KDE went to version 4.
I had been happily using KDE 3 for years and all of a sudden everything went up in the air and it confused the hell out of me. I mean it was a bloody mess much like the change from Gnome 2 to Gnome 3. I soon bailed out and after trying a few other desktop environments settled on MATE, which ironically is the lineal descendant of Gnome 2. I did quite like Trinity but when I first tried it PCLinuxOS was giving it the cold shoulder and it was a pig to install. In the end MATE did what I wanted so I have stuck with it.
It does make me wonder why the devs of the likes of KDE and Gnome think that pissing off their users is a good idea. After all this being Linux we do have somewhere else to go if things are not to our taste.
" devs of the likes of KDE and Gnome think that pissing off their users is a good idea"
They don't give a f**k, literally. They do code because it's fun and f**k the lusers.
Case example: Poettering. The guy who *every time* claims 'users are idiots' when they show how stupid his ideas are and how full of fatal bugs his code is. His hunger for power over the system is infinite, and because of that systemd is spreading like a cancer, taking over every essential service: 1.3 million lines of tangled spaghetti code no one can maintain. P. doesn't care; it's *his way to control users* in everything they want to do. A literal penis-length competition he has created for himself.
"Who cares, I do care only about me!" is the motto. While actually competent people use KISS.
... code full of fatal bugs. Also hunger for power over system is infinite and because of that systemd is spreading like a cancer ...
But systemd is not just bad software with bugs: it is much worse than that.
It is a developer-enabled virus implanted within a Linux installation.
Just like the registry in all MS OSes, from W95 onwards.
It is nothing but MS's Embrace, Extend and Extinguish and if you don't rip it out, it will end up rotting the whole Linux ecosystem from the inside.
Fortunately we have Devuan, for now.
Migrating to Wayland - why are they doing this again? I bet it's the same justification as for the "new" interface of Gnome 3 [which broke the panels, in my opinion, on several levels] and so on. And the major changes to the gedit user interface (making it harder to work with)? Just because it is *NEW*?
Here are some GOOD reasons to make SURE they do NOT abandon X11.
* Remote execution of programs, via DISPLAY environment variable, where the UI is presented on a remote desktop across a network.
* The ability to run an application locally, using a similar method, with a different logged-in user context (works playing video, too). I can do this to REALLY sandbox a web browser, including cache, settings, script/cookie enabling, and so on.
* X11 is by its very nature client/server, and therefore leverages multi-core (to some extent) already.
* glx and driver-specific OpenGL already exist for enhanced performance
* change for the sake of change - ALWAYS a bad idea - Arthur C. Clarke's "Superiority" comes to mind
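The remote-execution and sandboxing points above can be sketched in a few shell commands. This is a minimal illustration, not a hardened setup: the hostname `buildbox` and user `sandbox` are made-up placeholders.

```shell
# Remote execution: run an app on another machine with the UI drawn locally.
# The easy modern route is ssh with X11 forwarding:
#   ssh -X user@buildbox xterm
# Under the hood, sshd just sets DISPLAY in the remote session to a
# forwarded display, typically something like:
remote_display="localhost:10.0"
export DISPLAY="$remote_display"
echo "$DISPLAY"

# Different-user sandboxing: run a browser as a throwaway local user while
# drawing on your own X server (shown as comments, since they need root
# and a running X session):
#   xhost +SI:localuser:sandbox        # allow that user to talk to your X server
#   sudo -u sandbox DISPLAY=:0 firefox # browser runs with its own cache/settings
```

The point of the `sudo -u` trick is that the browser's profile, cache and cookies all live under the sandbox user's home directory, fully separated from your own account.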
Having "Wayland Too" is fine, as compatibility is a GOOD thing. They just need to make sure that X11 is _NOT_ abandoned!
Otherwise, GTK3-based programs, which seem to run 'ok' on my Mate desktop systems, will NOT be able to run outside of a Wayland-based system.
(and having GTK3 more compatible with Mate settings, like getting rid of some of the scrollbar weirdness, would be a nice feature, too)
(and make those STUPID phone-friendly-menus and icons OPTIONAL rather than MANDATORY)
yes, the Wayland folks keep trotting out the same excuse that "well, these features *could* be enabled, we're thinking of looking into them sometime in the future, they don't work anymore, and just **why** are you asking for them? NO ONE uses THOSE anymore... Oh, you do? Well you shouldn't".
Just love it when someone can't be arsed to make a replacement even close to feature-compatible, then insult you when you point that out.
You might like to add:
* Broken accessibility aids by pushing the responsibility for things like that back at the applications instead of being enabled by the display server
The remote execution made possible by X's client/server model is fantastic, I think, and much overlooked. The "modern" way of doing this seems to be horrid things like VNC and RDP, which are extremely dumb, low quality and heavy on traffic. Rendering local to the user, which is what X does, gives a nicer result.
Overall I can't help but think that a sensible overhaul of X, even to the point of breaking some things and getting rid of some of the older stuff (like the ancient raster fonts) would have been the better bet. Wayland is breaking an existing and successful model in the pursuit of something that won't arrive. Being on a 3D performance parity with, say, MS's DirectX isn't going to suddenly result in lots of games titles showing up on Linux.
"Rendering local to the user, which is what X does, is a nicer result."
Not to mention that you can run a remote GUI program on a local machine without the overhead of an entire desktop GUI running on the remote machine, especially when you want every last bit of CPU grunt dedicated to your task. Why would I want a whole remote desktop in a local window anyway? I know some people do that, but I don't see a use case for me. Same as the Wayland folk can't seem to understand or take into account my usage methods.
I don't really care one way or the other about "a whole remote desktop in a local window", but I *do* want to remotely access the entire remote console GUI workspace that has many browser windows and shell windows each with many tabs open on tasks in progress, times several desktops for different task groupings. Just logging in to all the servers I touch on a regular basis would occupy 30-45 minutes of my day, never mind the time lost to manually reconstructing all of the rest of the workspace state.
Shonky as it may be, Windows Remote Desktop utterly destroys anything I've tried on Linux for accessing a remote workspace. I can do *almost* anything over Remote Desktop, on a grotty 2M DSL line, that I can do sitting in front of the console of that remote system. I can barely read email, trying to use VNC (or any similar *nix tool I've tried) to (try to) work in the GUI workspace of a machine that is literally right next to the one I'm "in front of", both connected to the same gigabit switch. The situation is marginally better if I connect to a headless X11 workspace/session, but then I'm stuck with the leftover grottiness ALL THE TIME, including when I'm sitting in front of the machine the headless session is running on.
Wayland is more secure than X11, and it has big performance benefits too. On my i5 laptop, I am seeing 50% better glmark2 scores under Wayland than under X11. Maybe that isn't important to you, but it is a huge benefit to people who game, do graphic design, etc. There was a huge amount of cruft in X11 that was slowing stuff down, but everyone was afraid to touch it for fear it would break stuff. This wasn't change for the sake of change. The x.org programmers felt that starting over was the only way to fix fundamental problems in X11. For years people have been complaining that Linux can't match the AV capabilities of Windows, and it looks to me like Wayland is what we need to finally make Linux competitive.
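For what it's worth, a comparison like this is easy to reproduce: run `glmark2` in an X11 session and `glmark2-wayland` in a Wayland session, then compare the final scores. The numbers below are hypothetical placeholders, not real measurements; the arithmetic just shows how a "50% better" figure is derived:

```shell
# Hypothetical scores -- substitute the "glmark2 Score:" lines from your own runs:
#   X11 session:      glmark2
#   Wayland session:  glmark2-wayland
x11_score=1200
wayland_score=1800

# Percentage improvement of Wayland over X11 (integer arithmetic)
improvement=$(( (wayland_score - x11_score) * 100 / x11_score ))
echo "Wayland is ${improvement}% faster"
```

Note that a synthetic GPU benchmark mostly measures compositor and driver overhead; desktop responsiveness is a separate question.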
Projects like Linux Terminal Server Project have figured out how to deal with Wayland, so it isn't impossible.
Yes, maybe X11 should have been maintained for GTK 2 and Mate, but I think people who want that should step up and volunteer to do it, rather than just complaining about it. Wayland has been in the works for 9 years, and nobody has volunteered to take over X11.
For years people have been complaining that Linux can't match the AV capabilities of Windows, and it looks to me like Wayland is what we need to finally make Linux competitive.
Only a very very few people have been complaining. Wayland isn’t making it competitive because it’s not working, and has broken a lot of stuff. And whatever it is that you mean by making Linux competitive, it’s not going to attract the major games, graphics tools, etc. simply because there’s so many other problems in Linux for the non-Unix admin desktop user. It’s probably better to improve X, trim out some of its dross, improve its security, than to throw it in the bin and make a mess of its replacement.
There will be no GNOME 4.0 because, “if we ever did release 4, then people would see it as a huge change in [that] everything's going to be broken again, and that's not really what we’ve got,” he said.
Broken *again*???? You mean Gnome3 ever *STOPPED* being broken?
Maybe the Gnome folks didn't like my mockup of what a GNOME4 desktop would look like... https://flic.kr/p/cJ2zB1
"Our priorities are around encouraging end user control over their own computer, end user freedom"
Oh no! Folks: Be prepared for more GNOME-inflicted horror. Decisions made by GNOME which are not at all configurable are now impacting XFCE and MATE users (in my case starting in Ubuntu 20.04 but the changes are much older)
In their blinkered pursuit of Wayland, and an almost religious cult belief that long-established UI principles are to be eradicated, they've not only rewritten most of their applications to PUT A FUCKING HAMBURGER MENU and other buttons INSIDE the title bar of the window, but they've been on a crusade and persuaded other Linux application developers to follow suit.
It was apparently all done because Wayland was originally only going to support 'Client Side Decorations' i.e. no title bar, menus or window border drawn by the window manager (I'm informed that this has changed and now classic windowing can be done in Wayland)
It's ugly. It sucks, and if you use a GTK3 application it's going to affect you too. If not now, then very soon.
Aside from weird things they've done to scroll bars, and removing the classic menu bar (it's slated to be physically removed from GTK4), they've switched to using 'popovers' for all menus, which look and behave more like a phone menu: Oh you use a mouse and you want to explore the next level of a menu item? You're gonna have to click on it to see it. Want to go back to the previous level? Click the back arrow my friend...
Apparently, quickly hovering is completely out of fashion!
I'd post a bit more about this but it's 11pm and I'm currently trying to decorate a ukulele birthday cake for my youngest offspring... For a long read about this problem which includes a lot more detail, start with this: https://ubuntu-mate.community/t/horrible-gtk3-gnome-ui-design-is-leaking-into-ubuntu-mate-applications-in-20-04/22028
But they don't prevent you from changing it to better suit your needs.
Stop being so angry towards those people. They worked hard to offer you a desktop for free, without imposing any obligation on you. Now they're taking it somewhere you don't like, so do as I do in these kinds of situations: say thanks and move somewhere else.
Alternatively, start from here http://www.linuxfromscratch.org/ , build a usable desktop, and see what it means when, after all your work, someone comes and calls you a jerk because you failed to meet their expectations. I dare you to try!
AC - That's simply not true and by posting a link to Linux from scratch you show a fundamental lack of understanding of this problem - not least that if I built my own distro from scratch I would still be impacted by this as soon as I wanted to use Firefox or Chrome or Thunderbird or GIMP or Libre Office and many other major applications.
The GNOME team are making a concerted effort to completely remove support for some of the standard user interface elements that we've all been used to for 30+ years. They're so set in their opinion that the title bar of a window is a waste of space and that hamburger menus are the future, that they're ripping out support for even having the free choice to do it differently.
If I were to 'make it better' to suit my needs as you so helpfully suggest, there is no way in Hell that those guys would accept a pull request from me to add the very features they've just purposefully removed.
I'm not unjustified in being angry about this. Just because it's 'free' doesn't mean that people have no right to complain, nor that we should just accept inconsiderate changes.
Let's also not forget that there is loads of commercial backing for GNOME - it's a big project and it's got money and paid developers. I'm not having a pop at the kind hearted individuals who give up hours of their spare time to maintain some crucial library or essential utility - it's only the corporate stooges who have the level of power and control to make changes this big.
I personally don't care what GNOME does to its own desktop - they already lost me years ago when they suddenly abandoned GNOME 2 and offered a buggy, experimental and hugely different environment in its place; but I care very much when they impact other desktops...
Like many others who aren't interested in fashionable GUIs I switched to the MATE desktop as a way forward. It's been a stable solution for me for about 10 years now. Others went to Cinnamon and XFCE and have enjoyed similar peace and stability. One thing these all have in common is a heavy reliance on Gnome's GTK3 library. The environments are built on it and it's an essential component because practically every major desktop application in the Linux ecosystem is built on it. So it is just plain wrong of GNOME Foundation to dictate to that whole ecosystem that they now have to draw their user interface in any specific way just because some dude thinks it'd be way cooler to have hamburger menus and popovers.
Firefox and Thunderbird are leading examples of applications which still give users a choice to enable a classic menu bar, although there's a distinct danger of that option being removed in a future version of GTK.
It really wouldn't have been very difficult for GNOME to design this new UI code to be configurable and to give people a choice of classic vs. alternative. It wouldn't be that hard to allow adaptable menu/UI code that you write once and then render in different ways, but instead the few developers who have kept giving users a choice are having to maintain two separate blocks of code. Most have just deleted the traditional menu code and rewritten it the new GNOME way, so there's no turning back.
So basically GTK based UIs are now screwed with no easy way of fixing them. Even if GNOME had a change of heart it would be hard to persuade all the various projects to redevelop their UI and menus again so soon.
The options are grim:
* Stay on older versions of everything to avoid this, which is obviously unsustainable.
* Persuade a _lot_ of projects and developers to re-add classic menus and title bars - very unlikely.
* Fork every single application involved so as to fix the UI - a huge amount of work and ongoing maintenance which just creates fragmentation.
* Or hope that a GTK-compatible alternative library (such as STLWRT, referenced in the article I linked) is able to render these applications in a classic-style mode.
Essentially, the gnome team in its arrogance (and possibly ignorance too if they really didn't foresee this impacting non-Gnome users) has caused a negative change which is already so ingrained as to be near impossible for any individual or small group to reverse.
I'm seriously angry about this and I've had something like 6 months to think about it. It's bad practice and it's the exact opposite effect of what free software is about.
"They worked hard to offer you a desktop for free, "
No. They worked hard to *stuff s**t they like down your throat*, with zero care whether you wanted it or not.
It's not an "offer" when it's forced on you in every update of *totally unrelated* products: it's a cancer which spreads, and you are slowly murdered by it.
"someone comes and calls you a jerk because you failed his expectations. "
I call anyone forcing their s**t on me a jerk, for a valid reason. My expectation is at the level of "keep out", and they won't even do that. They *must* have me running their ideals, because they must.
No other reason exists, really: Fame and glory for *them*.
Calling it GNOME 40 is even more confusing. Just call it GNOME 4.0 and everyone will figure pretty quick if it causes disruption or not. There is no reason for these version numbering games that simply confuse everyone.
When they said that they didn't have the money to develop mobile GNOME, the real issue is that none of Red Hat, SUSE, Canonical or Google had any interest in developing mobile GNOME, whereas Purism did - and Purism hardly has the resources, so it didn't try to create anything crazy like GNOME would have done. I'm personally glad that Purism did it in their own way, because Phosh was designed for energy efficiency and using fewer resources, which I don't think GNOME would have done. Phosh also got rid of Mutter, and Phoc is a more efficient way to go, so I'm glad that we got Phosh and not mobile GNOME.
“In some ways, open source has won, it is ubiquitous, I think now 100 of the top 100 supercomputers are now running Linux,”
Err, open source has not won anything. If the top 100 supercomputers run Linux then that is great, but it's a bit like saying cars have won because the top 20 cars in Formula 1 are all cars, and that buses, vans, trucks, motorcycles and bicycles have all lost something they were not even competing in.
Linux is the shining star of the open source world and a good example of how the open source model can work. Open source still needs active, quality support, just like closed source.
There are many open source projects that are poorly funded and badly resourced as far as contributors are concerned, and yet often form important or even key parts of other open source software. A computer running Linux is not running just Linux (I suspect those top 100 are far closer to that than your average desktop) but a number of other open source components, which in turn rely on other open source libraries, which is when your system can start to look a bit like a house of cards.
It would be good if there were some sort of open source quality mark that you could apply to all these elements, so you could rate a build of, say, Mint against Red Hat for code security.
It looked modern, launched stuff, stayed out of the way for the most part, was sane out of the box, was extensible and didn't have a bunch of complex or esoteric settings. That's what a desktop should be - facilitating doing stuff without becoming a burden. I can't say I appreciated everything it did but I think dists are better off from staying the course and using it.
"GNOME versioning: no repeat of 2 to 3 disruption"
"Disruption"? It was a major f**k-up, on an astronomical scale. Them dimwits decided that a 24" screen with keyboard and mouse should look and feel the same as a 3" screen operated by fingers.
For me that meant that Gnome, as a platform, is even more dead than Windows 8: I won't touch it even with a 10-foot pole.
Let's kill it before it pollutes more brains.
There is by now the PinePhone from Pine64, which runs Linux distros by default.
I think this is a great step; even though it uses really old hardware, it is still made to work like a PC.
A PC that is a phone - it even has direct multiboot support, with a standard, easy installation of 12 different distros in just a few GB.
It runs Arch, Ubuntu, Debian, Mint, Manjaro and many more Linux OSes, including their desktop versions/functionality.
It runs desktop software and has easy keyboard and mouse support.
So while it is only one phone, and it uses a quad-core SoC from 2014/2015 with 2 or 3 GB of RAM, it can be used like (or as) a PC, and it can be hacked/modded and reflashed to do what you want it to do.
Biting the hand that feeds IT © 1998–2021