That's just what we need, more fragmentation. Terrific.
Canonical top man Mark Shuttleworth says that Mir, the company's ground-up replacement for the X Window System graphics stack, is almost complete, and that the technology will ship with the next version of the Ubuntu Linux distribution in October. Canonical announced the Mir project in March to much controversy, particularly …
Coincidence or irony: both Tux and Ballmer are bald.
No, Tux is bald but the appearance of Ballmer's shiny head is caused by millions of tiny feathers which form a water-resistant layer over the top of his head. This is perfectly normal on his own planet but on Earth it seems to cause overheating and impair his brain function.
Perhaps it's time for him to go home.
There is bad fragmentation. Bad fragmentation affects typical end users. Individuals who just want to get something done. This kind of fragmentation is things like Microsoft changing their own document format every few years, or online messaging systems refusing to deal with each others' formats.
There is good fragmentation. Good fragmentation is invisible to typical end users, but provides innovation and options that ultimately make things better for everyone. It provides another way of doing something: fresh implementations that carry lessons and improvements onwards. Complaining about this is like complaining that Microsoft keeps putting out new versions of DirectX, or that Intel keeps trying to make better processors.
> Survival of the fittest is surely best? lots of projects and the best one gains traction and the others fade out.
Except it doesn't quite work like that, does it? You end up with a bunch of evangelists pushing their favourite dead technology at just a loud enough volume that people still pay a smidgen of attention.
And nothing ever dies.
"For example, rather than a rigid protocol that can only be extended, Mir provides an API."
What the hell do you think the X Window System protocol is? IT'S AN API!
Nothing in the X Windowing System tells you HOW to implement XDrawLines - just what the call looks like, and how to serialize it across a standard X connection.
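To make the point concrete, here is a sketch (not taken from any real Xlib source; a little-endian client and no struct padding are assumed) of what an XDrawLines call ultimately puts on the wire: a PolyLine request, major opcode 65. The protocol fixes these bytes; how the server actually draws the lines is its own business.

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Sketch of the wire form behind XDrawLines: an X11 PolyLine request
   (major opcode 65). The fixed part is 12 bytes = 3 four-byte units;
   each (x, y) point adds one more unit. */
struct polyline_req {
    uint8_t  opcode;     /* 65 = PolyLine                        */
    uint8_t  coord_mode; /* 0 = Origin, 1 = Previous             */
    uint16_t length;     /* total request length in 4-byte units */
    uint32_t drawable;
    uint32_t gc;
};

/* pts holds 2 * npts int16 values: x0, y0, x1, y1, ... */
static size_t encode_polyline(uint8_t *buf, uint32_t drawable, uint32_t gc,
                              const int16_t *pts, uint16_t npts)
{
    struct polyline_req req = {
        .opcode     = 65,
        .coord_mode = 0,
        .length     = (uint16_t)(3 + npts), /* 12 fixed bytes + 4 per point */
        .drawable   = drawable,
        .gc         = gc,
    };
    memcpy(buf, &req, 12);            /* fixed header          */
    memcpy(buf + 12, pts, npts * 4u); /* the points themselves */
    return 12u + npts * 4u;
}
```

Nothing in there says how to rasterize a line; it is purely a serialization format, which is exactly the point.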
It's not even as if Mir is going to solve the problem of API versions either. Every API that evolves is going to end up with different versions and all the problems inherent in that.
I just wish someone would make X do sound, other input mechanisms (e.g. touch) and print. That'd be pretty useful.
Quote: "Thanks, I didn't know that; clearly I am a bit behind on the latest!" By about 10 years or so :)
Esound used to do remote sound even before PulseAudio, and the Network Audio System was even before that.
I have been using pulse since ~ 2004-2005 on xterms and while it is reasonably good, I prefer to use its esound compatibility mode and set environment accordingly (trivial - just add an extra 3 liner in the xsession init) instead of the standard mode.
Actually, X is arguably more of a protocol than an API, in that it traditionally at least specified things like "the server must report any and all mouse movement events to the client, all the time". That turned out to be a bad thing when people wanted to implement Low Bandwidth X and avoid wasting bandwidth and latency on event updates the client almost never needs. While LBX could try to ignore these updates, the basic API did not know whether something higher up in the client (than XDrawLines) wanted the event updates, which made this type of optimization hard.
At least in X11, both the API (libX11.so or libgnome.so or whatever) and the protocol over the wire are both specified.
Now, without going away from X, there would have been nothing to prevent the server allowing newer clients to connect over "X12" and have this based on a newer and simpler API, with the over-the-wire part hidden from the client. If you support over-the-network clients or really anything where the client might be using an older version of the API library than the server, you're still going to need some "protocol" to specify how they interact though, even if this is versioned to support changes.
Actually, specifying what is expected of a system is a part of a proper API spec. Having an API without those guarantees of behavior is what leads to all this breakage you see in things like Windows - an API won't specify what is expected, and so people base their expectations upon what happens - then it gets changed and things break.
REAL software engineers specify not just what a function is called, not just what parameters and return values it has, not just what exceptions it throws, but what its invariants, preconditions, and postconditions are.
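For anyone who hasn't seen it spelled out, here is a toy example of that kind of spec (the function and its contract are invented purely for illustration), with the contract also checked at runtime:

```c
#include <assert.h>
#include <stddef.h>

/* Toy spec-by-contract example. The comment block is the spec; the
 * asserts check it at runtime.
 *
 * Precondition:  buf != NULL and len > 0
 * Postcondition: the returned index r satisfies r < len and
 *                buf[r] >= buf[i] for every i < len
 * Invariant:     during the scan, best always indexes the largest
 *                element seen so far */
static size_t index_of_max(const int *buf, size_t len)
{
    assert(buf != NULL && len > 0);   /* precondition */
    size_t best = 0;
    for (size_t i = 1; i < len; i++)
        if (buf[i] > buf[best])
            best = i;
    assert(best < len);               /* (part of the) postcondition */
    return best;
}
```

The name and parameters alone tell you almost nothing; it's the stated guarantees that make the function safe to build on.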
> What the hell do you think the X Window System protocol is? IT'S AN API!
The application communicates with the X Window System over a socket using the X Window System Protocol version 11 (X11). If the display is local it is usually a Unix domain socket; otherwise it is a TCP/IP socket. Communication with the server is always via a socket (hence it's a protocol), and this is one of the main problems with X Windows.
Xlib, Xt, Xaw, Motif, Athena, Qt, GTK etc. are toolkits (except Xlib, which the others build on) that allow developers to easily build GUIs and communicate with the X server using the X11 protocol.
Err, wrong - modern X servers have extensions that allow the use of shared memory.
The client creates the shared memory segment itself, then sends the X server a request containing the segment's ID; the server attaches that same segment. The client can then access the shared memory directly and implement its own drawing primitives on it. This is how it works.
The creation/destruction and display of the shared memory object is all done via exchanges of packets between the Xserver and the Xclient.
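One half of that dance, allocating and attaching a System V shared memory segment, can be sketched without any Xlib at all (illustrative only; in real MIT-SHM use, the segment ID also travels to the server in an extension request so both sides can map the same memory):

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>
#include <sys/ipc.h>
#include <sys/shm.h>

/* Create a private shared memory segment, attach it, scribble on it as a
   client would scribble pixels, and return the first byte written. The
   step of handing the shmid to the X server is deliberately omitted. */
static int shm_demo(size_t size)
{
    int shmid = shmget(IPC_PRIVATE, size, IPC_CREAT | 0600);
    if (shmid == -1)
        return -1;

    unsigned char *pixels = shmat(shmid, NULL, 0); /* map into our space */
    if (pixels == (void *)-1) {
        shmctl(shmid, IPC_RMID, NULL);
        return -1;
    }

    memset(pixels, 0xff, size); /* "draw" directly: no socket traffic here */
    int first = pixels[0];

    shmdt(pixels);                 /* detach ...               */
    shmctl(shmid, IPC_RMID, NULL); /* ... and mark for removal */
    return first;
}
```

Only the bookkeeping (attach, detach, put-image) crosses the socket; the pixel pushing itself never does.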
"Err, wrong - modern X servers have extensions that allow the use of shared memory."
Yes, OK, Mr Nitpick, that is the case, but extensions are part of the X server, so what's your point? And show me any modern X server that doesn't use them. The fact that X is so modular, and hence so flexible, is one of its major strong points IMO, one that competing groups would conveniently like to ignore.
> but extensions are part of the X server so whats your point?
The point is that everything occurs over sockets. You might be able to create an object with shared memory, but that does not change the fact that X is a protocol, not an API.
Here is how a request is defined:
Every request contains an 8-bit major opcode and a 16-bit length field expressed in units of four bytes. Every request consists of four bytes of a header (containing the major opcode, the length field, and a data byte) followed by zero or more additional bytes of data. The length field defines the total length of the request, including the header. The length field in a request must equal the minimum length required to contain the request. If the specified length is smaller or larger than ....
This is how the initial connect is set up:
The client must send an initial byte of data to identify the byte order to be employed. The value of the byte must be octal 102 or 154. The value 102 (ASCII uppercase B) means values are transmitted most significant byte first, and value 154 (ASCII lowercase l) means values are transmitted least significant byte first. Except where explicitly noted in the protocol, all 16-bit and 32-bit quantities sent by the client must be transmitted with this byte order, and all 16-bit and 32-bit quantities returned by the server will be transmitted with this byte order.
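That 12-byte connection-setup prefix is simple enough to build by hand. The sketch below is illustrative, not X.Org code; the auth-protocol name and data lengths are left at zero.

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Build the fixed 12-byte X11 connection-setup prefix described above:
   byte-order byte, pad, protocol major/minor versions, auth lengths, pad. */
static size_t encode_setup(uint8_t *buf)
{
    memset(buf, 0, 12);
    buf[0] = 0154; /* octal 154, ASCII 'l': little-endian from here on */
    buf[2] = 11;   /* protocol-major-version, low byte first           */
    buf[4] = 0;    /* protocol-minor-version                           */
    /* bytes 6-7 and 8-9: auth-protocol-name/data lengths, zero here   */
    return 12;
}
```

Byte layouts, byte-order negotiation, padding rules: all the fingerprints of a wire protocol, not a function-call API.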
Does that sound like an API or a protocol to you?
> a university student can create a better API/protocol than the entire X team did for decades
I can tell you that you sadly overestimate the skills of university students.
Unless you are a university student.
In that case, the Dunning-Kruger effect may apply.
I can agree that a clean slate, starting with the new "what works/what doesn't" laundry list gained since the '90s, is probably a very good idea. Go Mir, show what you can do!
I think it's fair to say that X11 is obsolete. X is almost 30 years old, and it was designed to be the "VT-100 for the 1990s", when graphical terminals were to replace character-cell text terminals that used the VT-100 escape sequences. Hardly the way the world went. A more modern purpose-built API would be helpful. Of course getting the fragmented Linux world to adopt anything uniformly would be harder than herding cats.
Actually, I have an early '90s book on X11 programming upstairs somewhere and it starts with the expectation of the original X11 team that in two to three years understanding would have advanced to the point where X11 would be outdated. That doesn't sound like they wanted their work to be enshrined as some never-to-be-touched-again ideal. If anything, they're probably disappointed it took so long.
Now maybe Wayland is the better successor, maybe Mir is. I have not studied either in depth, so no opinion. I do find it hard to imagine that the existence of two alternatives can be a bad thing at this point, though.
This post has been deleted by its author
Fragmentation is a problem big enough by itself. Each time I see the 300 distros listed on DistroWatch it makes me want to cry for all the wasted effort in creating a different variation of the same 4 basic distros.
But Mir is worse. Mir is yet another attempt at creating a screen management / user input / graphics acceleration layer that can be reused across low power mobile devices, high and low end tablets, laptops of all kinds and desktops from the Atom based machine to the multiple core monster. For the wrong reasons, I have to add. Canonical could have backed Wayland and ended up with basically the same product, only with better community support.
And it's not like Canonical lacks examples of this strategy not working: Gnome 3 is working very hard to add enough extensions, plugins or whatever you want to call them so that one can more or less reproduce the Gnome 2 experience in order to stop hemorrhaging its user base. Microsoft is backpedaling from the TIFKAM radical change to add more classic Windows features. Apple keeps separate iOS and OSX codebases for a good reason. And KDE is gaining acceptance just by extending and improving on the traditional desktop/window metaphor and adding flexibility for those that want a simpler interface.
And that is of course assuming that the first release is not so bug-ridden that it sends Ubuntu back to the "Linux is too hard to install and configure" camp and loses all the credibility gained over the last few years.
Interesting times we live in.
Ubuntu has always been about a simple, clean, off-the-shelf implementation of Linux with a bit more polish than other distros. Surely if this accomplishes that for the masses then it is fulfilling its goal, and if you wanted to use Wayland as a power user, removing Mir and installing Wayland instead shouldn't be a problem, surely? I have to believe that Shuttleworth has a reason for his choice that fits in with his vision of where Ubuntu is going in the future.
I'm afraid to say that nobody knows what "Ubuntu" is about nowadays. It started, as you say, as a clean, off the shelf Linux implementation with a bit more polish. Now, depending on which forum Shuttleworth is speaking to, it wants to be anything from the best mobile experience to the most robust server available.
At least nowadays, and as far as I know, Ubuntu is removing itself from the "off the shelf" category as much as possible. First it shipped a desktop environment (Unity) that no other distribution uses or cares about. Mir is another step to distance itself from the mainstream Linux distributions.
Whether this Mir move gets Ubuntu closer to mass adoption is another question. And one whose answer is likely "no". After the initial success of MSDOS and Windows, no non-server OS on earth has a chance to win any significant market share unless it is preinstalled with some hardware, preferably one that sells gazillions of units to create the snowball effect that finally attracts both application developers and the mainstream public. Mir by itself is not going to change anything in that regard.
Since Mr. Shuttleworth is certainly not stupid, I'm sure he already knows that. So this Mir vs. Wayland war is completely irrelevant for him. The only remaining valid explanation for Mir is then that Canonical sees Mir as a prerequisite for gaining the favors of a hardware manufacturer (think Acer, Asus, Lenovo or...) to get their buy-in and start massively selling some kind of machine with Ubuntu preinstalled.
"A bit more polish"?
Actually, that's being a little unfair to other distros. Ubuntu has always been more about providing a simplified way for end users to get into Linux which it does well. It does so, however, at the expense of some of the flexibility that other distros have, mostly because first timers and other users may not have a need for it.
Polish (reverse or otherwise) doesn't really enter into it. What it actually amounts to is that Linux is evolving on many different fronts. If Mir is a successful project, then it will no doubt appear across the board, either as a replacement or as a default, in other distros. If it turns out to be a pointless effort by Canonical to turn Ubuntu into more of an Android than a proper Linux distro, then you never really know. After all, Linux is not Ubuntu.
Chika, "simplified" doesn't quite describe it. Ubuntu off the shelf insults the user's intelligence. I just tried DreamStudio, which packages a lot of A/V apps atop it. I hadn't tried straight Ubuntu in years -- I like Mint, but that offers several other desktops.
The Ubuntu desktop was bizarre, a mix of old Mac tropes sloppily implemented. And then it was missing things. No "list" mode in the file browser, because "simplified" means only icon mode? Eccchh. No menus in Firefox, because they'll confuse your pretty little head? Gaaag. The dumbed-down look made it harder to use than KDE or Cinnamon. No wonder Mint is getting more attention; it takes Ubuntu repositories and makes them usable by people who don't have pea-brains that happen to be of the shape Shuttleworth imagines.
It's never achieved that IMHO. Maybe that is the aim, but there are several distros that are much more mature and have been doing the install/config process in a much more polished way since before Ubuntu existed.
They do seem to have a NIH mentality though ... for good or bad
Microsoft is backpedaling from the TIFKAM radical change to add more classic Windows features.
Users KICKING & SCREAMING for the Start button (i.e. the Start Menu!) and getting trolled by Redmond with a faux Start button that takes you back to TIFKAM: that IS NOT BLOODY LISTENING, or "backpedaling".
Who cares what Ubuntu does anymore, I stopped using Ubuntu the day they forced Unity on me.
I shook my head at TIFKAM/TIPKAM, then damn near threw the laptop out the real window with Windows 8.1.
I want what I want, and neither of those companies seems to care what I want, so they can go the way of the Dodo for all I care. If it wasn't for XFCE I'd have no OS. Basically, if you like Apple but can't afford one, buy something with Windows 8.1; if you can't afford that, buy something with Ubuntu's Unity on it; and if all of those drive you nuts because they appear to be made for simpletons and grandmas, use XP or XFCE.
MacGyver - this is exactly why Linux has no real market share in the desktop / device world. Nerds looking down on the masses, and not being interested in making a finished product that just works. You are not Ubuntu's target audience. You are a technical user interested in a highly customized operating system. I'm not meaning to be insulting, just trying to make a point.
So your response might be "Who cares about them! It's for me!" Well, that's fine if you want to be stuck with OpenOffice, Kivio/Dia, and all the other really bad applications on the Linux platform - or worse, having no Windows equivalent in the Linux world at all. Getting the Linux desktop to the point of being as relevant as Mac OSX means you start to get commercial applications ported to Linux (Adobe, MS Office, etc), or you start getting a lot more sponsor dollars/resources to the open source applications. Could you imagine if Kivio had the funding that Apache has because of its server relevance? There's a bigger picture here.
> Well, that's fine if you want to be stuck with OpenOffice, Kivio/Dia, and all the other really bad applications on the Linux platform
You're just shoveling the same old mindless Lemming FUD that people like you have been pushing since before Linux ever existed. Nothing is acceptable unless it's the herd anointed choice. It doesn't matter how good the alternatives are or what their business models are.
Each time I see the 300 distros listed on DistroWatch it makes me want to cry for all the wasted effort in creating a different variation of the same 4 basic distros.
Welcome to the planned economy mindset recovery clinic. How can we help you today?
Your "wasted effort" is another person's "learning experience". And until this sodden earth has been absorbed by the hivemind, that ain't gonna go away.
"Your "wasted effort" is another person's "learning experience". And until this sodden earth has been absorbed by the hivemind, that ain't gonna go away."
I recognize the value of diversity, and you're right that without these experiments things would have progressed little, if any. However, the nature and origin of these 300 distros is a different matter. I challenge you to pick one from DistroWatch and try to guess why it exists. 80% of them will fall into one of these categories:
- "ego/career boosting experience": yes, I can create my own distro, which nobody can tell apart from the originating one except for a few details, but who cares, my name is there and I can boast of having released my own distro. Just change the default desktop environment that is installed, preferably to some obscure and hardly used one, and call it a day. Bonus points for simply reusing the packages from the originating distro, which saves you a lot of time.
- "NIH experience": I don't like the defaults that come with Fedora, Debian, Arch or whatever, so I create a distro with a different set of defaults. I could simply create a meta-package or a script that changed those defaults, but why do that when I can also get an "ego/career boosting" exercise as part of the deal?
- "political/nationalistic maneuver experience": I can rename all the LibreOffice packages after local celebrities (Shakespeare instead of Writer, Turing instead of Calc, etc...). Look, I'm pushing the envelope, contributing to the nation's IT industry, and saving a lot on Windows licenses.
Very few "learning experiences" in all this. Yes, some of these are genuine specializations or radically different approaches (BSD, Solaris, low memory overhead, security related... ) The rest is people genuinely wasting their time by just changing default settings or packaging default permutations of default apps and desktop environments.
Now, this is not an argument for the "planned economy", but really an argument to hear the advice of older and more experienced people: for vanity distros, the "learning experience" is that the people behind them realize they are wasting their time and stop doing it. However, there seems to be an infinite supply of people ready to continue the tradition.
Whatever comes out of this, I keep being impressed with Ubuntu's and more importantly Shuttleworth's vision and striving to do SOMETHING about the moribund state of Linux on the desktop.
For everyone that wants standardisation, there is Debian. Go for it.
But Shuttleworth is the wild card - he is like a gambler in Vegas that will keep trying to beat the system, and GET Linux on the desktop (and every other device) successful, somehow. It will take risks, beating his own drum to his own tune...and will quite possibly, quite probably, fail.
BUT MAN do I respect him for it, and I've been in this computer business for a long, long time.
The problem with linux on the desktop is not the UI or display system, its the market share.
Linux is seen to win easily when there are large numbers. Large numbers mean enterprise deployments... enterprise means dealing with MS controlled lock-in apps - Visio and Outlook. Visio doesn't even show up on Macs, never mind Linux.
Rather than sinking billions into his own distro, messing with Mir and Gnome3, Shuttleworth would be better off improving Kivio and Evolution. Yes, we want a nicely integrated lync replacement.
There was a time I would agree that MS were relatively safe on the desktop because of Office and Outlook and other business favorites, but every week I see a new cloud service launching that not only mirrors that functionality, but often improves upon it. I would happily subscribe to Office 365 on Linux if I had to, but other services are improving quickly and probably offer better collaboration options. You may also wish to look at creately.com too if you want an alternative to Visio and although I hesitate to offer Gmail as an alternative to Outlook, I would be surprised if Google weren't seriously thinking about adding more business friendly features.
@P. Lee Wholeheartedly agree. The only way to get Linux out of the shadows is for someone to come up with a suite of end user apps that have wide distribution, are useful with no 'ifs and buts' and a suite that has depth - to include stuff like Lync/SharePoint/Project and a thumping good database engine (a big one and a small one).
Continuously tinkering with the OS (OK, it's a kernel) will not get people like me, who advise people who buy stuff by the tens of thousands, to consider switching.
I know Visio quite well, though the last time I used it was 5 years ago... Inkscape, man... It is a hurdle to get used to the UI, but once you master it, you create graphics that Visio users dream of. In fact, the good thing about Inkscape is you can do almost anything with it. No boring red and blue boxes with arrows here or there...
Plus, reliable SVG support, and you create crisper PDFs than Adobe Acrobat in combination with Office! I had the guys over here drop their jaws when I sent them the corporate template in PDF, redrawn in like 30 minutes with Inkscape... probably almost a millimeter off in places; could be fixed with more time, agreed.
Outlook ? ROFL - we use that shit here as well ... sometimes, at 12:50 PM, I look at the reminders, nothing, get back at 12:55 PM and I have a meeting that is a week overdue ... the meeting had been planned a few days before ... Webmail on Chrome is shit, on Firefox so-so, don't have IE to test it. Search in Webmail is useless. Outlook desktop app is a resource hungry, bloated piece of shit.
Thunderbird users know what I am talking about ...
Word is a waste of time and patience, PowerPoint newer than 2003 is shit, they removed the only feature that was cool - moving callouts intelligently. Excel has a few cool features, agreed, but by no means worth the price! Plus, none of them except Visio support SVG ... better things out there, man.
Anon for obvious reasons ...
You mean Android ;-) ? That would be Linux running on most tablets and phones out there.
Incidentally, Canonical is one of the other linux distributions explicitly targeting mobile and tablets now. Wayland may be a nice project, but I can imagine the Canonical guys have a long list of non negotiable requirements that maybe don't quite fit with Wayland that come from their past experience of trying to make Ubuntu work on phones.
I can't judge whether that is the case but OSS infighting has consistently been a huge problem with Linux throughout its history and the only ones who succeed with consumer facing products and Linux are those who sidestep those issues and instead make pragmatic choices about what to reuse and what not to reuse. Wayland might be the best thing since sliced bread for all I know but it might still not be the right thing for Canonical and obviously they feel it isn't.
X is a horribly complicated stack that sits in between the UI framework and the hardware. Replacing it is long overdue, especially since most UI frameworks are cross platform anyway and don't need X to be there at all. The fact that replacing X has taken well over a decade so far is indicative of how ineffective the OSS community can be when they put their mind to producing the perfect thing. Mark Shuttleworth is right to not wait for the stars to align just right for some uncommitted OSS gurus to bring forth something worthy to finally replace X. He needs this stuff yesterday.
> Don't be fooled. Shuttleworth is in it for the money only.
Oddly enough, before I retired I used to work for money as well.
I was lucky in that what I did for a living coincided with what I liked doing. If I didn't get paid for it I would have had to do something I didn't like for the money.
"Whatever comes out of this, I keep being impressed with Ubuntu's and more importantly Shuttleworth's vision and striving to do SOMETHING about the moribund state of Linux on the desktop."
Since when did the state of Linux on the desktop have ANYTHING to do with the low level graphics layer?
Surely it couldn't be named after a space station which fell to Earth and burned up in the atmosphere? That sure gives some ideas for the future... ;-)
(yes, I know Mir didn't crash but was de-orbited in 2001 and its descent into the ocean was carefully planned; you're ruining my joke ;)).
So because Canonical has ambitions in the mobile phone market, they are going to once again use Ubuntu as the testing ground for their technology. Didn't we have enough of this when they redid the user interface so it worked better on tablets? And on netbooks before that.
Here's a thought. You've already got millions of users who want a nice desktop and laptop operating system. How about keeping them happy?
"They won't be happy with the same DE, just running on a lighter backend display stack if it supports it?"
Excellent, bring it on. Mir is impressive (if slow at present using nouveau on my Nvidia card) for a new low-level graphical system coded and tested in a short time (13.10 daily build installed to a spare hard drive, with the Mir PPA added and the apt-pin tweak).
Now, can someone at Canonical or elsewhere spend a bit of time on bug 739184 and its many duplicates? I have to use Gnome Shell rather than Unity to get work done. An LTS release with this kind of UI bug will not be good for large scale adoption of Ubuntu.
"Here's a thought. You've already got millions of users who want a nice desktop and laptop operating system. How about keeping them happy?"
Because in his mind (and Microsoft's, mind) they're a dying breed. Soon they'll be niched and the big money will be in phones and tablets. He obviously is unwilling to concede the market to Google and Apple, so for him it's "adapt or die" time.
"Because in his mind (and Microsoft's, mind) they're a dying breed."
I think that's the central assumption -- that even if we still have ten trillion desktop computers in 2020 there isn't going to be any money in providing the OS for them and so if you are currently in that business and you want to be making profits in 2020 then you need to find a new market to sell a new (if related) product.
Personally I'm not convinced. Despite the best (?) efforts of the industry for a decade or more, I still haven't seen a UI that fits on a phone-sized screen and doesn't need a keyboard but is still usable for content creation rather than mere consumption. Given the combined limitations of my anatomy and my typical working environment, I don't ever expect to either. I find it utterly bizarre that anyone in the industry thinks this is possible.
"We take a lot of flack for every decision we make in Ubuntu, because so many people are affected," Shuttleworth wrote. "But I remind the team – failure to act when action is needed is as much a failure as taking the wrong kind of action might be."
UND YOU VILL ENJOY IT! GUARDS, SEND ZIS MAN TO ZE SHOWERS!
Mir and Wayland will, no doubt, both have to support running old X11 apps for ages, so not sure you'll really care which is at the bottom of your graphics stack any more than you're not that bothered whether it's Intel, AMD or Nvidia graphics chip. I exclude games players - you're on your own :-)
Who says Wayland is better ... it might be, it might not be ... having more than one group try to skin the "what comes after X11" cat is a good thing. There may be space for both, or maybe not ... if not, one will win, the other will lose.
In the bigger scheme of things, Android doesn't even use X11, Windows doesn't use X11, I don't think my Chromebook uses X11 (though I don't actually know). Applications of all sorts still exist in some form or other on all these platforms.
Linux won't ever win the desktop wars, united or fragmented. That won't stop me using it in all its glorious variety ...
The API vs protocol argument is bollocks, though, I agree.
Good on someone at least trying to forge ahead. If you don't like it, don't use it!
Everyone bangs on about how open and free Linux is, how you can do what you like. So as soon as someone exercises that right, everyone starts whining like little children.
Suck it up, princess... if you're so opposed to this, maybe you should create your own distro!
Part of the problem with X is conflicting goals. X was designed to be useable remotely. That's why it has a client/server architecture and is network-transparent. Which is a problem when you need to optimize performance because the best way to do THAT is to get close to the metal. A network layer is an obstruction in that scenario.
Yes I am aware of that and the X system definitely fits that purpose. However that is not what Ubuntu are trying to achieve so why should they not move onto something that is not chained to the past?
Ubuntu clearly has a vision, and while it may or may not be agreeable to some, at least they are doing something about it.
If it pans out, well, we just have to wait and see... what is annoying is that everyone acts like Ubuntu doing their own thing is killing the other projects. The other projects exist with or without Ubuntu, and there are many other distros out there that one can use instead of Ubuntu.
> so why should they not move onto something that is not chained to the past?
...because something strange happened while X haters were locked in their little echo chamber.
The rest of the industry discovered the utility of some of the more "esoteric" features of X. The idea of tying things to the hardware is quite frankly a childish way of thinking that belongs back in the 80s with things like the Atari ST. Display technology tied to hardware is simply intolerably primitive.
Unix is a multi-user network operating system.
" The idea of tying things to the hardware is quite frankly a childish way of thinking that belongs back in the 80s with things like the Atari ST."
Until they rediscovered the simple fact that, when it comes to serious number crunching like 3D graphics, nothing beats dedicated hardware chips, and if you're gonna keep the beast fed, you want as few obstructions between the GPU and the rest of the system. So high-performance 3D drivers strive to be lean and mean and close to the metal: out of necessity. It's like hand-tuned code; sometimes, when speed matters, there's really no substitute.
And while Linux may be a multi-user operating system, a group of 1 is still a group AND the system must recognize that physical displays play by different rules to remote ones.
Your comment is exactly what I wanted to say... everyone bangs on about Linux... that if you don't like something, code something yourself... change it... and when someone does, everyone moans about them doing just that. Same as with desktops... for years I have seen Linux users moan about not just copying Windows and that the desktop should be pushed in a different direction... and what happens when someone does this? Moaning that they are now different from what people had before.
"Linux Graphics... is rubbish "
I didn't know that - here's me having spent years building protein structures and manipulating thousands of atoms in stereo 3D, and all the time it was rubbish. Same with editing 1080p/50 video and playing it back with hardware acceleration - all rubbish - I'm shocked. Next time I convert RAW DSLR photos I must remember that it's all rubbish I'm experiencing.
I think those of us who only use Linux for everything don't have the problem.
I'm sick and tired of all the programs I have that must be updated every fscking time my Linux OS of choice gets an update. I need them to run, as is, on this year's, last year's and next year's Linux release without having to hope the programs will update to run on it, or finding new re-compiled versions. If Apple can do it with their *nix-based OS X then why the fsck can't Linux after all these years? It's gotten so that it's pissing me off as much as Microsoft's OS shenanigans. If Shuttleworth somehow makes this possible then I'm aboard with it, even if the Ubuntu UI is weird.
I'm a little confused.
"I'm sick and tired of the all programs I have that must be updated every fscking time my Linux OS of choice gets an update."
Why update your Linux distribution just because there is a new release? If stability is a major factor then there are plenty of LTS releases to choose from. If you are using something like LMDE, then the only question is 'Why?'
If you are talking about general updates, then usually, if you stick to a main release, the updates are well tested and should behave with your programs. Unless, again, you are using LMDE but then what do you expect from a release based on Debian testing?
I think you are suffering from upgraditis, the compulsion to have the latest just because it is there.
> I'm sick and tired of the all programs I have that must be updated every fscking time my Linux OS of choice gets an update.
If that's a problem then it's entirely your fault. We're talking about Ubuntu here. It doesn't get any easier than upgrading a Debian based distribution. Everything and the kitchen sink can be updated with a single command.
If you don't want new versions of your apps then why are you bothering to upgrade to begin with? It's Unix. You don't need to mess with it all the time. It's not some malware magnet that will be a menace by next week if it's not constantly patched.
Stop thinking like a Windows user.
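For what it's worth, the "single command" claim holds up on any Debian-based distro. A minimal sketch, shown with the command assigned to a variable and echoed rather than executed, so pasting it changes nothing until you deliberately run it as root:

```shell
# The one-liner full upgrade on Debian/Ubuntu: refresh the package
# lists, then upgrade every installed package (dist-upgrade will also
# pull in new dependencies where a newer package needs them).
UPGRADE='sudo apt-get update && sudo apt-get dist-upgrade'
echo "$UPGRADE"   # inspect it; run `eval "$UPGRADE"` when you mean it
```

Pin yourself to an LTS release and that really is the whole maintenance story: apps, libraries and kernel all move together through the one package manager.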
As was referenced above when it was pointed out that the desktop is a dying breed. Disregard the projections all you want; the numbers from the last few quarters point to the slow bleed of the desktop world. Thanks to laptops, tablets, netbooks, smartphones, or the back of a shovel with a rock, the desktop that I know and love is fast becoming the province of code monkeys and those who can't play FPS games without WASD and a mouse.
Even laptops are being displaced in some workplaces by tablet and smartphone combos (though you'd have to pry my EliteBook out of my cold, dead hands). Sadly, that means a modern OS is going to have to either account for that (Win8 and the Ubuntu attempts) or divide and (attempt to) conquer (iOS and OSX). The former strategy promotes bloat and waste while upsetting the apple cart just to frustrate and irritate the current users, while the latter provides a consistent experience that tricks users into thinking interoperability should be flawless.
Transition points are often no-win situations for the incumbents. MS made their bones when the world moved from command-line to mouse-driven GUIs. Yes, they grifted and stole from existing products, lied to collaborators about their intentions, and generally begged, borrowed, and stole their way to the top, but if it wasn't them, it would have been someone else. Apple got out to an early lead with iOS, but they've stagnated and rested on their laurels, much as they did in the 80s. Android is just a way for Google to make money on ads, and it will lose its support if the shareholders ever find a way to force a spin-off or an end to the various pet projects. It might be for short-term gain and kill the company long-term, but that is what today's shareholders do best.
You see, you make the same mistake the media does when it reports so-called experts stating that the traditional laptop/desktop is dying. The fact that new-style devices take a lot of work away from the more traditional devices does not mean that those are dying; it means there is just less need to use laptops/desktops all of the time. Once you didn't really have a choice. Now you do, and where it suits, the traditional machine will still be used.
This is, I suspect, the problem with modern tech companies where the board - the people who generally don't use a computer day to day in any complex manner - can't envisage a reason for anyone to do so, and seem to discount anyone technical as just 'too techy for their own good'. Microsoft are very guilty of this with the completely stupid, idiotic idea of removing the Start Menu with nothing sensible to replace it. If you think about it, it wouldn't have been hard for them to introduce a dual mode, with either available. If they had, Win 8 would probably have been lauded; as it is, with Classic Shell installed at any rate, it's smoother and quicker to use than 7.
Keyboards, mice, laptops and desktops will be around for more years than you or I.
That's my point: The incumbents have the option to either combine the two use cases into a single OS or create two parallel OSes that share a design language. Both options have merit and problems.
And again, the actual sales numbers point to declining desktop and laptop usage, not just projections made for 2019 by Gartner and others looking to provide consulting services to flailing companies. In fact, last night the Q2'2013 numbers were just published by both Gartner and IDC, and it looks like worldwide sales dipped another 10% from the same period last year.
It's a niche that matured 5 years ago. A software company can't spend billions of dollars on maintenance and incremental upgrades to a platform that has stopped being a growth category. For years people took Microsoft to task for keeping so much legacy code and so many operations that stretched back to Windows 3.1. It was repeatedly claimed that only if Microsoft rebuilt from the ground up could they create a pristine OS that would rock our world. Well, they took a bite at that cherry with Vista and botched the implementation. Win 8 came around and tried again, but they were upfront about the design language, and people spent the next year whining about how things had changed and how terrible it was that their modern OS, specifically designed for the new wave of tech (touchscreens above all), didn't work how they wanted. So MS capitulated and came out with 8.1. Hopefully for their shareholders, a lot of money wasn't spent to give the start button back.
I don't think there is much doubt that the venerable X isn't really up to the job of handling modern high-performance graphics and being accelerated by modern GPUs. The trouble with trying to accelerate X on the Raspberry Pi is a case in point - the Raspi has pretty decent 3D and 2D acceleration using standard APIs, but no one has managed to apply them to X (on the Raspi or any other platform). X simply is too old and pre-dates acceleration. Weston/Wayland is the way the Raspi people are going, but if Canonical want their own back end, why not? Weston/Wayland is taking forever, and may not be exactly what Canonical want. Canonical have knocked this Mir system up in less time, and of course to their own spec, so it does exactly what they want and need.
If it turns out to be a mistake, evolution will deal with it. Or it may turn out to be a smart move, and Ubuntu gets even faster and easier to port to other platforms with acceleration. Either way, I'm not complaining about someone putting their money where their mouth is. That's refreshing, there's too much mouth and not enough action from a lot of alternatives.
I'm one of those who wouldn't use MS with a gun to my head, doesn't want to be hand-held by Apple, and is therefore using Android on mobile devices. I would swap Android for a real Linux (I know, I know, Android is Linux, blah blah - I mean a full-fat, apt-get Linux with a real terminal) in a heartbeat, even though I don't really dig Ubuntu on a desktop. I just thought I would mention this - there are probably a lot of people who would do the same.
You could always get minted, if you really insist on an Umbongo related or, at least, Debian apt-get style Linux. Actually, I don't mind Mint too much, since you end up sharing repos with Ubuntu so you get quite a range of stuff. Don't rule out the RPM kit, though, since they benefit from a lot of support from high power distros such as RedHat and Suse. I've been using Linux as my main OS for many years to the extent that, while I have a W7 PC on site for a couple of things, it doesn't get a lot of use.
Anyone who says the current state of graphical environments on Linux is good is full of crap and preaching from the open source bible. X-windows sucks at dealing with monitors dynamically - especially multi-monitor, while Windows and Mac have been doing this for years with NO effort from the user. Wayland has been debated far too long in my opinion, and will be yet another battleground of Linux religion rather than practicality. I understand why Ubuntu is working on their own system - which will most likely end up being far more successful, and arrive quicker and more stable than Wayland.
Linux could actually use some more fragmentation - the right kind. By this I mean that Ubuntu is a great desktop-centric distribution, with a thin server "core" distribution as well. It's a cohesive experience (yes, I like Unity in its current form), and for the most part it just works and performs very well. I'd put openSUSE as a distant second, because it's just another combination of dated tech glued together (X, KDE) with a pretty face - and not much innovation. Let all the other distros focus on being a server. Ubuntu isn't messing with the core OS, just the desktop GUI. LET THEM DO IT - because honestly, everything else we've had on the Linux desktop has sucked up to this point.
Bad fragmentation in Linux is when we have like 500 distros on DistroWatch, with every Bob and Harry respinning the same crap over and over again. Even worse is when we get into purist, religion-based Linux choices. These approaches work at the core development level - they keep the basis of Linux on track - but when it hits the GUI, we end up with a GNOME 3, where one group of nerds makes an interface that is just insane to the other group of old-school Linux nerds, and COMPLETELY incoherent to the other 99.999% of the populace. I don't see Wayland being much different, with device support suffering because of it.
Ubuntu has the resources to bring a smooth and reliable Desktop to the masses, and finally conquer the Windows / Mac desktop oligopoly because they are concerned with simplification, productivity, and device support. Let's all just stop the squabbling over these Desktop issues, and get behind the only one that has a chance at becoming a relevant Desktop for the rest of the world.
"Anyone who says the current state of graphical environments on Linux is good, is full of crap and preaching from the open source bible. X-windows sucks at dealing with monitors dynamically - especially multi-monitor,"
Thank you for joining today to post this, but I happen to have different experiences. Plug a monitor into my laptop or netbook and it's recognised, and I can use it as a second monitor or display on both. No fuss - it just works.
This on OpenSuse 12.3
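And when it doesn't auto-detect, the same result is scriptable by hand with xrandr. A sketch only - the output names (LVDS-1, HDMI-1) vary per machine, so they're assumptions here; the command is echoed rather than run, since it only makes sense inside a live X session:

```shell
# Extend the desktop onto an external monitor to the right of the
# laptop panel. Check your actual output names first with: xrandr -q
CMD='xrandr --output HDMI-1 --auto --right-of LVDS-1'
echo "$CMD"   # run `eval "$CMD"` from a terminal inside X
```

`--auto` picks the display's preferred mode, so you rarely need to spell out a resolution.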
Meanwhile, trying to do the same on an old Dell laptop with Xubuntu has the opposite effect: I need an external monitor just to see the screen, because installing it causes the laptop screen to go dark - using both free AND non-free drivers. A check of the logs shows neither driver can recognize the chipset/screen combination, so it basically bails out. I finally gave up on the whole mess and put XP back on (thanks to the OEM sticker), so depending on your hardware it CAN be hit or miss.
That's the problem with craptops: just barely compatible, and with a slightly modified graphics/network/whatever chipset to stay within an energy envelope or shave off a few dollars per unit (Windows-only special-sauce drivers thrown in, upgrades downloadable from the manufacturer's website using the manufacturer's special install program - if titsup events don't occur, ain't it, Fujitsu-Siemens?)...
Perhaps the key to giving Linux its push to major adoption is finding a new way to make a buck off the development of applications?
Support models work well for the server end. They would work great for the desktop end too (thinking RHEL on the desktop), but I think applications are missing the beat.
If I look at an application like Dia (which I use constantly), it has the functionality I need, but it's kind of old to look at. While it achieves the goals, I'd like a little flash in my applications, and that seems to be happening only in the web sphere.
You want to make a buck in Linux applications? Just sell them commercially. Some companies actually do that, and nothing in the Linux license prevents this, as they're only interested in keeping the KERNEL free. If integral parts of the distro want to be free as well, that's up to them, but binary blobs sold commercially? Entirely possible. Otherwise, Valve wouldn't have dared to try to migrate Steam to Linux. Now, granted, you need a market for your software, but that's more a matter of market research rather than development.
Biting the hand that feeds IT © 1998–2021