Ubuntu Wayland: Shuttleworth's post-Mac makeover

Ubuntu Linux spent the last few months of 2010 dropping bombshells on the Linux world. Founder Mark Shuttleworth is clearly intent on shaking the foundations of his popular Linux distro and pushing it, and Linux at large, in new directions. Shuttleworth is fast becoming the Steve Jobs of Linux - one man, one vision, one …




  1. Neil 7
    Thumb Up

    MeeGo are also considering using Wayland in the future

    But not just yet. The problem with Wayland is that it requires a whole new set of graphics drivers in order to fully reap the performance rewards it promises.

    And there will be a version of Qt for Wayland also.

  2. Sean Baggaley 1


    ... a worthy rival to Apple.

    It's a shame both companies want to foist variants of a 1970s OS on the world, but you can't have everything.

    1. Giles Jones Gold badge


      What's 70s about OSX and Linux? Linux is the OS core (kernel) and was started in the 1990s. The Mach kernel used by OSX was started in the mid-80s.

      So the filesystem layout underneath is like Unix - so what? It's a decent file structure: tidy, disciplined, and not a bloody mess like Windows.

      Windows mixes exes, DLLs and folders with logs all over the place. Windows is really messy.

      OSX and Linux use a lot of GNU software for the shell and so on - are these 70s too? Do you really think they have remained static ever since they were created?

      Unix may seem old and dusty because it hasn't changed dramatically. You know why that is? Because they did the job correctly in the first place!

      There were no stupid 8.3 filenames that needed to be bodged into longer filenames - Unix always supported long filenames. There was no need to bolt on proper multitasking and multiuser support, as Unix has always had them.

      Windows = "32 bit extensions and a graphical shell for a 16 bit patch to an 8 bit operating system originally coded for a 4 bit microprocessor, written by a 2 bit company, that can't stand 1 bit of competition."

    2. tim-e

      If you don't like *nix...

      There's always Haiku.

      1. John I'm only dancing
        Thumb Up

        @ Giles..

        I love the description of Windows, very true, especially the last two.

        1. This post has been deleted by a moderator

          1. Anonymous Coward

            Shut up you muppet

            It's obvious that you have absolutely no idea what you are talking about.

            A: B: C: D: E: F: AA: XX: What a complete f**k up that is!!

            My ass - your so-called modern file structure actually dates back at least to DOS, which, believe it or not, according to Wikipedia: DOS, short for "Disk Operating System",[1] is an acronym for several closely related operating systems that dominated the IBM PC compatible market between 1981 and 1995.

            So it doesn't matter how up to date you think your super modern OS really is, it has its roots way back then, like everything else.

            Oh yes, except it's still shite.

          2. Anonymous Coward

            @AC 12:01 GMT

            Mate, stick to posting comments on X-Factor type forums; you are clearly completely clueless about anything OS-related. Did you just read an MSDN magazine article and think you knew everything there is to know about OSes?

            1. This post has been deleted by a moderator

              1. W. Keith Wingate

                It's all in what you're used to

                I was reflecting on how much more comfortable I am with ~/Documents than C:\Documents And Settings\... or wherever it's been moved, and I realized that if all you've ever known is File -> Save As... then it will be pretty obvious where some application has saved the thing that I always seem to have to Search... for on WinDOS boxes. If OTOH you simply think about where YOU want to put your stuff and put it there, without having to click through half a dozen "parent folder" icons and such, you won't have that problem. Directory names with spaces annoy me. Multi-homed file systems (C:, D:...) annoy me, even if wrapped with a virtual "folder" like "[My] Computer", in turn wrapped in "Network". Non-obvious icons replacing perfectly readable text menus annoy me. But that's because I'm not used to them.

                Architecture though is another matter. That the *nix file system has not changed much in the way it appears to the user in the last 40 years or so speaks rather highly of it, I think. The fact that it was (very badly) copied for the original MS/PC-DOS as well as (more properly) for modern Mac OS's speaks even more highly of it.

                User preferences are very subjective (maybe I HATE the Office 2007 ribbon; maybe that's what finally got you to upgrade that Office '97 you've been running at home) ; architecture somewhat less so.

          3. ricegf2

            I assume you're new to Linux?


            Click Places -> Home, and tell me what you see on the left column of the file browser. On my Ubuntu 10.04 desktop I see Home, Desktop, File System, Network, Mannheim Steamroller (that's what's in my CD drive), Trash, Documents, Music, Pictures, Video and Downloads.

            Not cryptic a'tall.

            So let's talk architecture (I happen to be a senior system architect, and work daily with both environments). Windows stores its application settings for users in many places - a central binary repository called the "registry", configuration files in privileged areas of the Windows folder, and under Documents and Settings on a per-user basis. If you want to ding Linux on "friendliness" for the internals, try teaching a newbie to use the Registry Editor and *then* we'll talk. :-D Linux always stores application settings for users under the user's personal home directory, and always in a consistently formatted text file. This is a "legacy" (in a good sense) feature, because Linux has always been multi-user, and has never been hobbled with a binary registry.
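
            The per-user convention described above is easy to see from a shell. Below is a minimal sketch; the tool and file names are hypothetical, standing in for any app that follows the dotfile convention:

            ```shell
            # Per-user settings live in a plain-text dotfile under the home
            # directory, editable with any editor (tool and file names here
            # are hypothetical).
            HOME_DIR=$(mktemp -d)                       # stand-in for a user's home
            printf 'theme=dark\nautosave=true\n' > "$HOME_DIR/.exampletoolrc"
            grep '^theme=' "$HOME_DIR/.exampletoolrc"   # -> theme=dark
            ```

            No registry, no privileged writes: backing up or resetting a user's settings is just copying or deleting files in their home directory.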

            Windows also maintains the "legacy" (in a bad sense) feature of labeled drives. To access a network drive, I must "map" the drive to a letter (say "N:") and call it that going forward. This makes sharing network-oriented scripts problematic, among other problems. I can also use a "UNC" path (friendly name, that), but that doesn't work in some use cases - hence the necessity of "reserved" drive letters in corporate settings. Linux simply places all file systems not on the primary drive in a standard folder - on Ubuntu it's the /media folder - and the /media folder is always listed with user-friendly names (as I show above) under Places in the file browser, in the file dialogs, and in the main Places menu on the desktop.
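
            The single-rooted layout is checkable with standard tools - a small sketch using POSIX df, with no drive letters anywhere:

            ```shell
            # Every filesystem is reachable as a path under the single root "/";
            # df -P reports which mount point backs a given path.
            df -P / | awk 'NR==2 {print $6}'   # prints the mount point of /: "/"
            ```

            The same command works unchanged whether the path lives on the primary disk, a second drive, or a network mount - which is exactly why scripts don't need "reserved" drive letters.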

            Need I go on? The Windows OS architecture has some significant baggage left from its DOS days, as well as some poor (IMHO) design choices influenced by Mr. Cutler's legacy Vax OS, that make it less flexible, scalable, and friendly than Linux - which is one reason why (for example) Microsoft had to write a new OS for WinP7 phones, but Linux works just fine from Android phones on the low end to supercomputers on the high end. In fact, Linux controls 27% of the smartphone market (vs. WinMo / WinP7's 3%) and 92% of the supercomputer market (vs. Windows' 1%), and is first or second in every market in between except desktops.

            It is this remarkable success in every other computing field that best defends Linux against your claim that it has some technical usability problem hurting its desktop acceptance. Actually, Linux' struggles on the desktop have more to do with vendor support for games and certain peripherals caused by Microsoft's longstanding DOS / Windows desktop monopoly, and lack of commonly available pre-installs due to Microsoft's well-documented exclusionary business practices - not technical shortcomings.

            Please understand that I don't say this to trash Windows - it's certainly usable enough - but to defend Linux against your misinformed attack.

            Now if you could explain to me why Mr. Cutler didn't bring intrinsic file versioning to Windows from the Vax when he had the chance - THAT would be an interesting discussion! :-D

            1. This post has been deleted by a moderator

              1. ricegf2

                Did you read what I actually wrote?

                @registerfail: "'re wrong to suggest it stores all application settings in the home directory as, usually configs are in /etc, what it does store in the users home director is user specific settings..."

                I actually wrote, "Linux always stores application settings ***for users*** under the user's personal home directory,.." (emphasis added), which would be "user specific settings". You're agreeing with me here. Please re-read.

                @registerfail: "whilst you're also right that Windows maintains a lot of legacy concepts, very little original code is still there".

                Which is exactly what I said - "Windows also maintains the "legacy" (in a bad sense) ***feature*** of labeled drives". I said nothing of "original code"; we were discussing the user interface. Again, please re-read.

                "whilst you're right that Linux holds a far larger share, it's also still got a far smaller share than Symbian, which, ironically, it also a very well designed microkernel based OS"

                OK. The discussion was Linux vs. Windows, not Symbian, but let's look at Symbian. Linux' share grew almost 600% last year, whilst Symbian's dropped by a third or so, and Nokia is working with Intel to migrate from Symbian to Linux starting with their most expensive (and highest profit margin) smartphones. This doesn't actually support your "Windows and Symbian have better architecture than Linux" argument either way (as market share != architecture quality), but it certainly doesn't instill much confidence in microkernels having a marketing advantage! (Disclaimer: I'm a huge MeeGo fan thus far, but I like Android as well. ;-)

                " there are too many times the user has to end up at the CLI"

                OK, please name them. Whilst using the CLI is sometimes more convenient (and Microsoft underscores its importance by emphasizing their PowerShell product for power users), I know many Linux users who have no idea what a CLI is. To be specific, I assert that there is no task a *normal* user would be expected to perform on Ubuntu that would require a CLI. Please name one that I've missed.

                "people are jumping on the attack against any Linux criticism without seeming to actually have any valid points I thought I'd throw in my 2 pence- denying there are problems with something"

                Well, first, an attack against an attack is usually called "a defense". :-D

                Second, I seem to have had some valid points ***since you repeated several of them almost word for word*** in your post!

                Third, I didn't "jump on the attack against any Linux criticism", but rather against *invalid* criticisms. One of yours, for example - "The driver issue" - is one that *I* stated in my post - "vendor support for games and ***certain peripherals***". So I find it roundly unfair of you to claim that I'm "denying there are problems with something" when I listed two in my *defense* of Linux! Geesh!

                All in all, I think you need to re-read what I actually wrote and respond to that. Your post seems to have been written after a cursory skimming of my words, and doesn't address (or in many cases simply repeats) the points I actually raised. But I do appreciate the response, even if you do imply that I lied in stating that I am a senior systems analyst who works with Linux and Windows on a daily basis. ;-)

                1. Peter Gathercole Silver badge

                  @ricegf2 - Posts after my own heart

                  I could not agree more with what you are saying.

                  Some people in this comment trail have been saying that the names of the UNIX/Linux filesystems are cryptic. This is not the case, as they all have meaning, although like all things UNIX, the meaning may have been lost a little in the abbreviation. I will attempt to shed some light on this, although this will look more like an essay than a comment. Please bear with me.

                  Starting with Bell Labs. UNIX distributions up to Version/Edition 7 circa 1976-1982.

                  / or root was the top level filesystem, and originally had enough of the system to allow it to boot (so /bin contained all of the binaries (bin - binaries, geddit) necessary to get the system up to the point where it could mount the other filesystems). It included the directories /lib and /etc, which I will mention in more detail later.

                  /usr was a filesystem that originally contained all of the files users would use in addition to what was in /, including /usr/bin, which contained binaries for programs used by users. On very early UNIX systems, user home directories were normally present under this directory.

                  /tmp is exactly what it says it is, a world writeable space for temporary files that will be cleaned up (normally) automatically, often at system boot.

                  /users was a filesystem used by convention adopted by some Universities as an alternative for holding the home directories of the users.

                  /lib and /usr/lib were directories used to store library files. The convention was very much like /bin and /usr/bin, with /lib used for libraries required to boot the system, and /usr/lib for other libraries. Remember that at this time, all binaries were compiled statically, as there were no dynamic libraries or run-time linking/binding.

                  /etc quite literally stands for ETCetera, a location for other files, often configuration and system wide files (like passwd, wtmp, gettydefs etc. (geddit?)) that did not merit their own filesystem. With all configuration files, there was normally a hierarchy, where a program would use environment variables as the first location for options, then files stored in the users home directory, and then the system-wide config files stored in the relevant etc directory (more on this below).
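
                  That lookup hierarchy (environment variable first, then the user's dotfile, then the system-wide file) can be sketched as a small shell function; the tool name "foo" and its file names are hypothetical:

                  ```shell
                  # Hypothetical "foo" tool: resolve a setting by checking, in
                  # order, an environment variable, the per-user dotfile, then
                  # the system-wide config file, falling back to a default.
                  lookup_greeting() {
                      sysdir="$1" homedir="$2"
                      if [ -n "$FOO_GREETING" ]; then        # 1. environment variable
                          echo "$FOO_GREETING"
                      elif [ -f "$homedir/.foorc" ]; then    # 2. per-user config
                          cat "$homedir/.foorc"
                      elif [ -f "$sysdir/foorc" ]; then      # 3. system-wide config
                          cat "$sysdir/foorc"
                      else
                          echo "default"
                      fi
                  }
                  ```

                  With none, one, or all three sources present, the function returns the highest-priority value it finds - the same precedence many UNIX programs implement internally.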

                  /dev was a directory that contained the device entries (UNIX always treats devices as files, and this is where all devices were referenced). Most files in this directory are what are referred to as "special files", and are used to access devices through their device driver code (indexed with Major and Minor device numbers) using an extended form of the normal UNIX filesystem semantics.
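
                  The "devices are files" point is easy to demonstrate on any Linux box:

                  ```shell
                  # /dev/null is a character-special file; in ls -l output the
                  # leading "c" marks it as such, and the major,minor pair
                  # appears where a plain file would show its size.
                  ls -l /dev/null
                  ```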

                  /mnt was a generic mount point used as a convenient point to mount other filesystems. It was normally empty on early UNIXes.

                  When BSD (the add-on tape to Version 6/7, and also the complete Interdata32 and VAX releases) came along (around 1978-1980), the following filesystems were normally added.

                  /u01, /u02 ..... Directories to allow the home directories of users to be spread across several filesystems and ultimately disk spindles (this was by convention).

                  /usr/tmp A directory sometimes overmounted with a filesystem used as an alternative to /tmp for many user related applications (e.g. vi).

                  I think that /sbin and /usr/sbin (System BINaries, I believe) also appeared around this time, as locations for utilities that were only needed by system administrators, and thus could be excluded by the path and directory permissions from non-privileged users.

                  Things remained like this until UNIX became more networked with the appearance of network capable UNIXes, particularly SunOS. When diskless workstations arrived around 1983, the filesystems got shaken up a bit.

                  / and /usr became read-only (at least on diskless systems)

                  /var was introduced to hold VARiable data (a meaningful name again), and had much of the configuration data from the normal locations in /etc moved into places like /var/etc, with symlinks (introduced in BSD with the BSD Fast Filesystem) allowing the files to be referenced from their normal location. /usr/tmp became a link to /var/tmp.
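
                  The symlink arrangement described here can be sketched in a throwaway directory (the paths are illustrative, not the real /etc and /var):

                  ```shell
                  # The file physically lives under a var-like area, but stays
                  # reachable from its traditional etc-style name via a symlink.
                  top=$(mktemp -d)
                  mkdir -p "$top/var/etc" "$top/etc"
                  echo "hostname=demo" > "$top/var/etc/sysconfig"
                  ln -s "$top/var/etc/sysconfig" "$top/etc/sysconfig"
                  cat "$top/etc/sysconfig"   # -> hostname=demo
                  ```

                  Programs keep opening the traditional path; only the symlink target moves, which is what let / stay read-only.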

                  /home was introduced and caught on in most UNIX implementations as the place where all home directories would be located.

                  /export was used as a location to hold system-specific filesystems to be mounted over the network (read on to find out what this means)

                  /usr/share was also introduced to hold read-only non-executable files, mainly documentation.

                  About this time the following were also adopted by convention.

                  /opt started appearing as a location for OPTional software, often acquired as source and compiled locally.

                  /usr/local and /local often became the location of locally written software.

                  In most cases for /var, /opt, /usr/local, it was normal to duplicate the bin, etc and lib convention of locating binaries and system-wide (as opposed to user-local) configuration files and libraries, so for example a tool in /opt/bin normally had its system-wide configuration files stored in /opt/etc, and any specific library files in /opt/lib. Consistent and simple.
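
                  That bin/etc/lib convention can be sketched with a throwaway prefix standing in for /opt:

                  ```shell
                  # Each optional package duplicates the bin/etc/lib split under
                  # its own prefix: binaries, system-wide config, and libraries
                  # side by side (prefix here is a temp dir, not the real /opt).
                  PREFIX=$(mktemp -d)
                  mkdir -p "$PREFIX/bin" "$PREFIX/etc" "$PREFIX/lib"
                  ls "$PREFIX"
                  ```

                  Anyone who knows where /bin, /etc and /lib live can navigate any package laid out this way - that is the whole point of the convention.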

                  The benefit of re-organising the filesystems into read-only and read-write filesystems was so that a diskless environment could be set up with most of the system related filesystems (/ and /usr in particular) stored on a server, and mounted (normally with NFS) by any diskless client of the right architecture in the environment. Different architecture systems could be served in a heterogeneous environment by having / and /usr for each architecture served from different directories on the server, which could be a different architecture from the clients (like Sun3 and Sparc servers).

                  /var also became mounted across the network, but each diskless system had its own copy, stored in /export/var on the server, so that things like system names, network settings and the like could be kept distinct for each system.

                  /usr/share was naturally shared read-only across all of the systems, even of different architectures, as it did not contain binaries.

                  This meant that you effectively had a single system image for all similar systems in the environment. This enabled system administrators to roll out updates by building new copies of / and /usr on the server, and tweaking the mount points to upgrade the entire environment at the next reboot. Adding a system meant setting up the var directory for the system below /exports, adding the bootp information, connecting it to the network, and powering it up.

                  And by holding the users home directories in mountable directories, it enabled a user's home directory to be available on all systems in the environment. Sun really meant it when they said "The Network IS the Computer". Every system effectively became the same as far as the users were concerned, so there was no such thing as a Personal Computer or Workstation. They could log on on any system, and as an extension, could remotely log on across the network to special servers that may have had expensive licensed software or particular devices or resources (like faster processors or more memory), using X11 to bring the session back to the workstation they were using, and have their environment present on those systems as well.

                  As you can see, this was how it was pretty much before Windows even existed.

                  Linux adopted much of this, but the Linux newcomers, often having grown up with Windows before switching to Linux, have seriously muddied the water. Unfortunately, many of them have never learned the UNIX way of doing things, so have never understood it, and have seriously broken some of the concepts. They don't understand why / and /usr were read-only, so they ended up putting configuration files in /etc rather than in /var with symlinks. They have introduced things like .kde, .kde2, .gnome, and .gnome2 as additional places for config data. And putting the RPM and deb databases in /usr/lib was just plain stupid, as it makes it no longer possible to make /usr read-only. They have mostly made default installations use a single huge root filesystem encompassing /usr, /var and /tmp (mostly because of the limited partitioning available on DOS/Windows partitioned disks). They have even stuck some system-wide configuration files away from the accepted UNIX locations.

                  So I'm afraid that, from a UNIX user's perspective, although many of the Linux people attempt to do the 'right thing', they are working from what was a working model, broken by their Linux peers. Still, it's better than Windows, and it is still fixable with the right level of knowledge.

                  I could go on. I've not mentioned /proc, /devfs, /usbfs or any of the udev or dbus special filesystems, nor how /mnt has changed and /media appeared, nor have I considered multiple users, user and group permissions, NIS, and mount permissions on remote filesystems, but it's time to call it a day. I hope this enlightened some of you.

                  I have written this from memory, based on personal experience of Bell Labs. UNIX V6/7 with BSD 2.3 and 2.6 add-on tapes, BSD 4.1, 4.2 and 4.3, AT&T SVR2, 3 and 4, SunOS 2, 3, 4 and 5 (Solaris), Digital/Tru64 UNIX, IBM AIX and various Linuxes (mainly RedHat and Ubuntu), along with many other UNIX and Linux variants, mostly forgotten. I may have mixed some things up, and different commercial vendors introduced some things in different ways and at different times, but I believe it is broadly correct, IMHO.

          4. Anonymous Coward


            Yes, really.

            A modern OS has no place shipping a filesystem that doesn't allow massively modern features like, ahem, linking.

            The layout issue - for a commonly used "place" to have a name like "Documents and Settings" is a travesty, and to have apps storing random crap in the registry (a virtual FS with an atrocious layout), ini files, system dirs and \\%USERNAME%\\Local Application Data\\{{GUID}} is somehow less cryptic?

    3. Ted Treen

      It's all relative

      The newest Ferrari uses an internal combustion engine.

      A variant of 1890s technology.

      The F-18 uses a gas turbine engine.

      A variant of 1930s technology.

      Your point being?

  3. James Le Cuirot

    "But you will be part of the past."

    I wouldn't say that. No doubt GNOME will continue to be developed by the community and a "Gubuntu" flavour will spring up, much like Kubuntu for KDE. It's also worth mentioning to those who aren't familiar with Wayland that all the major desktops will probably support it before long. I haven't tried it myself but it looks interesting.

  4. Rebajas

    The past...?

    I think it more likely a branch called Gnubuntu/Gubuntu/Gnobuntu will sprout, as has been the case for KDE - not so much a relic of the past, just living in an alternate universe.

  5. Hugh 5

    Yes BUT....

    Ubuntu is good. In fact Ubuntu is very good, BUT it is still not right.

    I spent the best part of a week recently looking to deploy a Linux server with RAID. After SME-Server failed to even get past the install, and ClearOS didn't offer the right features, I turned to Ubuntu Server...

    What a HUGE disappointment! I know what I am doing but it turned out to be a CLI mess.

    So next I went back to Ubuntu desktop, with which I am very familiar. However, because I wanted RAID I was taken into the world of the "Alternate" install. As far as I could tell this just meant going down a route where lots of things didn't work (and I found myself swotting up on chmod and chown commands in order to tidy things up, which is never a good sign, let's be honest).

    After a week I had had enough and installed Windows 7 Professional. Everything worked. Mirroring (RAID1) works. Shares work. Remote desktop works. Network configuration works...

    Frankly this is embarrassing. I am a BIG fan of Ubuntu, but until desktop and server match the elegance of Windows (let's cut them some slack on the Mac front for now) not enough folk are going to take the time and effort to stick with it, however noble the intentions of Mr Shuttleworth.

    1. Anonymous Coward

      If RAID and a server were your goal...

      ... then I heartily recommend you consider a business oriented Linux such as CentOS or maybe Debian, a distro where stability is the goal rather than shiny Jobsian bells. If you stick to the packaged stuff from yum or apt then it's usually solid as a rock.

      CentOS 6, based on RHEL 6, will be around in the next few months, and for a while at least it won't be rocking it like it's 4 years ago...

    2. Anonymous Coward
      Anonymous Coward

      No but

      You're a big fan of Ubuntu but think PC World offers lots of choice!?

      I have a bunch of servers with raid (Centos, Fedora) and various netbooks and laptops running 'Buntus. I don't have hardware issues. Okay, so I have a tendency to spill coffee on keyboards.

      Installing windows (the OS I used for many years) on ANYTHING has never been pleasant, requiring as it does endless restarts and hunts for drivers. It can't even figure out how to use a USB modem without some crappy 3rd party software, hobbling the damn thing for other OSes.

    3. Archivist

      Horses for courses

      It's true that Ubuntu is not the easiest distribution to get hardware or software RAID working properly, but don't give up. It is possible, and you do end up with a robust system plus a very friendly user interface. I'd be willing to bet there are hardware configurations that would make your W7P fall over yet work with another system. A single example is not sufficient to judge any system.

      If you just want a RAID box I'd suggest crossing to "the other side" and use SUSE or RH.

      1. matt 115

        Hardware Raid

        Never been lucky enough to have a true hardware RAID controller, but I'd have thought it would be totally transparent to, and independent of, the operating system.

  6. Anonymous Coward

    It is your father's linux.

    Linux is just the kernel. What Ubuntu is doing here is very interesting. They aren't changing Linux, but they are changing the GUI. Lightweight, powerful - it's finally doing something a lot of people have been waiting for: a sane graphics layer.

    X was a horrible project based on a principle no one cares for anymore.

    x11-xorg was an attempt at giving some direction toward sanity as a fork, but attempting to remove the bloat from the code was like trying to gut a whale in the middle of a china shop by having blind philosophers direct a bunch of raving monkeys with hand signals.

    I'm personally hoping Wayland can finally bring proper 3D to Linux. Maybe then we can get some games to work properly.

    1. P. Lee

      re: It is your father's linux

      > X was a horrible project based on a principle no one cares for anymore.

      Not true. Perhaps no-one is brave enough to do much development on it, but I love being able to ssh from my Mac to my headless Linux box and still run any X app. Sure, I could use the CLI or a curses interface, but for some things a proper GUI is best.

    2. Charles Manning

      Ubuntu vs X

      X is getting worse and worse under Ubuntu. Is that what motivates Ubuntu to shift off X or is Ubuntu purposefully breaking X to make it look shite and make Wayland look better? Dunno.

      Remote X (XDMCP and friends) last worked properly in 8.04 or so. I have not looked at Wayland much and hope it has remote capabilities.

      Eye candy for eye candy's sake is pointless. Apple do not make a slick GUI just to look cool. There's a whole lot of usability research that goes into the Mac.

    3. Peter Gathercole Silver badge

      Re. "X was a horrible project"

      I agree that X was designed for a different environment than personal computers running a GUI on the same system, but to brand it a "horrible project" just goes too far.

      Because of its origins (in academia), it would be fair to say that X10 and X11, particularly the client side, were among the first "Open Source" projects (along with the original UNIX contributed software products - many of which pre-date GNU). As such, it helped define the model that enabled other open source initiatives to get off the ground. But it suffered teething problems like all new methods, particularly when it was orphaned after the original academic projects fell by the wayside.

      What happened with XFree86 was messy, but ultimately necessary to wrest control back from a number of diverging proprietary implementations by the major UNIX vendors (X11 never did form part of the UNIX standards). I don't fully understand your comment about reducing bloat, unless you mean modularising the graphic display support so you only have to load what you need, rather than building specific binaries for each display type - but that is just a matter of the number of display types that needed to be supported. X11R5 and X11R6 were actually quite lightweight by the standards of the time.

      But I have said this before, and I will say it again: if you don't understand what X11 is actually capable of, then you run the risk of throwing the baby out with the bath water. It would be perfectly possible to keep X11 as the underlying display method and replace GNOME as the window manager (much as Compiz does, and does quite well). This is one of its major strengths, and it would keep us die-hard X11 proponents happy. If you use one of the local communication methods (particularly shared memory) you need not have a huge communication or memory overhead, especially if you expose something like OpenGL at the client/server interface. The overhead is higher than having the display managed as a monolithic single entity, but I don't believe that any of the major platforms do that. There is always an abstraction between the display and the various components.

      Having tried Unity in 10.10 netbook on my EeePC 701 - surely one of the targeted systems (small display, slow processor) - for several weeks, I eventually decided that it was COMPLETELY UNUSABLE at this level of system. The rotating icons on the side of the screen were too slow, and the one you needed was never visible, leading to incredible frustration as you scrolled through the available options, trying to decode what the icons actually mean while they fly up and down the screen. It appeared very difficult to customise, and I begrudged the screen space it occupied. My frustration knew virtually no bounds, and on several occasions it's lucky that the 701 did not fly across the room (note to self - check out anger management courses).

      I reverted to GNOME (by re-installing the normal desktop distro), and my 701 is now usable again, and indeed quick enough to be used for most purposes including watching video.

      I know I am set in my ways, but I can do almost everything soooo much faster in the old way. I fail to see that adding gloss at the cost of reduced usability and speed helps anybody apart from the people easily dazzled by bling. To put this in context, I also find the interface on my PalmOS Treo much easier to live with than Android on my most recent phone.

      I'll crawl back under my rock now, but if Unity becomes the main interface for Ubuntu, I will be switching to the inevitable Gnubuntu distribution, or even away from Ubuntu completely.

      1. Craig Chambers

        Missus hated 10.10 netbook remix

        Subjective, and minus the technical discussion on the merits of X and Unity, but I had to switch the Maverick install on the missus' netbook back to the default GNOME desktop because she hated it so much. She actually quite liked the UI in the Lucid (10.04) netbook remix, but the 10.10 one was a step too far.

        I presume that she is the target demographic for UNR, as she knows nothing of the underlying system other than its name, but if her reaction to the upgrade is anything to go by then Unity is not ready for the likes of her.


      The ship already sailed for you X haters.

      > X was a horrible project based on a principle no one cares for anymore.

      Nonsense. While X has been subjected to wave after wave of FUD over the years, the market leader has been moving closer and closer to it with its own product. There are now special remote access hooks in Windows that allow its own graphical terminal product to perform respectably well. While all of the X critics have been trying to shout it down, the market leader has pretty much adopted it in principle.

      So I can have reasonably good performance to a remote desktop in Win7. Trying to do the same with a Mac desktop and VNC is just painful. Meanwhile the ninny talking heads of our own camp want to toss the baby out with the bath water.

      The "wayland approach" does not benefit the Mac so much. It doesn't even really help for games. The only thing that helps for games is having the lion's share of the market and being viewed as worthy of effort. WinDOS never had problems with games no matter how ugly the implementation details were.

      X was designed when 100MHz was a crazy fast CPU.

      The value of dumping it now, when phones have a 1GHz CPU, is somewhat disputable.

  7. Anonymous Coward

    Mark Shuttleworth and Steve Jobs can not be compared

    What Mark Shuttleworth is doing with Ubuntu is totally within the philosophy of FOSS and Linux. He has a vision and he works to make it reality. You are entirely free to take his work and use it as it is, or you (yes, you!) can come up with another desktop paradigm for Ubuntu that will make Unity + Wayland look like a ridiculous old dinosaur. As long as Ubuntu remains free software, those who want to will have the option of ripping out whatever they don't like and replacing it with whatever they please, so why make a fuss out of it? It has been done before, and the FOSS world kept going on undisturbed.

    Steve Jobs is not like that. He also has a vision, and you are free to accept or reject it, but the similarity stops abruptly there. If you accept it, you have to embrace it as it is; there's no way you can change it, and no way you can come up with something that might possibly be better.

    As for the dream of a unified UI across devices with a wide range of screen sizes, I wish him good luck, but I'm still convinced a 5in screen can't be treated like a 25in one. Why the hell would I want the crammed UI of my phone displayed on my 25in desktop screen?

  8. AdamWill


    "In fact, Unity appears compelling enough that even Ubuntu competitor Fedora is hoping to support it in future releases."

    Sigh. My blog post that you linked to is obviously not hedged around with enough disclaimers. I should add a big flashing sign.

    It makes no sense to say that 'Fedora is hoping to support it', because Fedora doesn't work that way. Fedora doesn't have a unitary brain. *I* am hoping to package (not, exactly, support) it, because I find it interesting. That's it. It's not because it 'appears compelling enough' - I've never actually laid eyes on it yet. It's more that I want to *find out* if it's compelling, and I figured other Fedora users might like to as well, so it seemed to make sense to find out by packaging it.

    Never let inconvenient truth get in the way of a good narrative, though!

    GNOME Shell is as much of a radical departure as Unity is. Are either of them actually any good or what anyone wants? We don't really know yet. There's a lot of uncertainty around both. But there's nothing *more* revolutionary about Unity than about Shell.

    And on Wayland... that's really not Mark's vision, is it? It's mostly Kristian Høgsberg's vision, being as he's the guy who wrote it, and all. It's also more generally the vision of just about everyone significantly involved in Linux graphics development; ask any hacker and they'd probably tell you something like Wayland was pretty inevitable, and after Wayland started, that it'd be pretty likely many distros would use it in future. All Mark did was, well, announce this fact.

  9. Anonymous Coward
    Thumb Down

    Part of the past...

    >try GNOME 3.0, rest assured it will be possible. But you will be part of the past.

    Really, will I? Looking at the man-hours and talent pouring into GNOME compared to Ubuntu's minor contributions and half-arsed developments, I doubt it.

    >Ubuntu Linux spent the last few months of 2010 dropping bombshells on the Linux world

    Get real - causing the occasional sigh and shaken head, maybe.

  10. spegru
    Thumb Down


    I have this nightmare that in the future, desktop OSs such as Windows, OSX, CDE on Unix, and KDE and GNOME on Linux will no longer exist, and that these will combine into a proprietary appliance-like environment that the 'powers that be' will use to control what can and cannot be installed and, especially, connected to the internet.

    We have already seen this with iOS, Android and WinPho7, and it will get worse with tablets.

    One day we will look back and wonder where our freedom went.

    1. Anonymous Coward
      Thumb Up


      That went out the door when Amazon decided it was too naughty.

  11. Change can be good
    Thumb Up

    Eagerly waiting for the post-Mac makeover

    Sometimes the press does give Steve Jobs a tough time. But he comes up with very desirable products for Apple.

    I don't find Mark temperamental or arrogant; he is both visionary and dynamic. I'm waiting for the exciting changes he is bringing to Ubuntu to come to fruition. I feel he can deliver some very desirable gadgets. For instance, ARM tablets that run Ubuntu and are sold by Canonical, so that they are not held hostage by OEMs pressured by any monopoli$tic OS company. A uPad would be great. iPads are not expensive, but they are not inexpensive either.

    Hoping & expecting Ubuntu & Mark to shake up the competition.

    1. Anonymous Coward


      Apple has thousands upon thousands of employees. Steve Jobs doesn't bring you any of the above.

      1. Anonymous Coward
        Anonymous Coward

        re: Steve Jobs doesn't bring you any of the above.

        He's steering the company; he decides what direction R&D takes and which products to release... who do you think is responsible for Apple being big in phones, portable multimedia devices and tablets while Microsoft is still pretty much stuck on the desktop?

  12. Anonymous Coward
    Anonymous Coward


    Been using Ubuntu and Kubuntu for years now. The only time I ever even think of Shuttleworth's existence is when reading articles like this one. I think he has a long way to go before reaching any kind of Jobsian status.

    As for Unity, well, Gnome 2 runs just fine for me on my Acer Netbook, but Vive La Différence, Vive La Linux! I'm glad that Ubuntu is progressive rather than conservative. There's a place for both philosophies in IT, just as there is in politics.

  13. The Cube
    Thumb Up

    Go Mark

    Ubuntu is the only desktop Linux I have ever bothered using for more than an hour. Shuttleworth is on the right track for mass adoption of a Linux alternative: it needs to install automatically, be easy to use and look good. So many Linuxes are ugly - really ugly, Susan Boyle ugly, so ugly they make Windows XP look modern. It would be really nice to have a Linux that looked decent as well as working properly.

    Typed smugly on a Macbook by a patronising twat in a roll neck.

    1. elderlybloke

      Go Mark

      Dear "The Cube",

      Please post a photo of yourself so that we may make derogatory, stupid and ignorant comments about your appearance.

      What have you achieved lately? Susan Boyle has gone from obscurity to being worth millions in a couple of years, because of her talent.

      What talent do you have?

  14. Anonymous Coward
    Anonymous Coward

    Would be nice

    The UI in OS X is pretty great. Would be nice if somebody did a clone. Not sure if X and its window managers can get there though.

    The last time I used Ubuntu, its menu/toolbar situation was rubbish. It had more or less the Windows 95 start bar at the bottom, then a menu bar at the top that sort of looked like a Mac's menu bar but wasn't specific to the active application (what good is it then?), and then each window had its own optional menu bars. Talk about schizophrenic. Confusing and a complete waste of screen real estate. It's good that somebody with a vision (any vision) is trying to sort this out.

    1. Goat Jam


      Some annoying Mac acolyte wrote;

      "a menu bar at the top that sort of looked like a Mac's menu bar but wasn't specific to the active application (what good is it then?)"

      Detached, application specific menu bar. Ugh. That has got to be the single most user unfriendly, unintuitive and godforsaken abomination to ever be designed into a UI and I have no idea why so many kool-aid gulping mac fanboys bang on about it like it is the best thing since sliced bread.

      The only thing I can think of is that Mac users are the sort of knob ends who run every window maximised and get all confused if they have more than one app open at a time. That might have something to do with the lack of multitasking in Macs before OSX came along, but I suppose it would make some sort of sense, because the top menu bar is always the menu bar for the active window, which doesn't really matter if ALL YOU CAN SEE ON THE SCREEN IS THE ACTIVE WINDOW!

      But consider for a moment that you are not such a retard and you actually use a computer with multiple windows open (like I do), and you pretty soon realise that when you are working in a window at the bottom RHS of your screen, THE LAST THING YOU WANT TO DO IS MOVE YOUR VISUAL AND MOUSE FOCUS RIGHT ACROSS TO THE OTHER SIDE OF THE SCREEN every time you want to access the fricking menu.

      Even worse is if your primary focus is Window A but you want to access the File menu of non-focussed Window B. What's that you say? I have to click on Window B and THEN MOVE TO A TOTALLY DIFFERENT PART OF THE SCREEN? WHY?

      God forbid if you forget to change focus to Window B and you open the File menu for Window A without noticing!

      So I ask you mac fanboys. Having a menu bar that is specific to the active application but located in a physically different area of the screen, why on Earth is that good? Please explain because I certainly don't get it.

      1. jubtastic1

        Happy to oblige

        T = a + b log2(1 + D/W) - Fitts's law
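        For anyone who doesn't fancy decoding that: it's Fitts's law, which predicts how long it takes to hit a target from its distance D and its width W along the direction of travel. A quick sketch of why a screen-edge menu scores well under it - the constants a and b below are invented purely for illustration, as real values vary by device and user:

```python
import math

def fitts_time(a, b, distance, width):
    """Fitts's law: predicted movement time T = a + b * log2(1 + D/W)."""
    return a + b * math.log2(1 + distance / width)

# Illustrative constants only; real values come from fitting measurements.
a, b = 0.1, 0.2  # seconds, seconds per bit of difficulty

# A 20px-tall menu item 300px away inside a window:
in_window = fitts_time(a, b, 300, 20)

# The same 300px throw to a screen-edge menu: the edge stops the pointer,
# so the effective target depth is large; model it as 200px of overshoot room.
edge = fitts_time(a, b, 300, 200)

print(f"in-window: {in_window:.2f}s, screen edge: {edge:.2f}s")
```

        The point being that the edge turns a careful deceleration into a flick, which is exactly the claim the sceptics in this thread dispute for large monitors.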

        1. Luke McCarthy
          Jobs Horns

          Fitts's law bollocks

          Just because it's easier to click things at the edge of the screen isn't a reason for putting the menu bar there. Why not put everything on the edge of the screen and a big black square in the middle, because obviously the stupid user is too cack-handed with the mouse to click on anything?

          1. JEDIDIAH

            Fitts's law bollocks

            Good point. I suppose this is why I like to put my "dock" at the top of the screen in Linux.

            One nice thing about having a proper top panel is that I can actually make good use of it for notifications and monitors and such. These sorts of applets are woefully lacking in MacOS. There just isn't enough room in a menu bar tailored for text to accommodate nice pretty icons that can display useful information.

            Plus, the "real estate loss" is effectively just the height of the single menu which is pretty small on a modern monitor.

        2. JEDIDIAH

          Fitts's law is bogus

          Fitts's law might have made sense on the original puny Mac.

          However, Fitts's law is bogus nonsense on a modern monitor.

          Fitts's law is clearly WRONG on my 28 inch monitor.

          This is just something that Apple users cling to like some article of faith, like American fundies and creationism.

      2. snafu

        Don't paint me dumb

        The advantage would be that, even being a different area of the screen, it is always the same area, favoring "muscle memory" and tolerating far less precision of movement (just throw the pointer in the general direction: it will stop by itself upon meeting the top of the screen).

        Not that your arguments are invalid (with current big screens there's a bigger visual disconnect between the top menu and an app's windows and palettes), but there are some advantages to the approach and several cons to Windows', too. The issue has been discussed in UI circles often enough.

        And please stop the name-calling. Using Macs for a long while (since the SE/30 era) makes one especially sensitive to UI issues. We ARE aware of things like that.

      3. HMB

        Not getting it...

        There's some really aggressive ranting in there!

        I'm sorry you don't get the whole single file menu thing.

        I can't say I've ever needed to rapidly switch between file menus in around 20 years of computing; I don't know what you could be doing that requires that.

        It took me 15 minutes to learn OS X. It's one of the easiest and most user friendly operating systems around. Personally I don't use OS X simply because I don't want to pay 3 times the price for a computer.

        The programs I use that need a file menu are in the minority. My email client doesn't need one, my web browser doesn't need one, Spotify doesn't need one and neither does VirtualBox, certainly not on every VM anyhow. I welcome the changes in Office that tidy away that clunky file menu into something a bit more pleasant.

        Sometimes it's just time to move on. There will always be people who don't want to change.

      4. InITForTheMoney

        Why the single menu is both user friendly and efficient....

        Have you ever considered that mouse pointers accelerate based on how quickly you move the mouse or trackpad? The amount of movement required to reach a menu attached to a window is actually roughly equal to the amount required to reach a menu at the top edge of the screen, because the cursor is bound by the edge of the screen and will stop over the menu no matter how fast it is moving.

        If you have to land the cursor on a control in the middle of the screen, you have to do this more slowly, as you need to decelerate or even reverse your movement to settle over the control. In the middle of the screen you have to do this in two dimensions, but at the edge of the screen only in one, horizontally (the screen edge has already stopped the cursor vertically and left it resting handily over the menu bar). Given that words are generally wider than they are tall, it's actually very easy and quick to land on a text-based menu item at the top of the screen.

        Even if the application you want to use hasn't got focus, you only have to click somewhere (anywhere) in that large control we call a window to give the application focus and the menu the correct context; this requires almost zero precision and takes no time at all. If you take these facts into account, it's generally quicker to grab a menu item on a Mac than on Windows or Linux.

        Other benefits:

        * You save a lot of screen real estate by having a single menu that changes based on what application you are using at the time

        * The fact that every application has consistently named menus ("<AppName>, File, Edit, Window & Help") means that users can consistently find what they need for any application in an expected location on the screen

        * It's better for macros and accessibility software: if you need to pre-program menu movements or screen hot spots for a blind user's screen reader software, you know exactly where the menu will be, since it's always in the same place

        * Given that a Mac is designed to be used with one mouse button by a "consumer", not a "techie", users expect to use the mouse for everything to do with driving the computer, and yes, that means going to the menu for EVERY option, even copy and paste, for which any Windows user has probably learned the key sequence. The Mac is designed so that a user can pick up how to use it very quickly through repetition. Clear user interface guidelines for Mac apps, and consistent placement and terminology in menus, help that learning process.

        There are 3 reasons the placement of the menu on the mac hasn't changed since 1989 - it's efficient, it's simple and nobody has yet come up with a better way.

        Your comment about multitasking is absolute rubbish: Macs have been multitasking since I first used one in '93, and even before that it wiped the floor with the Windows alternative. Linux was only really in use in academia at that time, had very few apps and wasn't a contender. What the Mac didn't have before OSX was decent resource management and scheduling, but this was no worse than any other desktop operating system of the time, and using multiple apps at the same time on the Mac was quite user friendly as long as you had enough RAM. As evidence: it would be pretty stupid to have an OS that let you copy and paste a chart into a word processor if you had to quit the spreadsheet application before you could open the word processor and paste the chart, now wouldn't it? You have obviously had extremely limited experience of using Macs if you think they couldn't multitask before they started to run their own Unix variant.

      5. Roger Heathcote 1

        It isn't good...

        It isn't good at all, dude; I'm so with you on that one. OSX has some terrible design flaws (window resizing from the bottom right-hand corner, anyone?) but this has to be the single most annoying one. I hope it's optional, because if not it would be enough to make me switch OS again, and I quite like Ubuntu.

        IMHO all of them have got it wrong when it comes to GUIs anyway - they're all pretty terrible, and I don't mean that from a CLI fan's perspective either; the CLI is worse if anything. No, I mean modern GUIs are awful - they all presume there's one best way of doing something and then bully developers into accepting that decision. There's something deeply wrong when developers end up in charge of GUI design in their apps anyway; being a good programmer doesn't mean you're a good graphic designer.

        Really, the move that needs to be made is to meaningfully decouple the graphics and layout from the logic in apps, so graphics and UI people can actually do the graphics and UI without having to become C coders and, gasp, maybe even end users can customise application layouts themselves. Of course, none of the head honchos of the desktop OSes are brave enough to spearhead such a shift, and they are constantly losing ground to web apps, which do follow the MVC pattern (and thus can easily adapt their interface for multiple situations and support multiple clients).

        Sadly, in the world of 'web apps', where an OS is just a way of booting a browser, all software becomes private and proprietary again, because it's on somebody else's cloud and not on your computer. Seriously, people... if you like free software, the desktop needs saving, and it needs far more visionary thinking than we're getting out of Shuttleworth, Jobs and Ballmer combined :/
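        The decoupling described above is essentially the model/view split that the web apps mentioned get from MVC. A minimal sketch of the idea in Python - every name here is illustrative, not from any real toolkit:

```python
# Model/view separation in miniature: the model owns state and logic,
# and any number of views render it without the model knowing their details.

class CounterModel:
    """Holds application state; notifies registered views on change."""
    def __init__(self):
        self.value = 0
        self._observers = []

    def subscribe(self, callback):
        self._observers.append(callback)

    def increment(self):
        self.value += 1
        for notify in self._observers:
            notify(self.value)

# Two independent "UIs" over the same logic; swapping or restyling one
# never touches CounterModel.
def text_view(value):
    print(f"count = {value}")

def bar_view(value):
    print("#" * value)

model = CounterModel()
model.subscribe(text_view)
model.subscribe(bar_view)
model.increment()
model.increment()
```

        This is the property that lets one lot of logic drive both a phone layout and a desktop layout, which is exactly the ground the poster says the desktop is losing.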



Biting the hand that feeds IT © 1998–2022