Ubuntu 10.10: date with destiny missed

Canonical delivered the latest version of its Ubuntu Linux distribution on October 10. Releasing Ubuntu 10.10 on 10/10/10 might seem an auspicious idea, but after the overhaul that was Ubuntu 10.04, the latest release looks tame by comparison. While there is little in Ubuntu 10.10 that will knock anyone's socks off, it makes for a …


This topic is closed for new posts.
  1. Joe Montana


    While it's preferable for a unix system to be split into multiple partitions, remember that Ubuntu is competing against OSX and Windows which both use a single large partition for everything (and actually make it difficult to do anything else).

    The problem with separate partitions from a user perspective is not knowing how your data will be distributed and ending up with some partitions being too small.

    1. The Other Steve

      What he said, only more so

      "The problem with separate partitions from a user perspective is not knowing how your data will be distributed and ending up with some partitions being too small."

      And the aim is for it not to stop during install and ask the user questions they can't answer, or which frighten them off. Remember kiddies, this is not a geek distro, this is trying to be Linux for human beings.

      Ease of use trumps all. This is how it has to be if you want Linux to do anything on the desktop other than annoy the shit out of people.

    2. Goat Jam

      Can't agree with that

      I've been using 8GB for / and the rest for /home for *years* and never had a problem. These days, with terabyte drives commonplace, shaving off a paltry 8GB for the OS partition is utterly trivial in the great scheme of things.

      The benefits far outweigh any "disk wastage" concerns that you might have.
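      For anyone wanting to try the same scheme, here's a sketch of what the resulting /etc/fstab might look like (the device names and sizes are examples only - yours will differ, and UUIDs are the safer way to refer to partitions):

      ```
      # /etc/fstab - example layout: small root, big /home, a bit of swap
      /dev/sda1  /      ext4  errors=remount-ro  0  1   # ~8GB for the OS
      /dev/sda2  /home  ext4  defaults           0  2   # the rest of the drive
      /dev/sda3  none   swap  sw                 0  0
      ```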

  2. Anonymous Coward

    Does anyone like it?

    Personally I love it and use it day-in, day-out, yet...

    The Windows people hate Linux per-se, 'cos they think it's cheap and tacky! The Mac people think their precious interface designs are being ripped off! Finally the Linux hardcore people think it should be given away with the book Linux for Dummies!

    1. The Other Steve

      A smorgasbord of ignorant hate, but sadly also true

      "The Windows people hate Linux per-se, 'cos they think it's cheap and tacky! The Mac people think their precious interface designs are being ripped off! Finally the Linux hardcore people think it should be given away with the book Linux for Dummies!"

      And those of us who use all these technologies, appreciating each and every one for its particular benefits or drawbacks, using each as appropriate to the task in hand, think all of you are dicks.

      1. Code Monkey


        OK, I'm fond of banging the Linux drum, it's true, but +1 for that.

      2. serviceWithASmile

        a most excellent comment


        "The Windows people hate Linux per-se, 'cos they think it's cheap and tacky! The Mac people think their precious interface designs are being ripped off! Finally the Linux hardcore people think it should be given away with the book Linux for Dummies!"

        And those of us who use all these technologies, appreciating each and every one for its particular benefits or drawbacks, using each as appropriate to the task in hand, think all of you are dicks.


        I would have posted this if not beaten to it.

        Time, I think, for the various proponents of each OS to grow up a little and be more open minded towards other OS users. I'd include myself in this but I'm already there, albeit recently.

        An OS is a tool to do a job, not much more.

        There are too many people who view it as picking a football team to support - you might be hated for supporting Rangers by a Celtic fan, but how often are you hated for using a 12lb sledge over a 10lb sledge?

        Linux *does* come with a free political movement though :P

      3. Anonymous Coward


        No comment on why you are for or against Product A or Company X is valid?


        I am of the opinion that MS is very much a force for evil on this planet and has done a lot to harm the industry I work in and love. I think anyone who gives them dollar is complicit. Now, you may think I'm a dick for this, but then I think you're a massive, flopping cock for having absolutely no ethics whatsoever. MS has spent so much time and money lying, cheating and stealing that frankly anyone who supports them (financially or otherwise) should be banged up and forced to eat *nix pie for a good five to ten stretch.

        I am also of the opinion that Apple's insistence on doing everything differently in the CHI is intensely fucking irritating. For everyone else, F5 means refresh, for Apple ... well fuck knows. You may think I'm a dick for expressing how I feel about Apple products, but that in and of itself makes you a dick. I am, whether you like it or not, allowed to express my opinion, particularly on an explicitly social platform like El Reg.

        Do you shout "dick" at your friends when they tell you how amazing a film was or how bad a concert was? Do you shout "dick" at the wife when she tells you she really doesn't like anal sex? When your dad starts a conversation about how a particular engineering firm was renowned for the poor quality of its hanging flange brackets, do you shout "dick"?

        Yes, the OS wars can be a little tiresome and often immature, but having a strong preference for something and expressing that preference do not necessarily make you a dick. Personally I could do without all the endless "I used to hand wank my VAXen back in the 70s" grandparama.

    2. Roger Varley

      Does anyone like it


      The Windows people hate Linux per-se, 'cos they think it's cheap and tacky! The Mac people think their precious interface designs are being ripped off! Finally the Linux hardcore people think it should be given away with the book Linux for Dummies!


      Eh .. youngsters t'day. I remember Slackware CDs being available in the back of t'book when I was still installing Windows 3.0 and 3.11 from diskette. (1)


      (1) Mind you, you've never really installed a system from diskette unless you've done a Novell Netware install - 30-35 diskettes (and Murphy's law applied even then) if my ailing memory recalls.

  3. professorpolymath

    trivial criticism

    The article takes pains to point out that Ubuntu doesn't create a separate home partition for user data. The article implies that this is a shortcoming of Ubuntu. This is a stupid criticism.

    Let us observe that no versions of the successful operating systems Windows and Mac OS create separate disk partitions for user data. Why not? Probably because it is a worthless division of disk space.

    Disks fail, not partitions. Disk failure is something worth worrying about.

    1. Anonymous Coward


      Maybe I'm being short-sighted, but I can't see the point in having multiple partitions either, at least in a home PC. Why limit your system or home directory to a given size? Doesn't make sense.

      As the article points out, advanced users can do this, but I'm pretty sure a screen asking 'would you like to partition your home directory from the system directory' would confuse people like my Mum. Normal folk just want the system to work. Faffing around by introducing these sorts of limits is just going to make things more confusing.

      1. elderlybloke

        Hey Mush

        @Anonymous Coward, posted Monday 11th October 2010 09:02 GMT:

        I bet I am older than your mother - and father - and I am not confused by this partition for home directory idea.

        Have done it a few times when installing the beloved Ubuntu.

        I also back up the home directory frequently.

        I learned many moons ago that hard drives go silent and are never heard from again, and just when everything seemed to be going so well.

        Peace be with you young man.

      2. Scorchio!!


        I don't know about you, but my mum doesn't use Linux. She's still stuck in the past with XP. The thought of her using Linux tickles me pink, and I've been preparing her for it, but this is going to be the slowest rollout in history. Oh, she does have several drives in her machine, for the reasons discussed. As for me, I never leave any files on my notebook, preferring to use a USB hard drive, and work out of TrueCrypt files stored on it. The maths are quite simple, no?

    2. Destroy All Monsters Silver badge

      Hey, prof..

      ...ever used a Linux system in Real Life?

      Where like, programs are running that fill partitions to 100%?


      Aw well.

    3. Fuzz


      Obviously the point of this is so that you don't lose your files during a reinstall.

      Thing is a modern OS should never need reinstalling. It's not something I've had to do since we were using Windows 98.

      All it ends up doing is meaning you run out of space either in /home or in / while the other partition still has loads free.

      Most OEMs do this with Windows computers now. There it's even more pointless because without a redirected user folder people still store all their files in the docs folder or on the desktop. I see a lot of computers friends bring to me when they've "run out of disk space"; what they've got is a chock-full C: drive and 200GB free on D:.

    4. Anonymous Coward


      It may have been necessary to micro-manage tiny little partitions when we were installing slack in 1995, but this is the century of the fruit bat. Get with the program, granddad.

      1. Anonymous Coward

        "Thing is a modern OS should never need reinstalling."

        But, of course, Windows often does.

        1. Adrian Challinor

          The title is required, and must contain letters and/or digits.

          "Thing is a modern OS should never need reinstalling." But, of course, Windows often does.

          Modern, Windows and OS. Words that should never be used together.

          Mine's the one with the Ubuntu 10.10 memory stick in the pocket.

          1. Chemist

            @Adrian Challinor

            "Thing is a modern OS should never need reinstalling." But, of course, Windows often does.

            One word: sarcasm

        2. ed2020


          Windows very rarely *needs* reinstalling to rectify a problem. In fact most of the time reinstalling is a particularly bad way of problem solving because even though it may fix the problem you still don't know what caused it.

          1. Anonymous Coward

            Re : Shoe menders

            "because even though it may fix the problem you still don't know what caused it"

            Bloody Windows caused it !

            1. ed2020

              Re. "Re : Shoemenders"

              If that's what you consider to be a satisfactory root cause analysis then fair enough. I don't.

              And if that is the RCA then reinstalling Windows seems like an interesting way to "resolve" the problem.

            2. Doug Glass

              Causes cancer too

              Whatcha bet.

          2. Yaro


            Sorry, but Windows offers precious few options for fixing major issues before "reinstalling Windows" becomes the only real option. Sure, things made VERY slight improvements from XP to Vista to 7. But I must emphasize VERY SLIGHT.

            In all my years doing this I've found Windows boxes to be the single biggest pain in the ass to support. And it's all due to the registry. See, in a standards-compliant operating system (under POSIX, FHS, and SUS, which are the *real* operating system standards, not what Microsoft offers its users as a de facto standard that only applies to Windows) configuration is always plaintext, and organized in simple directories anyone can read, but not necessarily make changes to. This, by the way, is also why Windows is so readily infected: it offers piss-poor file permission protections, so even an unprivileged yahoo can change system settings, and thus so can a virus.

            See, Windows stores ALL the stuff important to the system working properly in a binary database ONLY THAT INSTANCE OF WINDOWS CAN ACCESS. If Windows breaks, good luck getting access to its configuration to fix the problem. On a standards-compliant system one could run a boot disk and use a simple text editor to fix what is likely one errant value. Because the Windows registry is a binary database, you can't access it with anything but the registry editor installed on that copy of Windows.

            Sure, Microsoft added a "last known good configuration" option, but that hasn't done a good job on the vast majority of Windows boxes I've had to fix. Safe mode is nice if there's a driver or virus running amok, but it still relies on the system actually WORKING enough to boot to the desktop in the first place. And the recovery console they used to offer had very few useful tools, none of which, by the way, could access the registry.

            The registry is far from robust or reliable. Sometimes... okay, more frequently than "sometimes"... a COMPLETELY MINOR VALUE that's set wrong in the registry can cause a fatal chain reaction in a Windows installation.

            Trust me, Windows is still far from the "reinstallation unneeded" state all the other operating systems have reached, and all because Microsoft doesn't have the first smegging clue how to design a quality operating system.

            I haven't had to reinstall Linux in a long, long time. Why? Because if I break something I just start up a Live CD and change my configuration file to something "correct."
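            In case anyone wants the recipe, the fix described above is roughly the following. This is only a sketch - the directory here is simulated in a temp dir so it's safe to run anywhere; on a real rescue you'd first mount your actual root partition (e.g. `sudo mount /dev/sda1 /mnt`), and the xorg.conf driver line is a made-up example of "one errant value":

            ```shell
            # Pretend $mnt is where the broken system's root partition is mounted
            # (simulated here so the commands are harmless to run)
            mnt=$(mktemp -d)
            mkdir -p "$mnt/etc/X11"
            printf 'Section "Device"\n    Driver "vesa"\nEndSection\n' > "$mnt/etc/X11/xorg.conf"

            # The actual fix: one errant value, one plaintext edit (any editor,
            # or a one-line sed) - no registry editor required
            sed -i 's/Driver "vesa"/Driver "intel"/' "$mnt/etc/X11/xorg.conf"

            grep 'Driver' "$mnt/etc/X11/xorg.conf"   # now shows Driver "intel"
            ```

            The point being: because the configuration is plaintext, any live CD with a text editor is a full rescue environment.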

            1. Stephen Bungay

              True for the most part... but...

              You can fix the registry using... wait for it....Linux.

        3. Aggellos

          morons but not from outer space

          I suggest you not try any flavour of Linux if you have problems with Windows that require it to be reinstalled.

          1. Code Monkey

            I disagree

            Every Windows problem is a great reason to use any flavour of Linux....

          2. Doug Glass


            Care to explain that?

        4. Chemist

          RE : "Thing is a modern OS should never need reinstalling."

          The forums are FULL of Windows users who seem to need to re-install every few months as their systems grind to a crawl.

          1. ed2020


            No, the forums are full of users who need to reinstall Windows because they are incapable of diagnosing the cause of their problems. The need to reinstall stems from the user's lack of skill, not a problem with Windows.

            And the slowdowns are invariably third party software or malware related. Both of which can be resolved easily without a reinstall.

      2. Andy Fraser


        The first thing I did with my new Windows 7 laptop was get rid of the 2 partitions. I also can't see the point of multiple partitions on a home computer but then I'm religious about backing up and making restore DVDs so I can quickly get my system back in the event of a disk failure.

        I do use multiple partitions on my Linux server but then it makes sense there.

    5. The BigYin

      Partitions serve a purpose

      1) They can make upgrades, back-ups etc a bit easier

      2) They can allow the use of different file systems for different tasks

      3) They can allow one to move files on to a separate spindle for performance

      I will agree that some of the above are not for the new user and not partitions in the strict sense but doing it is not necessarily a "Bad Thing"(tm); it all depends on one's needs and at least Linux will *let* one partition (and re-partition) easily.

      I have recently almost lost the will to live trying to get XP to move "Documents and Settings" from "C:" to "S:" to make better use of the second drive. The fact that an "industry standard" OS cannot do this simple thing is, frankly, a sick joke.


      I agree totally about your "disc failure" statement. Everyone should run back-ups. I was getting the bike MOT'd and the place I went to had had a crash - 2 years of records down the toilet, no back-up. It boggles the mind. Back-ups are easy to do (including off-site as well) and only idiots don't.

    6. Code Monkey

      I want 2 partitions

      I've always used 2 partitions and would continue to do so were I to use Ubuntu. It might be fine for most of you but some users' hardware causes the upgrade to fail. This happens with every version.

      Of course I'd make a backup before upgrading but having /home on a separate partition means I can reinstall without having to restore. Much less dicking about.

      1. Geoff Campbell

        XP may be industry standard....

        ....but it's hardly current. The answer is probably to upgrade to a more modern version.


        1. The BigYin


          If I give you my PayPal account details, will you send me the £200+ I'd need to upgrade? No? Thought not.

          I'll use XP until it either goes out of support* or I no longer rely on a single PC for everything. Then it will be Linuxed. Personally I cannot afford the cost of Windows 7 and, after having to endure it in my day job, I wouldn't want it anyway.

          And, more to the point, why should I upgrade something that does, pretty much, what I need? I could do a re-install and fix the whole C:\S: thing but then, as I said above, I'd just Linux the fecker.

          *SP3 is still under support, before anyone says anything

        2. The BigYin


          Oh, and one other thing: Windows 7 also suffers from the inability to easily move the "User" folder, so an upgrade wouldn't help.

          1. karakalWitchOfTheWest

            @The BigYin

            It's not rocket science to move your profile folder in Windows to another drive...


          2. Doug Glass


            Just create like-named User folders on another drive or partition and then change the location in the original folder's "Location" tab to the new folder. 7 even moves all your files for you if you wish. Easy as pie, falling off a log or whatever makes you happy.

    7. Mountford D

      Partitions are incredibly useful

      I have always installed any flavour of Linux with a separate /home partition. It just makes the users' private space so much more portable and flexible. Try a fresh install - say, from 9.10 to 10.04. Without a separate user partition, you will trash the entire disk when you format it, including all the users' data. Having a separate partition retains all that. I have even changed flavours of Linux while retaining all my desktop settings, emails and documents.

      With standard disk configuration tools like gparted, resizing partitions is really not an issue, so the subject of disk wastage is hardly a discussion point. With the cost of storage these days, a surplus of 20GB on a 1TB drive is very trivial. In any case, if you are that desperate for space, just create a link to the surplus space.
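      The "link to the surplus space" trick, sketched out (the paths here are simulated in a temp directory so it's safe to run; in real life the target would be wherever the spare partition is mounted, e.g. /data):

      ```shell
      # Simulated layout: $top/data stands in for the big spare partition's
      # mount point, $top/home for a cramped home directory
      top=$(mktemp -d)
      mkdir -p "$top/data" "$top/home/Videos"
      touch "$top/home/Videos/holiday.mkv"

      # Move the bulky directory onto the spare space and leave a symlink
      # behind; applications follow the link and never notice the difference
      mv "$top/home/Videos" "$top/data/Videos"
      ln -s "$top/data/Videos" "$top/home/Videos"

      ls -l "$top/home/Videos/"   # holiday.mkv, now living on the spare space
      ```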

    8. Anonymous Coward

      Sensible compromises

      I think you've missed the point. Disk space is cheap these days, so it doesn't matter if you have a few wasted gigs here or there. There's a sensible compromise to be made between the Windows approach of stuffing everything on C: (ahah.. drive letters.. how antiquated) and the excessively anal server admins who think /, /boot, /tmp, /etc, /var, /usr/local must all be their own partitions. I'm running three partitions - a 10 gig system /, a 10 gig staging area, and the rest is my /home. When a new Ubuntu release arrives, I do a fresh install to the staging area, boot from it, check it all works and looks good and there are no problems, and if so, tell GRUB to boot from it in future. Simples.

      1. Peter Gathercole Silver badge

        @AC re. Sensible compromises

        Drive letters were antiquated when MS used them in DOS 1.1!

        UNIX already had a fully hierarchical filesystem years before Bill went to see IBM.

        The concept of filesystems on separate partitions really goes back to the original Bell Labs V6 and V7 code for PDP11s, where the partition sizes were hardcoded into the device driver for RP disks (no on-disk partition tables there!), and when the smaller RK disks were barely large enough for / or /usr.

        Each device could have a maximum of 8 partitions defined, and the definition of the partitions had to work with all drives of that type present in the system. IIRC, it was normal practice to make one partition span the whole device, two more cover half of the device each, and four more cover a quarter of the device each. It was, of course, stupid to try to mount the overlapping filesystems, or use the wrong minor device, but this model gave flexibility.

        My old Systime 5000E (a PDP11/34E in Systime covers circa 1982) had 2x32MB CDC SMB disks, with a controller hacked to look like an RP controller with RP03 drives, with overlapping partitions of 1x32MB, 2x16MB and 4x8MB. I had / on 1 8MB partition (formatted to use just 6MB, with the last 2 MB used as swap space), /usr on another 8MB partition, and then used the remaining 16MB as a /user filesystem, which was equivalent to /home on a Linux or more modern UNIX system. There was no /var or /opt at that time, as Sun were only just thinking about diskless systems. A second drive had a single 32MB partition for the /ingres filesystem (which actually had the whole of the BSD 2.6 [for which I sadly do not have a copy of the tape] unpacked in it), and which contained the Ingres database code, and all of the defined databases.

        It was the only real way to manage such systems. If you are really interested in knowing what was involved in setting up ancient UNIX systems, I suggest that you start here, and then browse the rest of the UNIX Heritage Society's site.

        BTW, I started on Version 6, although I have put the link in for Version 7 as that is regarded as the point where UNIX really started to fragment.

        1. Peter Gathercole Silver badge

          Oops, silly me.

          I meant to say CDC SMD (Storage Module Device) drives, not SMB. How memory fades.

    9. Giles Jones Gold badge


      Especially given desktop PCs are only about 10% of sales now. Laptops are popular and very few of them have multiple hard discs.

      1. Stephen Bungay

        You are confused...

        Multiple hard discs is not a prerequisite for multiple partitions.

    10. Anonymous Coward

      Always use a standalone home partition.

      I always run my home files in a separate partition. Why?

      Well, I run the current X.04 O/S; when the X.10 comes out, I install it into a separate partition and then simply mount my home into the new O/S. If it works OK, it stays; if not, I simply restart the old X.04, never having had to move a single file.

      Then further down the line when X(+1).04 replaces X.10, I do the same again. I have used the same three partitions for 3 years now, only replacing the O/S as needed.
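      For the record, the "mount my home into the new O/S" step is just one line in the new install's /etc/fstab (the UUID below is a made-up example - get the real one for your home partition from `blkid`):

      ```
      # /etc/fstab on the freshly installed X.10 - reattach the existing home partition
      UUID=0a1b2c3d-ffff-4444-aaaa-1234567890ab  /home  ext4  defaults  0  2
      ```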

    11. Peter Gathercole Silver badge

      I prefer two partitions (but I am a UNIX sysadmin)

      It's swings and roundabouts. I tend to use a separate home partition so that I can dry-run a new release while keeping access to all of my files in both releases.

      Unfortunately, this is not a perfect solution, as quite often, the configuration files for all of the dozens of apps and utilities change between major releases. You often watch informational messages about configuration files being 'converted' to a new version, and find that it no longer works with the old OS. This broke the sound on my Thinkpad between 6.06 (Dapper) and 8.04 (Hardy) (both LTS releases).

      I've never been satisfied that 10.04 is ready to switch to, because there are sound, display and suspend problems, so I am still running Hardy. One day, I will boot Lucid, update everything, and all my problems will be over, but I'm not holding my breath, and I don't want to switch away from LTS releases for my main systems.

    12. Anonymous Coward

      I agree and disagree...

      I agree that the hard disc is the REALLY important bit, but partitioning saves the hassle of having your home directory wiped when either doing a reinstall or switching distros. The default for Ubuntu is to install side-by-side when it sees another Linux installation, or reformat and lose everything. Putting /home on its own makes sense, cleanly separating the user data from the O/S. But hey, each to their own.

      You can get around the hard drive failure problem by using RAID.

    13. Mat

      I kind of agree with you but...

      Having a separate home partition allows you to do a 'clean-ish' install without losing data. I did this today and it has gone incredibly smoothly - The only issue was dropbox which was resolved by clearing the .dropbox* folders.

      I've also reverted to 32-bit and have had no problems as yet.

    14. Yaro

      Missed the Point

      You're completely missing the point of why people have separate partitions for /home or any particular directory. It's not necessarily to protect you from a failing hard disk (Though it can if your /home is on a completely different hard disk or even a different computer.)

      It's primarily to protect your valuable data from whatever may come if you reinstall your operating system. It can save you a LOT of time and headache on backups, since you can just tell your OS's installer to go ahead and reuse that partition instead of creating a new /home.

      Other benefits include allowing you to use a single /home directory for multiple *nix installations, even sharing of your data with Windows in a dual-boot.

      Windows is not successful because it's well designed or does things "correctly" as Windows is neither of those things.

      I agree this article overestimates the problems with Ubuntu's automatic mode, especially in light of the fact it allows for two means of manual partitioning on the same disc, and allows you to specify filesystem mountpoints at install time if you go manual.

      I do think its automated partitioning leaves much to be desired. But if you've already got a /home you want to assign to a new Ubuntu install, you will NOT want to give it the reins of what is mounted where, since it'll completely ignore your intended /home anyway.

    15. Doug Glass

      Re: "Disks fail, not partitions"

      Yep, so keep your files close at hand on the system drive's "Data" partition and back that data up to one or more internal hard drives on a regular schedule. Or even to an external HDD. I've had the OS fail far more often than the HDD, so a reinstall to the system drive's "System" partition is fairly painless.

      The idea of Ubuntu doing the auto partition and separation thing is very appealing. But then what the hell do I know, I'm just a multi-decade end user.
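      A sketch of that regular-schedule backup with rsync (simulated in a temp directory here so it's safe to run; in practice the destination would be the second internal drive or the external HDD):

      ```shell
      # Simulated source and destination; swap in your real data directory
      # and backup mount point for actual use
      top=$(mktemp -d)
      mkdir -p "$top/data" "$top/backup"
      echo "important stuff" > "$top/data/notes.txt"

      # -a preserves permissions and timestamps; --delete mirrors removals
      # too, so the backup is an exact copy of the source
      rsync -a --delete "$top/data/" "$top/backup/"

      cat "$top/backup/notes.txt"   # prints "important stuff"
      ```

      To put it on a regular schedule, a crontab line along the lines of `0 2 * * * rsync -a --delete $HOME/ /mnt/backup/home/` would run it nightly at 2am (the /mnt/backup path is an example; use wherever your backup drive is mounted).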

  4. TonyHoyle

    Multiple partitions? Is it 1970?

    Fedora hasn't 'switched' to multiple partitions - the earliest Red Hat releases did it; they simply never adopted the far more sensible default of a single partition.

    It's a throwback to the unix days, when drives were small and you put your /usr partition on a separate drive both for space reasons and because you could mount / read-only. On a home machine not only is it not needed, it causes problems - it's easy to fill one partition and be stuffed without a repartition (which is possible with LVM etc. but hardly something you expect a home user to be messing with).

    1. Jim 59


      I have been a unix systems administrator since 1992. I would recommend that Linux "power users" and enthusiasts put their data in a separate partition because it makes future changes and OS installs so much easier. Other users should probably keep it simple with all data and the OS together.

      However, there are no "other users" in Linux. They are all enthusiasts.

      1. CyberCod

        Now now, not so fast

        I install Ubuntu Linux on desktops for everyday small-town people. In fact, I'm installing it right now, for someone who I just talked into it about 20 minutes ago. He went from not knowing anything about it, to seeing a demonstration, to paying me money to install it all in about 10 minutes.

        And this is normal. There are Linux users who are not Linux enthusiasts. They are Facebook enthusiasts, and Myspace enthusiasts, or parents who let their children use the machine and mess it up. The average person is ready and willing to try something different. And the only thing stopping them is that all you bozos are in here bitching over trivialities instead of talking to your neighbors.

        Granted, I'm running a PC repair shop, and I'm one of the top ones around here, so my opinion carries weight with the locals, but honestly, talking someone into trying Linux is much easier than it was just a couple of short years ago, and they don't necessarily become enthusiasts just by using it. I have personally converted over 100 people, of all ages. And honestly, it's easier to convert a grandmother than some 20-something kid. But that grandmother has more influence in the community too... she talks about it at her bridge club, and when she goes to the senior center. She smiles at her friends as they huff and puff about how it's not safe to use computers because "the viruses will eat your wifi's" or whatever. And when her friends and family see her using it safely and easily for a full year, without mishaps, reinstallations, infections or any major problems, then they too start to think about whether Linux is right for them.

        It isn't just for nerds anymore, and it's being used by many more types of people than you think.

        The article author's take on partitioning is a flimsy attempt to get views by cashing in on any negative thing he can find. Setting up advanced partitioning is easy enough, and should not be automatic. If you don't know about it, you probably don't need it. If you do know about it, it's not that hard to figure out. If it had been done automatically, I would've been pissed.

    2. Peter Gathercole Silver badge

      @Tony Hoyle

      In the 70's, you could never mount / read-only. The ability to do this only came about when Sun implemented their diskless model, where all of the files that would be modified on the / partition (often the files in /etc such as passwd, utmp, wtmp, and mtab) were moved into /var, specifically so / could be mounted read-only on diskless workstations. I'm a bit vague about Sun timelines (I was working with PDP11s and Bell Labs versions of UNIX at the time), but I would guess that this happened around 1983, a few years after Sun was set up, with the release of the Sun2 workstations.

      In this model, / and /usr were remote read-only mounts, /var was a remote read/write mount specifically for that workstation, and /home was a read/write shared mount for user files.

  5. tempemeaty

    Partitions are a geek fetish

    I used to partition also. Then I learned to stop that. I've had partitions fail often enough over the years that I avoid them now. Since discontinuing that geek fetish my installs have been problem-free for the normal life spans of the drives. From my experience I find Ubuntu is doing just fine.

    1. This post has been deleted by its author

  6. Greg J Preece

    That's new

    "Free software purists may decry the move, but Canonical clearly doesn't care and is ultimately more interested in a free desktop that allows the users to install any kind of software applications they'd like than it is in satisfying the militantly free crowd."

    It's not long since Ubuntu refused to even prompt users to install MP3/DVD capabilities on or after install, as that would corrupt their pure OS. It's good to see they've finally wised up and realised that the end user doesn't care one iota about "free as in freedom" software, however much we might - they just want a system that works out of the box. And if Ubuntu's primary goal is to spread Linux to the end user, this is a reality they had to accept.

    1. Phillip Baker

      Re: That's new

      They've offered binary-blob only drivers for some time (including prompting the user that they were available, and explaining in stupid-ese the consequences of using them).

      As far as 'common sense' to non-free is concerned, they've been on the ball for a couple of years at least.

      1. Code Monkey


        I'm no Ubuntu fan but they're right on this score. Those users who give a rat's hindquarters that their packages are completely "free as in speech" have Debian and other distros. Meanwhile the rest of us that don't think that MP3, binary drivers or whatever are ideological monstrosities can get on with our lives.

  7. bertino

    Looks like there are a few important fixes

    I have a 5-year-old HP Pavilion laptop that had issues with the Intel GPU, such that it would boot into a blank screen. Adding boot switches enabled me to install it, but there were problems with random crashes (full crashes, freezes or random reboots). This was not a problem on pre-10.04 releases.

    EeePC 901wifi needed a module rebuild with every new kernel release. Also seems to be fixed.

    Hopefully there are no problems with this release - early days yet, though!

  8. sebacoustic
    Jobs Horns

    Ubuntu One iphone app

    do people feel it's a given that something like this will not be rejected? Music sync - from St. Jobs' point of view that's an iTunes competitor, for sure. I wouldn't bet on it.

    Why don't they squirt out a WinMo 6 application instead, for me and all the rest of us who use Ubuntu because it works and a sucky WinMobile phone because the boss pays?

  9. Anonymous Coward

    Compare and contrast...

    Should we be expecting "major new features" every six months? I'm very happy with "polishes and refines", thank you very much...

    1. Rebajas

      I always thought...

      I always thought x.10 was a refinement of x.04 anyway?

      1. Anonymous Coward


        x.10 = x.04 + six months.

        Anyway, the upgrade was thoroughly painless. Fire up the update manager, sit and wait whilst it works in the background - then reboot when asked and...

        Presto! What was 10.04 is now 10.10 with less than five minutes of downtime.

        Quicker & easier than Patch Bloody Tuesday usually is.

        (Can we have a Meerkat icon, please?)

  10. Neal 5

    Partitioning isn't really an issue

    with Ubuntu, or its lack of doing it automatically. SUSE also automatically partitions, and in my experience SUSE is a lot less user-friendly than Ubuntu. Is there actually any real benefit from partitioning, unless you intend to dual boot with another OS? For Windows systems I can see the reasoning for it; with Linux, I cannot. It's just today's way of thinking, and virtualisation is eliminating that, along with the movement towards cloud computing, where it's pretty soon going to become someone else's problem anyway.

  11. Pete 2 Silver badge

    Free, but not free

    Same supermodel, new frock.

    Basically this version of Ubuntu has fresh new versions of everything but little in the way of new features. None of the Ubuntu marketing is aimed at telling existing users what benefits (i.e. things they'll now be able to do, that they couldn't do before - or things that are now easier /faster /better than in previous versions) they will gain from this release.

    On that basis, while it won't cost us freeloaders anything to download the new stuff, it will cost a great deal in terms of the time needed to either upgrade or backup, wipe, install the O/S then reinstall all our apps. The cost of that last step - getting all the applications that aren't included in the base release, but are needed to move an Ubuntu box from a simplistic games and word-processing environment to something akin to useful - is the killer.

    Even if everything happens as it should there's a good afternoon's work involved. Sadly, experience (from the last time I did this) has shown me that things don't go as they should, and that there will be some code that simply won't port, other stuff which has decided to eschew the path of backwards compatibility, and yet more applications whose authors have abandoned the fruit of their keyboards and will NEVER work with 10.10. Put all this together and a more realistic estimate is a couple of days of head scratching - so say goodbye to a weekend.

    Even at minimum wage rates, that's £100's worth of my time that this "free" version of Ubuntu would cost me. What I get for that is all the latest versions of my existing applications, but precious little in the way of new functionality. A high price to pay for no tangible benefit, and a very high price in terms of lost free time.

    The basic problem is that Ubuntu, and all the major players, are still fixated on getting people to "try Ubuntu", in the same way you cajole a small child to "try a piece of cabbage - mmmm, lovely: yum, yum". You know they'll hate it, they know they'll hate it, but you feel (somehow) that it will be good for them. It's about time they gave up on this ploy - it obviously doesn't work. Those people who like cabbage, sorry: Linux, are already using it. Those who don't, aren't. How about rewarding the loyal (or is it just cheap?) fan base and improving the migration process and actual BENEFITS, instead of focusing on the trite and superficial, such as the colour scheme and the installation process?

    1. Anonymous Coward

      Reinstall with 10.10?

      Why? Upgrade it, in place. This is a Debian derivative we are talking about here; dist-upgrade has been around for more than 10 years. I've lost count of the number of systems I've upgraded over the years, often through multiple releases. Then all your installed software will be updated, regardless of whether it is a default part of the release or not.
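      For anyone who hasn't tried it, the whole process on a stock Ubuntu box is roughly this (a sketch, assuming the usual update-manager tooling is present):

      $ sudo apt-get update && sudo apt-get dist-upgrade
      $ sudo do-release-upgrade

      The second command is Ubuntu's wrapper that rewrites your sources and runs the release upgrade for you.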

      Fedora and the other RPM distros have never been as good at this, and that is probably the real reason they default to the split partition (though this is NOT true on the enterprise versions, where I would argue it is more necessary).

    2. Ocular Sinister

      Good for a laugh

      Try re-reading this again, replacing "Ubuntu 10.10" with "Windows 7". It's good for a laugh.

      Why would anyone re-install Ubuntu if they already have it? Or are you ranting about migrating from Windows? This whole post smacks of someone who hasn't even seen the various Linux application stores. Once your Ubuntu is installed, you don't re-install the OS or any applications. They get updates through the app store at regular intervals. These might be bug fixes or larger changes, it matters not. A "release" is nothing but a particularly large set of changes. Bigger download, but bigger deal? Not really.

      Question: How long would it take me to upgrade my XP (With Office, Visual Studio and numerous additional tools - I've no idea if any of them are incompatible) to Windows 7? And how many licenses would I need to buy (in addition to the Windows 7 license?) That's got to be a lot of work *and* a lot of licensing to sort out - even if its just calling the suppliers to get new license keys.

      1. Peter Gathercole Silver badge

        @Ocular Sinister. Experience tells me otherwise

        When Dapper Drake (6.06) was the LTS release, by the time Hardy Heron (8.04) came along, many of the packages in the repository were functionally stable. This meant that you may get bug fixes, but you would probably not get a bump of the version.

        If you were adventurous, you could add the 'backports' repository to the list of subscriptions, and get a select few packages at the same level as a more recent Ubuntu release.

        As a result, even though dapper was still 'supported', it began to be very difficult to put .deb files from the Debian repository onto Dapper, because the prerequisite libraries would not be present. Ditto compiling up stuff from source.

        Hardy does not appear to be quite so prone to this, now Lucid is available, but you can see it starting to happen, especially with third-party software like the BBC iPlayer.

        I'm sure that if you joined the Ubuntu developer community, offering to make the backports repository more complete, you would be welcomed with open arms. But until then, the current developer community will be more interested in putting recent versions of the packages into the latest-and-greatest releases, not into the older ones. I myself would love to do this, but personal commitments do not leave me with the time to do it at the moment.

        It's a shame, as I believe that ordinary users would be best served putting a LTS release on their systems and leaving it there for the lifetime of the system.

        Strangely enough, I did a Windows XP to Windows 7 upgrade recently (one of my kids' gaming rigs), and it was much easier than I expected, at least using a second disk and a parallel Windows 7 retail install to make a dual-booting system. I do not think I had to re-license anything. All the programs installed on the XP drive were identified and recognised, and ran without problems. These were mostly games, but did include Office.

        Microsoft must be doing something right!

    3. nemo20000

      Share what you know, learn what you don't

      Pete2: “Those people who like cabbage, sorry: Linux are already using it. Those who don't, aren't”

      Wrong. I recently used 10.04 to come to the rescue of someone whose XP installation had eaten itself. They couldn't find the installation disk (if the machine ever came with one), and they didn't have time to wait for the manufacturer to send a replacement.

      An Ubuntu 10.04 LiveCD convinced them they could access their word-processor documents, spreadsheets, PDFs and photos, so it was installed. It works fine for them. They know nothing about Linux.

  12. paul 97
    IT Angle

    User Friendly Install

    Despite what the article says, the install process is very user-friendly.

    If you're clever enough to know what a partition is, then you can easily set up multiples for your home folders.

    And as with all Ubuntu desktop installs, you can quite happily run Firefox and surf the web while the system installs.

    1. Paul Chambers

      What this one said....

      If you know that you want different partitions, you almost certainly know how to do it. If you want different partitions, it's entirely possible that you might want them spread over different types of drive (solid state or ramdisk, or RAID - mirrored, striped or a combination - for example), filesystems (an NTFS partition mounted in both Windows and Linux for shared data, perhaps), or for a range of other reasons (operating system testing, virtual machine storage, etc.).

      The author should get over himself and his parochial self interest, and realise that the least complex setup is almost certainly the correct choice for a default installation for an inexperienced user.

      Stick everything where it is easy to find and learn from. Those of us with special needs can sort ourselves out, thanks.

  13. Anonymous Coward

    Lose lose

    It seems to me every time a Linux distro comes out, it is a lose-lose situation:

    Too many/large changes and they get slated (KDE 3 to 4, anyone?).

    Too few/small changes and they get slated (polishing up, making more solid, i.e. this release).

    If you have business customers, fewer radical changes can be important (radical changes can make things unstable, for instance).

    As regards partitioning, I agree. openSUSE does it better, especially if Windows and Linux are already installed.

    Should be 2 partitions minimum (/ and /home) IMO (so you can fresh install without data loss).

    1. Anonymous Coward
      Anonymous Coward

      It is very easy to reinstall without data loss using only a single / partition. Just tell the installer "don't format this partition". It's hardly rocket surgery.

      1. Anonymous Coward
        Anonymous Coward

        Tried that several times under several releases

        and it crashed the installer (Ubuntu and Debian), leaving a clean install as the only option.

      2. Goat Jam

        "don't format this partition"

        No thanks - the amount of cruft from the old install that will pollute your system directories afterwards will be enormous.

        If you are considering doing this, please take a step back and do an in-place OS upgrade instead; at least all the cruft from the existing install will be updated or removed as necessary.

        1. Anonymous Coward
          Anonymous Coward

          Sigh, don't be scared

          Boot from the live CD, delete the cruft, leave /home, install without formatting, profit.

          I know it might sound challenging if you're used to clicking in Windows, but most of it can be done with the mouse - don't be scared.

      3. Scorchio!!


        I'm not sure what the fuss is about. Even for my portable machine I use an external/USB hard drive for data (on this I use TrueCrypt). On all desktop machines I have at least 2 drives, usually 4, and these are where I keep my data. They are all partitioned, and only one partition is empty. That's a slot for Ubuntu when I have time. Meanwhile I'm going to buy a machine without an OS, and I'm going to try Ubuntu on it. My experience thus far has been that it is alright. So now the next logical step is to provide it with its own playground for portability. If that's good enough I'll consider phasing out MS entirely. Just as long as I can access all of my old files.

        Anyhow, I don't GAD if an OS formats my boot drive. It's only a boot drive FGS. Besides which, when a MS OS goes wrong I restore from image, after backing up all files (usually I make an image, for later browsing), thus wiping everything in the process. All this worrying over partitions is for me a thing of the past, and I think it wise that all competent PC users keep their data completely separate, never mind the backup.

    2. Kurgan

      Bug fixes first!

      I understand your point, but I think that there actually is a proper way to release new versions. A simple rule: Fix bugs first, then do whatever you see fit to make your product better. Which in my opinion means add support for new hardware, add new features, and maybe add new looks and widgets or a totally new interface, but do not force users to use it. Because we all know that 99% of the users don't want to have to learn again where every button, menu item or icon is.

  14. Tom 7 Silver badge

    Partition problems?

    More like ignorant user problems. You want more partitions - make them yourself.

    Cant? Then STFU and go back to windows if you don't actually want to learn about computing just type criticisms on them.

    I want two partitions? Well, I've got seven - no, 43 - all on the same drive, so when the hard drive crashes my /home partition is... GAFG.

    1. The Other Steve

      Sucks to be you

      "Cant? Then STFU and go back to windows if you don't actually want to learn about computing just type criticisms on them."

      Whiny geek stamp feet. I hope to fuck you don't work in IT, but I smell a tech support monkey.

  15. Stephan

    Date with destiny missed?

    The "date with destiny missed" headline is a bit dramatic isn't it?!

    You can't really call it a failure for a partitioning non-issue.

  16. bexley


    If you put /home on a separate partition then you can upgrade / format / reinstall the OS partition and not lose any of your configuration or files.

    I will install 10.10 later today, and once it reboots after it finishes I will have all my tweaks/configs, files and software configurations already there, saving me hours of setting stuff up (Compiz, for example).

    The reason that Ubuntu does not do this for you is: how do they know how big you want /home to be?

    1GB? 100 GB ? 1000GB?

    It would not be too hard to put an option in the partitioner to offer a 40/60 ratio between / and /home though.

    I have never needed more than 10GB for / so the rest goes to /home every time
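    To make that concrete, the 40/60 split suggested above is trivial arithmetic (the 500GB disk size here is just an invented example):

    $ DISK_GB=500
    $ ROOT_GB=$(( DISK_GB * 40 / 100 ))
    $ HOME_GB=$(( DISK_GB - ROOT_GB ))
    $ echo "/ gets ${ROOT_GB}GB, /home gets ${HOME_GB}GB"
    / gets 200GB, /home gets 300GB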

  17. A J Stiles


    Splitting filesystems across multiple HDDs is good, and is how all my early Linux boxes were set up (what with them being cobbled together from whatever second-hand parts I could scrape together).

    But there's little point in partitioning a single big HDD. Chances are you'll find out, too late, that you made one partition too small. And because they're all on the same spindle, you'll lose the lot if you lose any of them.

    The payware store is an interesting idea, but it is my experience that people would rather *not* pay for software if they can possibly avoid it. Although if it ends up encouraging the development of Open Source alternatives for everything in it, that surely can only be a good thing in the long run.

    And please, FCOL, make it a target to get rid of -dev packages. Just put the development files in the main package and have done with it. "Easiest distro on which to compile stuff from source" would not be anything to be sneezed at.

    1. amehaye

      Re. Partitions

      Most users don't want to compile from source. So you are effectively suggesting to add bloat to the distribution.

      1. A J Stiles


        Bloat? Not really, and not as much as you think.

        Back in the day, when processor speeds were measured in MHz, disks in MB, RAM in KB and download speeds in kbit/sec, it was entirely reasonable to trim packages down to the bare essentials, even if that meant making some tasks hard; people generally knew what they were doing anyway. Nowadays, the reverse applies.

        Half the reason why "Most users don't want to compile from source" is that you have to have a bunch of -dev packages installed in order to do it, resulting in a catch-22 situation.

        Compiling a package from source on a well-set-up system *isn't* hard; it takes four predictable commands, or three if you are using sudo. It is made needlessly hard by misconfiguration; some of which is the fault of distro maintainers still living in the past.
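        For the record, those predictable commands are the classic autoconf routine (assuming a typical ./configure-based tarball; yours may vary):

        $ ./configure
        $ make
        $ sudo make install

        Without sudo it's four: the same, plus su to root before the install step.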

        1. The Other Steve

          Meh, you're both right.

          While I agree that most users don't _want_ to compile anything from source, it is an unfortunate fact of a penguin flavoured life that sooner or later you _have_ to.

          When this happens, it would be handy if it just worked.

        2. Nexox Enigma

          Yes and No

          I agree completely on the -dev packages - in many cases the overhead of having 2 packages instead of one outweighs the space consumed by a handful of header files. And it would sure as hell save time. The 'bloat' would be lost in the noise of any package-managed distro. Then again, compiling w/ package management is just a bad combination from the start.

          I mostly disagree on the partitioning ideas though:

          A) It's terrible to spread a filesystem across multiple drives without redundancy. A few distros that I've tried (Fedora 11 comes to mind) default to one massive LVM partition across all of your drives - when you lose a drive, you've not only lost some unknowable amount of files, but your filesystem is hosed, and you'll likely lose all of the files that are still stored on your good drives. If you meant 'run an md-raid5 on multiple different sized disks' then I suppose that's an OK solution. Though multiple filesystems and manual storage management (moving things) would probably be better.

          B) If you are using LVM, there's no problem with too much / too little space, or guessing your partition sizes. You can just start at 10% each and expand them over time, as needed. Decent file systems will do online expansion, so you never have to unmount (handy for /). And if you manage to mess up, you can typically do an offline shrink.
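          As a sketch of that expansion step (volume group and LV names invented; the -r flag asks lvextend to grow the filesystem along with the logical volume):

          $ sudo lvextend -r -L +10G /dev/vg0/home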

    2. Goat Jam
      Paris Hilton

      Compiling from source

      Shirley those who wish to compile from source are capable of invoking;

      apt-get install build-essential

      first, right?

      There is no need to bloat things up for everyone else.

      1. A J Stiles

        It's more than that

        If only it were that simple!

        The "build-essential" package is *just* a bunch of dependencies on the standard build tools (GCC, make and others).

        The source code for libraries (and programs that act a bit like libraries by exposing an API) contains "header files" which describe the functions and constants therein. Once the build is complete, you don't need these header files for normal everyday use of the library. So they can safely be omitted from "libfoo.deb", to save space in the package and so reduce the download time by dial-up modem. BUT if you are ever going to compile another program that makes use of any function or constant defined in the library, you *do* need the headers. So there is a separate package "libfoo-dev.deb" which contains these headers and other files likely only to be of interest to developers (remember, in Linux parlance, anyone who builds anything from source is a developer).

        When you download some spiffy new tarball from SourceForge or somewhere that isn't in your favourite repository yet and it says it needs "libsdl 1.2", what *isn't* obvious is that it almost certainly needs the header files that would have been left lying around by compiling libsdl1.2. So, like a good user, you do

        $ sudo apt-get install libsdl1.2

        and try to build the package. Which then fails to find some header file that was moved out to libsdl1.2-dev and promptly dies, complaining that it can't find libsdl1.2.

        *THAT* is highly counter-intuitive, and it's what propagates the false idea that compiling from source is a black art.

        What you *should* have done was

        $ sudo apt-get install libsdl1.2-dev

        which would have installed libsdl1.2 automatically, as -dev packages are built to depend on the main package. But you didn't know this, because you aren't a hardcore hacker. And now you think compiling from source is beyond the ken of mere mortals like yourself.

        All this could be avoided if the -dev packages were just merged into the main packages. Yes, it would make packages a bit bigger and cause trouble for those installing on ultra-low-spec hardware. But Ubuntu is not meant for that -- although it can be used that way, if you know what you are doing. Ubuntu is supposed to be Linux for the mainstream. And dropping separate -dev packages will benefit mainstream users more than it will inconvenience people who want to do highly-customised installations.

  18. John Sanders


    They should be concentrating on solving the important issues, like the Intel i8xx fiasco that has crippled graphics on hundreds of thousands of laptops!

    But yes, I do understand that from their marketing perspective it is more important to create a marketing image. Creating a lasting, unique GUI appearance requires some differentiating qualities - a nice colour scheme, a cute distinct font, etc. - and that is all good.

    But for God's sake, what good is it when one cannot display all that visual goodness at all, or has to make do with un-accelerated video so slow that it is useless? I know this is all Keith Packard's minions' fault (and not Ubuntu's), but come on - the job of a distro is to pack software that works, and prior to 10.04 there was an Intel driver that was fully functional!

    1. LaeMing

      You seem to be under the impression

      that the people doing the UI design could be retasked to graphics driver programming/debugging without a couple of years of training. I don't think forcing graphic-artists / UI-designers to code low-level drivers would produce a result you would be happy with.

  19. Paul_Murphy

    Ubuntu is supposed to be simple.

    Therefore not involving partitions is what Ubuntu should be doing - if you need different partitions use another distro.

    What _might_ make sense to Ubuntu users would be to have a drive for installing the OS on, and another for the user data, but it should be as simple and straight-forward as it can be.


    1. Dave Bell

      We have the technology?

      There's not much point in multiple partitions on the same physical drive, though it might be useful for some OS-level methods of backup. But it's a long time since we stopped using megabytes on hard drive labels. These days, the people who auto-install Ubuntu aren't going to care. And, a couple of times, I've wondered if geeks are even thinking. When somebody is exulting over a 1TB Squid cache for a MMORPG, have they even thought about how long that will take to fill?

      Oh, I can think of situations where that would be useful. But it isn't something domestic.

  20. Puppy
    Thumb Down


    Catastrophic: installed it on my PC as an upgrade (Ubuntu user since 2006) and the result is:

    GRUB2 DEAD (needs reinstall)


    LIVE CD (needed to repair grub) WON'T START, FULL OF ERRORS (logical, not data)

    Now I cannot access Ubuntu 10.10 as result.

    Should I switch to some other distro, because it seems the n-th attempt to make money on open source is failing?

    1. spegru

      The title is required, and must contain letters and/or digits.

      If the live CD won't work, it's almost certainly hardware (disc or the drive) related.

  21. John White

    I LIKE those pesky partitions

    Come off it, peeps - Windows needs separate partitions more than Linux does: reinstalls need to format the disk and you lose all your personal data (bad upgrades, anyone?). Keep personal stuff in a separate partition and there's no pain for reinstalls - that applies for Windows AND Linux. Yes, I DO add separate ones on my Ubuntu installs - it's all GUI stuff and takes less than five minutes. It saves LOTS of time restoring your user data on a reinstalled base.

    My son (who is Windows-only - 'no games on Linux and it's boring cos it don't break') has separate partitions on his Win machine - tells me it's a gift on his six-monthly reinstalls of XP.

    If disks are so big these days, what's the gripe in giving 12GB to "/" and the rest to "/home"? (I'll leave the swap argument for others to argue over, but personally I think Windows could benefit from one for the pagefile.)

    I've been doing Linux for 13 years and Windows from when it was a DOS shell - some things are not just a habit; they come from innumerable Win95-upwards install/reinstalls, and Linux had an idea worth copying.

  22. Anonymous Coward

    Is it just me...

    ...who doesn't understand all these blowhards going on about "I want this and that" with regard to partitioning in this context, as if they are in some way prevented from achieving it?

    You have a preference. You are able to select that preference with maybe half a dozen to a dozen extra clicks, depending on how OCD-influenced your partitioning scheme is. It is simply not the default.

    Why do you demand that everyone else should do it "your way" and that new users should be confused by the fact that they can't save something to their home directory, or can't install a package, even though their total usage undershoots the capacity of the drive? If they make it partition by default, you will only get many more people bitching that the default is dumb in some way.

    Even with the best-laid plans I have on occasion run out of disk space on a partition and promptly kicked myself for bothering on systems where it's not needed (read: anything that doesn't need user quotas in specific sections of the system, as far as I'm concerned - we have backups for every other corner case, as simply having an intact home partition is only a tiny part of the story anyway on large multiuser systems if you don't have the configs, databases and password files too).

    I've definitely been bitten on systems where /var has filled. I had that just a few weeks ago on one of our backup boxes where, thanks to mlocate being installed by default by the distro and/or the company we rent it from (and appearing benign when you're scanning the installed package list), the 2GB mlocate database plus the 2GB+ 'new' database it generated as it tried to index the extra 1m files we'd added since it last ran filled /var and promptly upset the machine a great deal. If it had all been in one partition (there's really no specific case for it not to be, as all partitions are on the same array), then it would have had 4.5TB to play with, the disk usage would have been identified before it caused an issue, and the guy who was on call wouldn't have been woken up by our backups failing to run.

    No, I'm not very happy about the default partitioning scheme either - /var could have gladly had a hundred gig, but instead I've had to resort to mapping important parts of /var into folders under /home instead. Not ideal, and potentially the same trap that someone will fall into if you foist a defaults-to-partitioned scheme upon them. No thanks.
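    A dirt-simple check along those lines, which would have flagged /var long before the pager went off (the 90% threshold is an arbitrary choice):

    $ df -P | awk 'NR > 1 { if ($5 + 0 >= 90) print $6 " is " $5 " full" }'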

  23. David Gosnell

    Acer Aspire One with SSD

    If anyone tries this on an original SSD Acer Aspire One, please do let us know... 10.04 sucked big time with continual pausing on writing, no matter how many of the documented fixes (noatime, elevator changes etc) were applied. Bizarrely seemed to be graphics related, with suggestions that one of the lighter-weight non-Gnome versions (Kubuntu, Xubuntu) might be better. Sadly impossible to test this aspect from a live USB, and at the moment the dated Linpus Lite remains stable but would happily be replaced if it would be a certain improvement.

    1. copsewood

      @David Gosnell

      I currently use Ubuntu Netbook Remix 10.04 on an AA1 with SSD and it seems pretty stable. I did have problems with Ubuntu 8.04 on the same hardware, which worked OK for quite a while and then suffered a degree of data corruption. I suspect the problem was hardware-related. I only use this system somewhat lightly (travel and holidays, or network testing). I'm not convinced this SSD hardware is as stable and reliable yet as rotating disks. Not sure why you can't test a bootable USB stick with the operating system of choice on this hardware; I was able to try this before installing, though USB is slower than the SSD.

      1. David Gosnell

        USB stick

        Can't test properly with a USB stick because (if it is at least partially the pretty crappy SSD to blame, one of the slightly faster later P-SSD1800 units) the extent of the problems is only apparent once the system is installed to said SSD. The USB stick might reveal general issues with regard to solid-state access, but these things can be very subtle and device-specific, I'm getting the impression.

      2. Steven Raith
        Thumb Up

        AA1 SSD problems, other thoughts

        The SSD in the AA1 was/is notorious for stuttering on small writes - it's a problem with the on-disk controller, and it affects pretty much anything you install on it - no idea how Linpus works around it, other than perhaps operating nearly entirely in RAM and not having any swap in use.

        Using EXT2 rather than a journalled FS helps, but TBH it's a hardware problem with the AA1 setup, not a software/OS issue.

        I've got 10.10 on my Samsung N130; it works nicely, and even picks up the better half's 3G stick with a truly tiny bit of fiddling (eject the virtual CD-ROM, then it picks up the USB modem device and works straight away).

        In fact, I showed it to her, and now she uses it for 95% of everything. The only thing her laptop gets booted into Windows for is hooking it up to the TV, as Ubuntu's HDMI support doesn't seem to want to display at 1368*768, or whatever the telly's native res is - annoying, but not a dealbreaker.

        A worthy update. And I concur on the partition point: most of the people I have shown Ubuntu to, and who have consequently tried it, would shit themselves when asked about partitions. The fact that it will automatically set up a dual-boot partition with virtually no user input (and certainly no user knowledge required other than 'how much space can I use on my HDD?') still utterly fucking astounds me.

        Now, I just need to work out how to get my new iPod classic to sync from Ubuntu...although I still need Windows to restore it, alas...

        Steven R

        1. John H Woods


          Then you won't even have to eject the install volume

  24. Rogerborg

    If you have an ounce of common sense

    You'll be rsyncing /home to your backup storage once an hour anyway. Partitioning on the same drive is like making a safety net out of piano wire: looks great, until you actually need to use it.
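
    The hourly rsync above can be a one-line cron job; a hedged sketch, with the backup mount point (/media/backup) purely hypothetical:

```shell
# Hypothetical crontab entry (added via `crontab -e`): mirror /home
# to an external drive at the top of every hour.
#   -a        archive mode: keep permissions, timestamps, symlinks
#   --delete  drop files from the mirror that were deleted in /home
0 * * * * rsync -a --delete /home/ /media/backup/home/
```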

    Anyway, the question that really needs answering is: which drivers have they regressed *this* time? What exciting WiFi and graphical assplosions can I look forward to? Anyone who's bricked their netbook, raise your hands.

  25. spegru


    I'm no zealot on this and have used OpenSUSE (separate home partition) and Ubuntu/Mint (no separate partition), but I would have thought all this angst about running out of disc space was a bit passé in these days of multi-gigabyte discs! On the other hand, a separate disc for docs, music etc. is very useful for sharing the PC with windose (esp. for newbies who are nervous about going fully over to the light side).

  26. tardigrade

    Date with destiny missed? Not really.

    It's a six month release cycle. Six months! What were you expecting? The article doesn't say. Would you rather wait longer and have a big update like Vista? Because that went really well, didn't it.

    Small and stable updates are good. They keep the momentum going and don't throw the baby out with the bath water. Maverick is a solid release with some excellent improvements and there is only six months to go until the next release. See how that works?

  27. Anonymous Coward
    Anonymous Coward

    Why versions?

    I always wondered why Linux distros even have versions. I mean it's not like you *need* to package up an OS into a 'version' in the Linux world. All it is is a collection of packages and their versions which are grouped together.

    Is it to avoid upgrade issues? But if you have to reinstall a new 'version' don't you have the same issues? Worse than that, you have to do an upgrade with the new 'version' anyway.

    The approach of Arch Linux or Gentoo comes to mind; they are more of a meta-distribution, in that any Linux can run their package manager and end up with essentially the same system.

    1. LaeMing

      I vaguely recall

      Debian discussing a continuous-rolling release between Testing and Stable some time in the past few months.

  28. kneedragon


    Here we have a review of a new shirt, but the reviewer didn't like the collar. So now we have a protracted debate on the merits of broad v narrow collars. Tailors, shop assistants and dry cleaners are getting ready to do battle. What about the rest of the shirt?

    I came on board with Lucid, and have dumped MS for ever. 10.10 seems just like 10.04 with minor detail improvements. It has so far done everything flawlessly.

    I am aware of the benefits of multiple partitions but for the sake of simplicity I originally installed on a single partition, and since then I've only ever done upgrades. It has given me no trouble and I'm not expecting any.

  29. Rob Davis

    Partitioning essential for dual-boot encrypted Ubuntu and Windows work PC

    ...if your work involves testing stuff on different platforms, or you mostly use one OS for development but need the other for applications unavailable elsewhere.

    Encrypted with Truecrypt or otherwise to protect sensitive company data and intellectual property.

    Such a multi-partitioned, multi-boot setup is possible with encrypted Windows and pre-10.04 versions of Ubuntu, which used the simpler GRUB rather than the more complex (but automatically updating) GRUB2 that 10.04 uses.

    If Canonical can get partitioning, multi-boot and co-existence with Truecrypted Windows working, then the on-the-move developer office becomes a more realistic, less risky prospect.

    I've tried to achieve such a setup here, but have not succeeded yet, instead opting for booting Ubuntu from an external drive or using VirtualBox or other VM software in Windows. Hear about my story here:

    An update: the alternate CD should be used for such special multi-boot, multi-partition scenarios, but this may not be for the novice installer.

    1. Someone

      Dual-boot RAID

      Similarly, I wish they’d sort out BIOS-compatible software RAID. When it goes wrong, as a first-time Linux user, you’re left to wade through these two pages.

      It’s not that the underlying Debian distribution doesn’t support it, because it does. Once you’ve learnt how to install GRUB 2 manually, you find that dmraid works perfectly.

  30. Francis Irving

    Hardware support

    Bizarre criticism about partitions, when there is a much larger criticism lurking: the lack of certified hardware support.

    I still can't buy a decent laptop from anyone that is guaranteed to be supported (in full, including suspend/hibernate) for the current and all future versions of Ubuntu.

    Yes, Dell sometimes offers one for a few weeks, and apparently lots of others sort of work. But 'sort of' is not good enough. There are also expensive specialists, which sell a limited range of laptops.

    If only Ubuntu would do a paid service where I can subscribe to have them actually support a particular laptop (like Transgaming, where you can vote and they support a particular game)...

    1. Anonymous Coward
      Anonymous Coward

      Re: Hardware support

      @Francis Irving:

      All the laptops on that page are certified by Ubuntu to work on the respective versions that they list. You're welcome :)

  31. Stephen Bungay

    X has been tinkered with...

    Just installed Ubuntu 10.10 in a virtual machine (VirtualBox). The VBox extensions complain that they are being used on an unknown X server and I'm restricted to a maximum display size of 800 by 600... which is fairly useless. 10.04 has no problem; the extensions work perfectly.

    1. WonkoTheSane

      @Stephen Bungay

      The X server has been bumped to version 1.9. You may have to wait a few days for VirtualBox to catch up.

    2. Someone

      Re: X has been tinkered with...

      This is a known problem. If you use the PUEL version, download VirtualBox 3.2.10. This has a version of Guest Additions that is compatible. From reading the VirtualBox forum, I understood that Oracle held off releasing this until they could check it against the release version of Maverick.

      However, if you’re using VirtualBox OSE, this shouldn’t be an issue. The version of Guest Additions in the Maverick repository has been updated for the change in the X server. In fact, prior to the release of 3.2.10, users of the PUEL version could bodge it by installing virtualbox-ose-guest-x11 from the repository.

      There may be a problem with 3D acceleration in version 3.2.10 of the Guest Additions. It took me two attempts at installing before 3D acceleration worked. At least one other person has reported a problem.

  32. Anonymous Coward
    Dead Vulture


    Why does the layman want a separate /home partition? To stop them filling the filesystem with their bittorrent downloads and crashing the system, yeah, I know. But that setup requires an advanced user to maintain the balance, or you end up wasting hard disk space.

    It makes much more sense to have one partition, warn the user when it's getting full and only allow root to use the last 5%. Silly Register.

  33. Adze
    Paris Hilton

    OTW Win7 upgrade for Vista costs... ?

    About the same as 3 x 1TB Samsung Spinpoint disks. In fact most hardware can be bought more cost-effectively if you leave out the boat-anchor Microsoft licence.

    Disk is cheap, partitioning for data security is largely pointless on single devices.

    Now, a single disk - a small one, say 120GB/160GB/250GB, they're cheap these days - for / is a good idea: if it breaks, replace and reinstall, job done.

    For anything volatile a mirrored pair of inexpensive disks - as above - is pretty much ideal. Obviously a good backup regime is still essential if you're talking about data which is expensive/irreplaceable/time critical. Just pick your disk size to your dataset - if you've 250GB of data and you increase by 50GB a year... buy 500GB or 1TB disks... simple!
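
    The sizing rule above is easy to sanity-check with a little shell arithmetic, using the figures from the comment:

```shell
# With 250GB of data growing by 50GB a year, how long until a
# candidate disk fills up?
data=250
growth=50
for disk in 500 1000; do
  years=$(( (disk - data) / growth ))
  echo "${disk}GB disk: full in about ${years} years"
done
```

    A 500GB disk buys about five years of headroom and a 1TB disk about fifteen - comfortably beyond the useful life of most machines.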

    In terms of time: I recently reinstalled 10.04 from scratch. An hour and 45 minutes after booting from the Live CD I was back to a fully functional system - apps, data, VPNs, mapped drives, scheduled tasks, basically the lot - and that was on three-year-old hardware that was hardly cutting edge when it was bought.

    By contrast, I recently reinstalled a client's 18-month-old MS Vista laptop from the recovery partition of its HDD - 15 minutes for the OS install, which was fast, but 4 HOURS of downloading security updates, patches and anti-virus before I was ready to restore their data backup. In no way can 10.04 be described as time-expensive compared with MS.


  34. kain preacher

    Windows re install

    If you have to reinstall Windows once a month, or 6 times a year, you are doing something wrong.

    1. serviceWithASmile


      you are using windows, that's the problem

      in most cases, "doing something wrong" is normal usage.

  35. This post has been deleted by its author

  36. Richard Lloyd

    Partitioning gripe is nonsense

    In the graphical installer, you're given the choice to "install Ubuntu side-by-side with other OSes" (which, er, requires shrinking an existing [probably Windows] partition and, yes, creating a new one), "erase and use the whole disk" (thankfully not the default!) and "manually partition" (which I always do myself because I have multiple Linuxes on the same drive). If you're just installing Ubuntu and no other Linuxes on the hard drive, then technically you don't need separate partitions, but in reality you probably want the OS to create a swap partition at least equal to your physical RAM.

    Anyway, apart from the total lack of innovation of anything actually useful in 10.10 compared to 10.04, I'm surprised that the disastrous ATI kernel modeset problem *still* hasn't been fixed in 10.10. I have an ATI HD 2600XT and the graphical installer for Ubuntu just goes straight to a blank screen. Sure, pressing F6 and adding "nomodeset" to the kernel boot line will fix it, but who will know to do that?! Recent Fedoras (even including F14 beta) have the same catastrophic bug and no-one seems to be fixing this kernel modesetting issue in the upstream kernel.
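
    If the F6 trick gets you booted, the usual way to make "nomodeset" stick is to add it to GRUB's default kernel command line. The sketch below works on a throwaway copy of the stock Ubuntu line so it is safe to run anywhere; on a real system you would edit /etc/default/grub itself and run `sudo update-grub` afterwards:

```shell
# Work on a disposable copy of the stock default-grub line:
GRUBCFG=$(mktemp)
echo 'GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"' > "$GRUBCFG"

# Append nomodeset to the default kernel command line:
sed -i 's/"quiet splash"/"quiet splash nomodeset"/' "$GRUBCFG"

cat "$GRUBCFG"
# prints: GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nomodeset"
```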

    I think Fedora 14 is going to be a far more interesting release than this near-pointless 10.10 release - there's actually new stuff going into F14 that looks useful, unlike, oh, a new Ubuntu system font which they don't even make the default!

  37. Bob. Hitchen

    Ubuntu Rocks

    I don't know about 10.10, but I've been using Ubuntu for years on everything but PC games and satellite telly. Out of interest I tried 10.04, which came with a PC mag. It worked first time, so I loaded a spare disk in a caddy and installed it. The only thing I've done since is install all the usual Flash stuff. Partitions? Who cares? Get yourself a couple of network disks or USB sticks and COPY your important (to you!) files. Ubuntu now offers a clear alternative for computer dummies.

  38. Sorry that handle is already taken. Silver badge

    Does VNC work with Compiz yet?


    1. Anonymous Coward
      Anonymous Coward

      Compiz over VNC?

      OUCH! That's a whole lot of bandwidth to send down the pipe just so the windows can jiggle like jello along with a nice spinning desktop cube. Compiz is great eye-candy for the desktop... but trying to use it over VNC? Not that it can't be done... but SHOULD it be done?

  39. Henry Wertz 1 Gold badge

    /home is where the heart is

    Wow... the AC at 12:02 actually gave a good reason for a separate /home! That's actually a decent idea: having room to run multiple distros while keeping /home.

    I'm one who just uses one big partition. And I did do installs all the way back to 1993 (Slackware); I always used just one partition. Well, when I had a big disk in a system with a 4GB boot limit, I had a separate /boot (something like 32MB, since it only had to hold a kernel and a backup kernel). I don't want to have to worry about running out of space on one partition while having plenty on the other, and my stuff is already in /home, so I don't feel any need to keep it separated. I can back up /home if I screw things up badly enough to have to do a FULL reinstall (which I have never done - I had a disk crash, and based on the sound it made I think 1 out of 4 heads actually fell off the disk mechanism, but partitioning would not have helped that).

    That said, first, I can't believe the whole thread focuses on that alone. Secondly, good review - I used 10.10 in a VM a bit and the review really does sum it up: there's nothing earth-shattering compared to 10.04; they've just kind of fixed some broken stuff. I think there are some nice changes "under the hood" though.

  40. Henry Wertz 1 Gold badge

    Version numbers

    "I've got 10.10 on my Samsung N130, works nicely, even picks up the better halfs 3G stick with a truly tiny bit of fiddling (eject the virtual CD rom, then it picks up the USB modem device and works straight away)."

    Even that minor issue is actually fixable -- I'm sure I saw something in /etc/hal or somewhere like this that has a list of the 3G cards that show up as a fake CD first, and basically auto-ejects the fake CD.

    "Why versions? " Sorry, but versions are important (and I do say this as a gentoo user). If everything works perfectly, then it really doesn't matter, updating to the latest to greatest automatically is all good. In reality that doesn't happen. Some apps are certified jsut on some version of a distro -- I don't care about "certified" as long as I can make things work but others do. I *have* had things work with one version and break with the next though, meaning without versions I'd be pretty screwed. I've also had cases of newly introduced bugs, where knowing I have a definite baseline to fall back on was comforting. Two examples --

    My Mini uses the dreaded GMA500. It ships with drivers for Ubuntu 8.04; newer Ubuntu versions use newer X.Org versions (with no GMA500 driver included), so without "versions" I'd have to quit upgrading at that point, or risk having the X server ripped out from under my drivers, breaking them. There are separate hacked drivers for 9.04, 9.10 and 10.04, meaning without versions I would have had my video break at least three times.

    Second example: NetBSD. I had a copy running on a NEC MobilePro 770, with a 2GB CF card shoved into it and PCMCIA wireless. After a while I installed the newest NetBSD onto the card, only to find that the CompactFlash support was broken. I tried to install the older one - still broken. As it turns out, these are more branches than actual versions; there's no way to install a definitive "version", it just pulls all the latest updates for that branch, and older updates are generally not even available. So the CompactFlash breakage was backported to every branch I tried.

    1. Anonymous Coward
      Anonymous Coward

      @Henry Wertz

      That's interesting, but there are a couple of flaws in your logic. In the first example it is the updated X.Org that broke your drivers; in a non-versioned environment you would downgrade X.Org again if you found it broke anything, without changing anything else or having to reinstall. Problem solved.

      For NetBSD this does not apply, as it is not a collection of differently versioned packages. Userspace utilities and the kernel are developed together, so it is not possible to update them ad hoc.

    2. A J Stiles

      Any reason why

      Is there any reason why the GMA500 driver from the Ubuntu 8.04 X.Org source tree can't be built under 10.10?

  41. Trixr

    Not that simple...

    My upgrade to the 10.10 beta succeeded in b0rking my GRUB loader, due to its prompting me in the middle of the install about a trivial change I had made (to the screen resolution), and prompting again for the location it should be installed to. Whatever I chose was the wrong choice (I thought it was hd0,0, but whatever) - nothing logical like putting it back where it was without prompting.

    After fixing that little problem (which took several hours), and at least being able to boot into Windows, Xorg is not working. Last time I was at least able to log on to the GUI (even with a vile screen resolution) and install envy-ng to obtain the correct ATI driver. Not now - it struggles for a few minutes and then dumps me back to the shell. It's a Dell XPS that I've had for over a year - hardly bleeding edge.

    I like Ubuntu, but this kind of thing is immensely off-putting - I'm not a Linux expert, but I do know the basics, and if I can't do a simple upgrade, I feel sorry for new users.

  42. Matt_W


    I've been using Ubuntu at home since Dapper and generally have few problems, but for some reason the upgrade option never worked - to go up a version I'd always have to download and burn a CD and install from that.

    This time it worked, and I was delighted. But then when I started it up I got a pretty dramatic freeze - so the thing is effectively bricked.

    Aside: one thing I would like fixed (it might be fixed - I just can't use the bl**dy thing!) is the problem with playing DVDs. Since 10.04 I haven't been able to play DVDs smoothly; I did have to play around with DMA settings to get it to work pre-10, but those changes did nothing.

  43. Greg J Preece


    Not had time to play with Maverick on my laptop until today, so I hadn't noticed the new KNetworkManager. That thing is SWEET! Took them long enough to make a proper network manager, but this time they've really got it right. Nice and clear, easy to use, and optional levels of detail for the techies amongst us. Absolutely love it - a small but superbly done upgrade.

  44. Anonymous Coward

    Pointless article

    Late to the party, but heck, what a totally negative, pointless article.

    As many commenters have noted, partitioning is, for the majority of uses, a thing of the past.

    The author fails to understand just how clever Canonical is being here - it's not about bleeding edge, it's not about appealing to the 'geeks' - it's about offering the masses a quality operating system: a *free* alternative that's easy to install, easy to use and pleasing to the eye.

    I've been following Ubuntu since its inception, and it's always been a case of steady, reliable updates, innovation, embracing and nurturing Open Source and, above all, respecting that there are as many different types of users as there are Linux distributions!

    What irks me the most is the ridiculous title, "date with destiny missed" - who thought that stupid one up? Heck, if you had more reason than just 'partitioning' for this supposed 'destiny', maybe it would sound right.

    Blegh. FAIL.

  45. Mahou Saru

    Well, my lappy only needs 2 partitions...

    Boot, plus an encrypted partition holding an LVM (which contains swap and home)...

    Since encryption = fragile data, I always back it up, so recovery shouldn't be an issue.

    Issues with a full partition? A non-issue, as I don't run as root!

This topic is closed for new posts.

