Backup: It really should be easy

Backup's easy; people have been doing it for years. All you need is a tape drive and software. On the other hand: backup is terrible; it takes too long and tapes fail. Really though, how hard can it be to do this seemingly simple thing: protect your data against hardware failures and file deletion/corruption? The ideal backup …


This topic is closed for new posts.
  1. Anonymous Coward

    Cloud fail

    "As businesses migrate some or all of their data to the cloud, they will be less involved in the details of backup"

    Ah yes, the head-in-the-sand approach to backup.

    You're probably ahead of me here, but the issue isn't that it's no longer your problem: it still is (the site's offline, your boss is angry, etc), but it's that now you are out of the control loop. Assuming you can get hold of the cloud provider to plead for a quick recovery, you are now just one of many of their clients, who are likely in the same situation. They'll get round to you in due course.

    But that's okay. You have an SLA!

    Quick, show that to your boss while you wait for them to get round to restoring your system. They'll probably give you an extra month's service if it takes a while. What more could you want?

    1. Anonymous Coward

      RE: Cloud fail

      Agreed, but the reality is that, as well as outsourcing some of the pain-in-the-ass server management, it makes it very, very easy for IT types to blag their boss by saying something like....

      "Yeah, it's not a problem at our end, the service provider is looking into getting their systems back up and running asap. I've already spoken to them, they are going to have it back up as soon as they can. In fact, I'm really not happy with their performance over this, blah blah blah"

      ... if there is ever a problem and the truth is, 9 times out of 10, it will work.

      1. Tom Maddox Silver badge


        "if there is ever a problem and the truth is, 9 times out of 10, it will work." It's that 10th time that you want bulletproof data protection, and the discerning IT practitioner will want to ensure that the methodology is well-known and tested, which is hard to do when the data is "in the cloud."

      2. Florian Hwigl

        9 times out of 10

        And the truth is that's why 9 IT guys out of 10 don't get a job in any real non-cloudy datacenter.

    2. maclovinz
      Big Brother

      It's all part of a grand cycle...

      ...which is why I believe people will GO to the cloud - some of them - and then will want their data BACK under their control within the next 25 years, etc, etc, etc.

      Back and forth it has gone: terminals to PCs, thin clients to workstations... etc.

      With all the conspiracy nutjobs ("theorists") out there, you'd think they wouldn't even have an internet connection at all!

  2. Anonymous Coward


    If your data is "in the cloud" it's still going to need to be backed up, and if you outsource something you'd better understand it very well indeed, so that you can tell whether your outsourcing partner is actually handling your services properly. Backup and recovery of clouds is probably more complex than backup and recovery on standard systems, due to the mobile nature of the data.

  3. Anonymous Coward

    You're 'aving a laff aren't you ?

    What kind of insane person is going to trust the very existence of their business (which is what is at risk in the event of complete dataset loss) to some other business, and to the small print of the service contract?

    Any such person you can find deserves to go out of business, because anyone who is stupid enough to gamble their own employment, the employment of their staff, and their commitment to meet customers' demands on A. N. Other's virtualised environment must be a bloody fool.

    The Stop icon, because it's about time this entire 'cloud' farce was put into some sensible context, and people stopped floating stupid ideas like businesses gambling their existence, with idiot notions of abdicating responsibility for their data.

    1. Anonymous Coward

      RE: You're 'aving a laff aren't you?

      Hah...... Hah

      MS CRM 2011: Requirements (self hosted)

      Windows 2008 Server

      MS SQL 2005 Standard

      CRM 2011 Software

      + 1 Meaty server to run it all on

      + Set up costs

      + Backup infrastructure

      I can't be arsed working out the costs, but you wouldn't have much left from 10 grand if you set all this up with any kind of resilience in mind.

      MS CRM 2011: Requirements (Online "Cloud" version)

      £29 per user per month.

      And nothing else.

      For a small business, this cloud stuff is a godsend.

      1. maclovinz

        Godsend, really? Basic Math Says NO

        Small Business of 50 Users.

        You say £29/user/month....ok, let's go through this.

        So, £29 x 50 users x 12 months = £17,400 ....whoa! And that's just for one year.

        The solution he mentioned was still going to be under £15K TOTAL, and will most likely run for more than 12 months....say, a three-year lifespan.

        Sooo...the local solution will cost a ~£15K TOTAL one time charge. In most businesses, at that cost, this item can be claimed as an asset, thus allowing for insurance coverage in the right setting. Also, some small businesses DO handle sensitive information, and if someone could hack that cloud provider, then the COMPANY, not the CLOUD provider, would be liable....Yes, the cloud provider SHOULD be responsible, but we all know how the LAW has kept up with TECHNOLOGY. Until this is changed, expect nobody handling sensitive data to give a rat's ass about the cloud.

        Or, you can pay £17,400/YEAR for the cloud version, thus making the TCO with a three-year lifespan roughly £52,200. Compared to ~£15K for an onsite solution...hmm...

        If the one-time fee is too high, then perhaps these smaller business should actually have a BUDGET PLAN.

        I just don't see it.
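        To sanity-check the arithmetic above, here is a minimal sketch; the prices, user count, and three-year lifespan are the figures quoted in this thread, not authoritative numbers:

```python
# Rough TCO comparison using the thread's own figures (illustrative only;
# real licensing, hardware, and staff costs will differ).
users = 50
cloud_per_user_month = 29      # £29/user/month, quoted for MS CRM Online
onsite_total = 15_000          # ~£15K one-off, quoted for the self-hosted stack
lifespan_years = 3

cloud_per_year = cloud_per_user_month * users * 12   # £17,400/year
cloud_tco = cloud_per_year * lifespan_years          # £52,200 over 3 years

print(f"Cloud:  £{cloud_per_year:,}/year, £{cloud_tco:,} over {lifespan_years} years")
print(f"Onsite: £{onsite_total:,} one-off")
```

        Note the break-even point: the cloud subscription passes the quoted one-off cost inside the first year at this headcount, which is why the comment below about 5-15 employee businesses matters.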

        1. Simon Peters 1

          Bit more than that...

          If you follow the letter of the Microsoft Licensing bit, and are paying the better part of 7-800 quid for a CRM and SQL license per seat, you're looking at £35-40,000 in licensing costs for 50 users, and that's before you factor in the development time and hardware. Then the cloudy version starts making a bit of sense. Robbing bastards either way though!

          1. Anonymous Coward

            Small business != 50 employees

            I was thinking more in the realms of 5-15 employees myself; if a business has 50 employees, then it can probably afford a server and associated infrastructure.

          2. Ben Goodyear

            And even more

            Datacentre space to put the servers,

            Power to run them,

            Maintenance on hardware, software, licenses,

            Expertise / staff to operate the CRM, SQL etc,

            Spare staff to cover sickness/holidays etc,

            Cloud isn't right for everything/everyone, but it certainly makes sense for some apps and some circumstances.

  4. Danny 14

    on the other end

    It is quite difficult for the smaller business to implement a strategy. Cloud-based is out, as the raw internet connection may not be available. Most of my clients live in rural Cumbria, where ADSL is a blessing. Since MS removed a proper backup program (ntbackup is still somewhat usable if you know what you are doing, but VSS is broken), smaller businesses need to spend hundreds on a decent backup program.

  5. JeffyPooh

    A peck of portable 1TB drives...

    Copy *.* P:/ each night. Cycle the drives around in the usual inner- and outer-loop schedule. Store some off-site.

    Better, faster, cheaper.
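    The nightly copy-with-rotation scheme above can be sketched roughly like this; all paths, the five-drive count, and the rotation rule are illustrative assumptions, not a prescription:

```python
# A sketch of "copy everything to a portable drive each night", with a simple
# rotation so drives can be cycled and some kept off-site. Paths and the
# rotation rule are made up for illustration.
import shutil
from datetime import date
from pathlib import Path

DRIVES = [Path(f"/mnt/backup{i}") for i in range(1, 6)]  # the "peck" of 1TB drives

def tonight_drive(today):
    """Inner loop: pick tonight's drive by rotating through the set."""
    return DRIVES[today.toordinal() % len(DRIVES)]

def run_backup(source, today=None):
    """Copy the whole source tree onto tonight's drive, under a dated folder."""
    today = today or date.today()
    target = tonight_drive(today) / today.isoformat()
    shutil.copytree(source, target, dirs_exist_ok=True)
    return target
```

    A full-tree copy every night is the blunt instrument version; an outer loop (say, taking the first-of-the-month drive off-site) covers the "store some off-site" step.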

    1. Florian Hwigl

      usb drive scalability

      OK. Now please describe how you do that for replacing, like, 5000 LTO-4s in a jukebox.

      I guess daisy chained USB hubs would be really clever?

      I promise we won't go looking at the disk failure rate or the higher operations costs.

      Oh, and yes, at home I do just that: External USB drive containing backup server + backup data, boots as vm as long as it's attached to the xen host, and will boot as normal fedora when the xen host happens to die.

      It's just that I can grasp the difference between risking a whole company and risking my pr0n archives.

  6. Anonymous Coward


    Does the author actually know what a computer is?

  7. AndrueC Silver badge
    Thumb Down

    Backup is easy?

    Since when? Every piece of backup software I've had to install and run (which is a lot because I have to write software that supports various formats) has been a pain in the arse. Weird configuration requirements (one required us to spend a day reworking one of our test domains to get the security sorted), bizarro land UI layout. Spontaneous server failures (thanks for that NetBackup). Wacky error messages - one of them (after allowing us to select the set from a list it had found) responded with 'No such backup'. I mean, FFS - you generate a list of backup sets we can restore from then tell us you can't find the one we selected.

    It's astonishing that such a crucial aspect of IT is hidden behind esoteric UIs and flat-out bad coding.

    Thank God I only have to read the data and not actually rely on the systems.

    1. Jan 0 Silver badge
      Paris Hilton

      Down to earth CLIs make it easier

      In the case of Netbackup, behind the GUI is an extremely useful, comprehensive and versatile set of commands.

      Paris, because I'd like access to her commands.

      # paris -help

      Entity not found


      1. AndrueC Silver badge


        Yeah I ended up delving into that CLI stuff eventually. Unfortunately despite my best efforts one of our test servers just didn't want to know. One of the services wouldn't restart for love nor money after a while.

        It may be that as I'm not an IT administrator I just don't 'get it' well enough to use these packages. All I wanted to do was install them, backup some stuff then write my own reading code. I think most packages required at least a day to install, configure and debug to get running :(

        I think NB was also the one that refused to let us re-use its tapes. We got some really funky messages when trying to erase them and eventually gave up and fed it some old ArcServe tapes instead. It seemed to accept them as being ready for use :-/

        1. Anonymous Coward


          NetBackup is one of the good ones - you should see Networker: the UI makes you want to put your eyes out, it reports things that aren't happening, and it's flaky as hell... I could go on...

          As it happens, there are a couple of reasons you wouldn't be able to overwrite a tape in NBU: first, you have to configure it to be allowed to overwrite different tape headers. Then you would use 'bpexpdate' to expire tapes or individual backup sets. If that doesn't work there is another command to force a tape out of the catalogue database - I think it's one of the 'volume manager' commands, but it's a good six years since I used NBU so I can't remember it. The support people at Symantec should be able to tell you off the top of their heads; if not, the internet will.

  8. Anonymous Coward

    You wanna talk about poorly-written backup software?

    How about enterprise-level backup software that can't restore a file where the total length of the original pathname is more than 240 characters?
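    A limit like that suggests a cheap pre-flight check: walk the tree before backing it up and flag any path a restrictive restore tool might refuse. The 240-character figure is the commenter's; substitute your product's actual limit:

```python
# Find file paths that exceed a backup/restore tool's path-length limit.
# The default of 240 is the figure quoted in the comment above.
import os

def overlong_paths(root, limit=240):
    """Yield full file paths under `root` longer than `limit` characters."""
    for dirpath, _dirs, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            if len(full) > limit:
                yield full
```

    Running this before the backup, rather than discovering the limit at restore time, is the whole point.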

  9. Anonymous Coward

    Should be but isn't - blame the software

    e.g. error messages such as "Backup failed", which most of the time with my setup seems to mean "backup not 100% successful" (it missed a small bit) - but you get the same message for a disaster, e.g. when the drive ate a tape.

    And a lack of guidance in the manuals from companies that call themselves experts. Try to find a recommended backup strategy written in plain English.

    1. AndrueC Silver badge

      Damn straight

      It also scared me when I went to support forums. Weird error messages that only seemed to affect some people. Even more weird solutions that, again, only helped some people. Comments along the lines of 'Oh dear. We'll make a note of that for possible inclusion in the next release'. Oh and turning on logging was always good for a laugh. NB in particular does the same things over and over again. I think it asks for the NetBios machine name a dozen times while preparing to restore data.

      The impression I got from all of them was of software that had been cobbled together over the years from various sources and that only worked by some Heath-Robinson style miracle. Not very inspiring :(

  10. -tim

    That isn't backup?

    There are 5 distinct classes of backup, ranging from "oops, I deleted my presentation" to "someone from the government is here looking for some records from 6.99 years ago". Disaster recovery is not the same as "wow, that power supply blew up real good! Let's rebuild from its backup on a new box".

  11. Britt Johnston

    cloud backup = rain recovery

    It is easy to get wet, but difficult to predict or control.

    Icon = recovery button

  12. Anonymous Coward

    So you think you have backup down pat, eh?

    Now try a restore. A what? A restore. You know, trying to get that backed-up data back from the back-up. Pick out the right bits. Or just everything, on an empty system, devoid of even an OS. What's that great software suite environment thing doing for you now?

    1. Ammaross Danan


      Very good point! The cloud people are great in that they provide backups (hopefully) for you. Now, just what point in time do you want them to restore from? Good luck. User X deleted a NEEDED file in the middle of last month... you use it to generate month-end stuff, perhaps. Will CloudX be able to restore it? Do they keep month-ends? Do they keep your required 7-year backups? Perhaps. Back it up locally, you say? Why? You have the cloud, remember?!

      Now the serious stuff: I've used several backup packages, such as ArcServe (ick) and BackupExec (tape libraries, with severe overwrite issues). Very disheartening to check on a backup only to find it was aborted for trying to write over a previous backup, or for writing to a tape not in its library (a clean tape, even). Anyway, compound that with the crap speed of the LTO drive we had, and ick.

      Solution? Moved to disk-based backups. External 1TB drives cost about as much as our tapes did, perhaps slightly more. Benefit? Faster backup times. Perhaps even better resilience. Definitely better random-file recovery. As for backup software? Ever try restoring a file from an old version of ArcServe? Does it even run on WinSrv08? What if you misplaced the license key or the software? Install disk damaged? Granted these are worst-case ideas, but you can't expect your install disk of BackupExec9 to be available when an earthquake causes a box to land on your stack of disks.... Ever try restoring a backup with a different/newer version of the software? Doesn't always work, unfortunately. I tend to agree with the copy *.* P:/ solution mentioned earlier, but as stated it is still a crap idea; Robocopy or SyncToy would have been a better suggestion. Not to mention the potential need to encrypt any data that goes off-site. This is where backup software starts to look better.

      Really, it all comes down to need. Do you need to have your data back online in under 5 minutes? Are your backup windows starting to converge? I ran into a problem a while back of having our weekend full backups approach 72hrs runtime (yes, it was across a T1, not on-site, but the data needs to go to where the IT people are most of the time), which starts to conflict with Monday morning, as you can imagine. A switch to disk, and changing the backup software/method, cut that time considerably (would you believe down to 15min?). As for Continuous Data Protection, it's a great idea. Recovering a file from 5 minutes ago is quite useful, as is having a historical archive for the past few days/weeks. The backup system I have in place now uses file-level deduplication between backup sets, which isn't as good as block/bit-level, but still allows for hourlies/dailies for the past couple of weeks, and weeklies and monthlies spanning back a good year - all accessible at a moment's notice, rather than having to phone the cloud or run to the safety-deposit box for tapes.
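      The file-level deduplication mentioned above can be sketched as a content-addressed store: each distinct file content is kept once, keyed by its hash, and every backup set's manifest just maps paths to hashes. This is a toy illustration; the layout and names are invented, not the commenter's actual product:

```python
# Toy file-level deduplication: identical file contents are stored once,
# keyed by SHA-256 digest; each backup set is just a {path: digest} manifest.
import hashlib
from pathlib import Path

def backup_set(source, store):
    """Return a {relative path: digest} manifest; copy only unseen content."""
    source, store = Path(source), Path(store)
    store.mkdir(parents=True, exist_ok=True)
    manifest = {}
    for f in sorted(source.rglob("*")):
        if not f.is_file():
            continue
        data = f.read_bytes()
        digest = hashlib.sha256(data).hexdigest()
        blob = store / digest
        if not blob.exists():        # content already held by an earlier set?
            blob.write_bytes(data)   # no: store it once
        manifest[str(f.relative_to(source))] = digest
    return manifest
```

      Because unchanged files hash to blobs that already exist, hourly and daily sets cost little more than their manifests - which is what makes the long retention described above affordable.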

      1. J. Cook Silver badge

        Damn Skippy!

        I am the backup admin for our company. The thing I do most often, as far as data restores with our backup software go, is pulling some poor schmuck's email datastore from N days ago and exporting it out to an external file for their client program to look at. As far as file recovery goes, we rely somewhat heavily on our SAN's snapshot feature and Windows Shadow Copy, which makes recovery of files almost trivial - unless it's a compliance/legal-discovery type of recovery, or it's a really old file, or the data centre gets hit by a very large lorry and burns to the ground.

        While it'd be cool to back up our data to the cloud, my company can't for various legal reasons, unless we _build_ our own cloud.

        1. AndrueC Silver badge

          Restrictions apply :)

          I could recommend the product I work on for restoring email backups - that's basically the entire reason it exists. It can pull data out of backup files without needing the original application, and then lets you browse the Exchange data without having Exchange installed. Either extract to PST or send it straight to the server.

          Trouble is I'm not sure the admin will let me mention it. I'll try though. It's called PowerControls.

  13. Anonymous Coward

    Physical backup

    I'm of the opinion that if it's important enough to back up, then print off 3 copies and keep 2 of those copies in separate locations in a fire/water/bomb-proof lockup (or bunker).

    Most generated bits/bytes are worthless junk not worthy of backup!

    Of course the biggest pain in the butt is databases and those nasty accounting applications that work on obscure cryptic files nobody can read or easily copy unless copied in their entirety - and whose makers lock you into a system of yearly upgrade cycles by way of ransom (I'm looking at you, Sage).

    1. Anonymous Coward

      Print it off?

      I'll do that with our email archive, it's only the 300TB...

  14. Steve Davies

    Put all your eggs in someone else's basket(s) - no risk there then.

    Just like any outsource agreement that covers your data and its availability: negotiate access to all their DR testing records, notification of the schedule of ALL their future tests, and the right to attend and observe on the day - unannounced.

    Make the contract contingent on this and on their recovery working.

    And then do it.

    Better yet do it before you sign :)

    Smaller companies may not have the clout (or the knowledge) to do this but the bigger ones can and should.

    Keep 'em honest. It's your business, not theirs.

    And FFS work out the TCO! (Should that be the TCC nowadays?)

    1. Anonymous Coward

      Re: Put all your eggs in someone else's basket(s) - no risk there then.

      "Just like any outsource agreement that covers your data and its availability: negotiate access to all their DR testing records, notification of the schedule of ALL their future tests, and the right to attend and observe on the day - unannounced. ... Smaller companies may not have the clout (or the knowledge) to do this but the bigger ones can and should."

      That won't work (as indeed I think you're suggesting).

      Small companies (who arguably might benefit most) won't, as you say, have the clout, and bigger ones might just as well manage the infrastructure in-house.

      After all, if you need to be that involved to prove they are to be trusted, you're going to have the expertise, and you might as well take control.

  15. Federica Monsone

    From Eran Farajun, EVP, Asigra

    Precisely, and we couldn’t agree more! The reality that backup should be easier, but actually is not, is yet another reason that skills and procedures around performing backups are declining and being out-tasked to cloud service providers. This stubborn IT task called backup has been the bane of IT administrators for decades. Newer technologies and methods have come to market, like new exercise equipment, but at the end of the day consistency and attention to detail are the key ingredients. Working with the right qualified cloud backup service provider is like joining the gym and having the trainer do the exercise for you, while you get all the health benefits. Technology matters, but actually putting it into practice consistently over time is what makes it work. Consistency is the hardest part of backup.
