Backup frustration brought this CTO to forefront of ransomware protection

As CTO of The New York Times two decades ago, Andres Rodriguez became frustrated with the time-consuming and unreliable process of backing up massive amounts of data that was only tested when it failed. That experience led him in 2008 to launch Nasuni, building what has become a cloud-native global file platform that does away …

  1. Paul Smith

    ZFS?

    Is that not just a proprietary version of ZFS?

  2. Anonymous Coward
    Anonymous Coward

    I am old enough to have used the old DEC systems that always versioned files when altered. I wish Windows had that as an option. It would not provide the resilience of read-only remote duplication, but it does save time when recovering from a whole host of issues, which is why things like SharePoint offer a versioning system in the file history, shitty as it is.

    1. David 132 Silver badge
      Pint

      Ah! A person after my own heart!

      Yes, my first thought on reading “…a constantly-versioning file system…” was “oh, just like VMS then” with its automatic

      THESIS.DOC;1

      THESIS.DOC;2

      …etc.

      Have a pint of mild and let’s not be bitter.

  3. Doctor Syntax Silver badge

    "the time-consuming and unreliable process of backing up massive amounts of data that was only tested when it failed....a cloud-native global file platform that does away with traditional backups and instead constantly creates new versions of files that are not shipped to a backup system but instead are kept on the cloud-based platform. In addition, everything is managed – both in the cloud and on-premises – via the platform."

    If I were in the market for something like this I'd walk away at this point.

    As CTO wasn't it his job to test the backup restore process? If he had, he'd have made it more reliable and probably less time-consuming. And maybe the description under-sells it - it probably does - but if I read it right there's a single platform containing all versions. Lose that platform and...

  4. Anonymous Coward
    Anonymous Coward

    Let's see: the old files are still held on the nebulous "cloud platform". If they are on the same media then they are equally vulnerable to attack or loss; if they are on separate media then welcome to "overly complicated backups by another name".

    The whole point of off-site backup is that you have a recovery point even if the whole datacenter is destroyed.

    Yes, I remember file versioning (I too am an old VAX/VMS alumnus), but it wasn't always as useful as you wanted, wasn't always appropriate, and burned through space.

    1. Greybearded old scrote

      Space is cheap, data is precious.

  5. VoiceOfTruth Silver badge

    "to version the data within the file system"

    You mean like snapshots?

    I recovered some systems a few years ago that held a lot of data on NetApps. Average users did not have direct access to the systems, only via mounts from Windows machines. One day, somebody got infected with some malware/ransomware and files started being encrypted. By the time it was noticed a day or so had passed.

    "Can you get our files back?", was the question. "I can the whole file system back to how it was yesterday, pre-infection. And for many files I can put them back based on the hourly snapshots", was my reply. "Good enough"...

    ZFS can do the same, as Paul Smith writes above.
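
    A minimal sketch of that kind of snapshot-and-rollback recovery in ZFS terms, wrapping the zfs command-line tools from Python. The dataset name and snapshot labels are hypothetical, and it assumes hourly snapshots are already being taken by cron or a similar scheduler:

      import subprocess

      DATASET = "tank/shared"  # hypothetical pool/dataset name

      def take_snapshot(label: str) -> None:
          """Create a read-only, point-in-time snapshot (e.g. from an hourly cron job)."""
          subprocess.run(["zfs", "snapshot", f"{DATASET}@{label}"], check=True)

      def list_snapshots() -> list[str]:
          """Return snapshot names for the dataset, oldest first."""
          out = subprocess.run(
              ["zfs", "list", "-H", "-t", "snapshot", "-o", "name", "-r", DATASET],
              check=True, capture_output=True, text=True,
          )
          return out.stdout.splitlines()

      def roll_back(label: str) -> None:
          """Revert the live filesystem to the chosen snapshot.
          The -r flag also destroys any snapshots taken after it, so pick carefully."""
          subprocess.run(["zfs", "rollback", "-r", f"{DATASET}@{label}"], check=True)

      if __name__ == "__main__":
          print(list_snapshots())
          # roll_back("hourly-2024-05-01T09")  # once you know the last clean point

    Picking the right snapshot is the hard part; the rollback itself is near-instant, since nothing is copied and the filesystem simply stops referencing the newer blocks.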

  6. Greybearded old scrote

    Yeah

    Snapshots are great, but you still need more than one copy. Otherwise what do you do when your disk array/server/data centre goes boom?

    Anyone know if the InterPlanetary File System (IPFS) is any good? Looks interesting.

    1. VoiceOfTruth Silver badge

      Re: Yeah

      In our case the NetApps were replicated to another site. Not quite real time, about 5 minutes behind the 'live' system.
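
      The ZFS analogue of that sort of near-real-time replication is an incremental send/receive between snapshots, run every few minutes. A rough sketch, with the dataset names and SSH host being hypothetical stand-ins:

        import subprocess

        SRC = "tank/shared"        # hypothetical local dataset
        DST = "backuppool/shared"  # hypothetical dataset on the remote box
        REMOTE = "backup-host"     # hypothetical SSH alias for the second site

        def replicate(prev_snap: str, new_snap: str) -> None:
            """Ship only the blocks that changed between two snapshots to the remote site."""
            send = subprocess.Popen(
                ["zfs", "send", "-i", f"{SRC}@{prev_snap}", f"{SRC}@{new_snap}"],
                stdout=subprocess.PIPE,
            )
            # -F rolls the receiving dataset back to its latest snapshot before applying the stream
            subprocess.run(
                ["ssh", REMOTE, "zfs", "receive", "-F", DST],
                stdin=send.stdout, check=True,
            )
            send.stdout.close()
            if send.wait() != 0:
                raise RuntimeError("zfs send failed")

        # replicate("sync-0930", "sync-0935")  # e.g. driven by a five-minute cron job

      Because only the delta between snapshots travels, a five-minute lag like the one described above is achievable over a modest link, and the remote copy is a genuine second copy on separate hardware.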

  7. DrG

    Any particular reason this wasn't tagged as an advertisement?

    See title.

    1. MiguelC Silver badge
      Happy

      Re: Any particular reason this wasn't tagged as an advertisement?

      It is tagged as an interview with the company's CTO; what did you expect?

      1. DrG

        Re: Any particular reason this wasn't tagged as an advertisement?

        Well-thought-out criticism of a backup solution that claims to be magic, with a fair dose of sarcasm.

        ...the usual routine, in other words.

  8. Anonymous Coward
    Anonymous Coward

    Hardly a magic bullet

    While using a good filesystem which provides versioning and/or snapshotting might provide fast recovery when hit by ransomware, putting all your data on someone else's servers on the Internet exposes you to data loss, data theft and connectivity problems.

    When the proverbial brown stuff hits the fan and you can't access your data stored "somewhere" on the Internet, whether the loss is caused by ransomware, a poorly thought out "upgrade" by your storage provider or a JCB cutting your fibre, you still don't have access to your data.

  9. osxtra
    Linux

    Quacks Like A Duck

    So far as "modern" backup systems go...

    2005: Git, an open-source, on- or off-premises versioning backup system for files, is released.

    2006: Carbonite, a closed-source, partially versioned off-premises backup system for files, is released.*

    2008: Nasuni, a closed-source, off-premises versioning backup system for files, is released.

    Per the article, Nasuni versions files which are "not shipped to a backup system but instead are kept on the cloud-based platform."

    A rose by any other name. Sure sounds like a backup.

    Like many, I use git for tracking changes to website content. One such site holds the product catalog and sheet PDFs for a medium-sized wholesaler. Running for ten years now, the "objects" dir of that repo is around 13 gigs. Current content on the site itself is nearly 2 gigs (a good chunk of that being zips of hi-res product images and collections of PDF spec sheets, along with the individual spec, catalog and product sheet PDFs, all of which have of course changed over time).

    Sometimes we need to go back and find out when a section of the website verbiage, or some PDF or ZIP, changed. Thanks to diligent use of commit comments, that's easy when we're looking for a specific price, image or verbiage update. Just give Git a commit hash and restore the chosen file somewhere, then compare it to today's copy.

    A much harder prospect is finding out when your system got infected. Exactly which files do you inspect for changes? That can end up being a lot of diff.
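
    A rough illustration of both of those, wrapping the git command line from Python; the path and commit hash are hypothetical, purely for the sake of the example:

      import subprocess

      def commits_touching(path: str) -> list[str]:
          """One 'hash  date  subject' line per commit that touched the file."""
          out = subprocess.run(
              ["git", "log", "--follow", "--date=short",
               "--format=%h  %ad  %s", "--", path],
              check=True, capture_output=True, text=True,
          )
          return out.stdout.splitlines()

      def file_at(commit: str, path: str) -> bytes:
          """The file's contents as they were at the given commit."""
          out = subprocess.run(
              ["git", "show", f"{commit}:{path}"],
              check=True, capture_output=True,
          )
          return out.stdout

      if __name__ == "__main__":
          for line in commits_touching("catalog/widget-spec.pdf"):
              print(line)
          old = file_at("a1b2c3d", "catalog/widget-spec.pdf")
          new = open("catalog/widget-spec.pdf", "rb").read()
          print("changed since that commit" if old != new else "identical")

    For the "which files changed since the suspect date" question, git diff --stat <commit> against the working tree at least narrows the list of candidates before you start diffing in earnest.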

    If you keep more than one copy of your backup based on time, it's versioned. The idea of an individual, file-based method rather than monolithic copies of your entire dataset is a good one, in that restoration can be much faster, but it's still just a backup. Also, no matter which method is used, in practice it could prove problematic determining exactly when the "good" data turned "bad", i.e. from what point you'd need to restore.

    It would also be interesting to see just how much overhead there is with this per-file method, whether the content is hosted on some shared network drive or locally on individual workstations.

    (It's assumed a shared drive would be on "stronger" hardware, but depending on what else the target machine is hosting, the load could still be pretty large. With the above-mentioned website on my dev machine, and not counting actual upload time, it still takes some seconds these days for Git to sync everything when pushing updated content to the website repo.)

    * Not harping on Carbonite, just an example of one of the myriad cloud-based backup systems out there, and an early one for the WinDoze / consumer world. As others have posted, this concept has been around a while. According to their website, Carbonite keeps a single Daily version of files - not each and every change, presumably just the last change made that day - three weekly versions of the Daily, and two monthly versions of the Weekly. So, provided the infection happened within the past couple of months, you can recover...
