Is that not just a proprietary version of ZFS?
As CTO of The New York Times two decades ago, Andres Rodriguez became frustrated with the time-consuming and unreliable process of backing up massive amounts of data that was only tested when it failed. That experience led him in 2008 to launch Nasuni, building what has become a cloud-native global file platform that does away …
I am old enough to have used the old DEC systems that always versioned files when altered. I wish Windows had that as an option. It would not provide the resilience of read-only remote duplication, but it does save time recovering from a whole host of issues, which is why things like SharePoint bolt a shitty versioning system onto the file history.
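For anyone who never met it, the VMS behavior being described (every save keeps the prior copy as FILE;1, FILE;2, and so on) can be mimicked in userland. A toy sketch, assuming a POSIX filesystem that allows ";" in names (real VMS did this inside the filesystem, not in application code):

```python
import os

def versioned_write(path, data):
    """Write `data` to `path`, first preserving any existing copy
    as `path;N` in VMS style. Purely illustrative."""
    if os.path.exists(path):
        n = 1
        # Find the next free version number for the old copy.
        while os.path.exists(f"{path};{n}"):
            n += 1
        os.replace(path, f"{path};{n}")
    with open(path, "w") as f:
        f.write(data)
```

Every overwrite then leaves the previous contents recoverable, at the cost of ever-growing disk usage — exactly the trade-off mentioned below.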
"the time-consuming and unreliable process of backing up massive amounts of data that was only tested when it failed....a cloud-native global file platform that does away with traditional backups and instead constantly creates new versions of files that are not shipped to a backup system but instead are kept on the cloud-based platform. In addition, everything is managed – both in the cloud and on-premises – via the platform."
If I were in the market for something like this I'd walk away at this point.
As CTO, wasn't it his job to test the backup restore process? If he had, he'd have made it more reliable and probably less time consuming. And maybe the description under-sells it - it probably does - but if I read it right there's a single platform containing all versions. Lose that platform and...
Let's see: The old files are still held on the nebulous "cloud platform". If they are on the same media then they are equally vulnerable to attack or loss; if they are on separate media then welcome to "overly complicated backups by another name".
The whole point of off-site backup is that you have a recovery point even if the whole datacenter is destroyed.
Yes, I remember file versioning (I too am an old VAX/VMS alumnus), but it wasn't always as useful as you wanted, wasn't always appropriate, and burned through space.
You mean like snapshots?
I recovered some systems a few years ago that held a lot of data on NetApps. Average users did not have direct access to the systems, only via mounts from Windows machines. One day, somebody got infected with some malware/ransomware and files started being encrypted. By the time it was noticed a day or so had passed.
"Can you get our files back?", was the question. "I can the whole file system back to how it was yesterday, pre-infection. And for many files I can put them back based on the hourly snapshots", was my reply. "Good enough"...
ZFS can do the same, as Paul Smith writes above.
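For readers who haven't driven it, the ZFS equivalent of those hourly NetApp snapshots looks roughly like this (the pool, dataset, snapshot, and file names here are invented for illustration; this is an ops fragment, not a runnable script):

```shell
# Take a read-only, point-in-time snapshot (cheap: copy-on-write).
zfs snapshot tank/home@hourly-1400

# See what restore points exist.
zfs list -t snapshot -r tank/home

# Pull one clean file out of a snapshot without touching anything else...
cp /tank/home/.zfs/snapshot/hourly-1400/report.doc /tank/home/

# ...or revert the whole dataset to the pre-infection state.
# (-r destroys any snapshots newer than the rollback target.)
zfs rollback -r tank/home@hourly-1400
```

Because snapshots are read-only, ransomware running as an ordinary user can encrypt the live files but not the snapshot copies — which is what made the recovery described above possible.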
While using a good filesystem which provides versioning and/or snapshotting might provide fast recovery when hit by ransomware, putting all your data on someone else's servers on the Internet exposes you to data loss, data theft and connectivity problems.
When the proverbial brown stuff hits the fan and you can't access your data stored "somewhere" on the Internet, whether the loss is caused by ransomware, a poorly thought out "upgrade" by your storage provider or a JCB cutting your fibre, you still don't have access to your data.
So far as "modern" backup systems go...
2005: Git, an open-source, on or off-premise versioning backup system for files, is released.
2006: Carbonite, a closed-source, partially versioned off-premise backup system for files, is released.*
2008: Nasuni, a closed-source, off-premise versioning backup system for files, is released.
Per the article, Nasuni versions files which are "not shipped to a backup system but instead are kept on the cloud-based platform."
A rose by any other name. Sure sounds like a backup.
Like many, I use git for tracking changes to website content. One such site has product catalog and spec sheet PDFs for a medium-sized wholesaler. Running for ten years now, the "objects" dir of that repo is around 13 gigs. Current content on the site itself is nearly 2 gigs (a good chunk of that being zips of hi-res product images and collections of PDF spec sheets, along with the individual spec, catalog and product sheet PDFs, all of which have of course changed over time).
Sometimes we need to go back and find out when a section of the website verbiage, or some PDF or ZIP has changed. Thanks to diligent use of commit comments, that's easy to do as we're looking for a specific price, image or verbiage update. Just throw a commit hash to Git and restore the chosen file somewhere, then compare it to today's copy.
A much harder prospect is to find out when your system got infected. Exactly which files do you inspect for changes? That can end up being a lot of diffs.
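Both halves of that workflow — restoring one file from a known-good commit, and listing everything that changed since — come down to a couple of git commands. A sketch in a throwaway repo (file names and contents invented):

```shell
set -e
# Build a disposable repo so nothing here touches real data.
tmp=$(mktemp -d) && cd "$tmp"
git init -q demo && cd demo
git config user.email demo@example.com
git config user.name demo

echo "widget price: 10" > spec.txt
git add spec.txt
git commit -qm "initial spec sheet"
old=$(git rev-parse HEAD)        # the known-good commit hash

echo "widget price: 12" > spec.txt
git commit -qam "price update"

# Restore the old copy of one file next to today's version...
git show "$old:spec.txt" > spec-old.txt
restored=$(cat spec-old.txt)

# ...and list every file that changed between then and now.
changed=$(git diff --name-only "$old" HEAD)
echo "$changed"
```

`git diff --name-only` narrows "a lot of diff" down to a file list first; you only read full diffs for the files that look suspicious.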
If you keep more than one copy of your backup based on time, it's versioned. The idea of an individual file-based method, as opposed to monolithic copies of your entire dataset, is a good one in that restoration can be much faster, but it's still just a backup. Also, no matter which method is used, in practice it could prove problematic determining exactly when the "good" data turned "bad", i.e., from what point you'd need to restore.
It would also be interesting to see just how much overhead there is with this per-file method, whether or not the content is hosted on some shared network drive, or locally on individual workstations.
(It's assumed a shared drive would be on "stronger" hardware, but depending on what all the target machine is hosting, the load could still be pretty large. With the above-mentioned website on my dev machine, and not counting actual upload time, these days it still takes some seconds for Git to sync everything when pushing updated content to the website repo.)
* Not harping on Carbonite, just an example of one of the myriad cloud-based backup systems out there, and an early one for the WinDoze / consumer world. As others have posted, this concept has been around a while. According to their website, Carbonite keeps a single Daily version of files - not each and every change, presumably just the last change made that day - three weekly versions of the Daily, and two monthly versions of the Weekly. So, provided the infection happened within the past couple of months, you can recover...
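That daily/weekly/monthly scheme is a classic grandfather-father-son retention policy, and it's easy to model. A rough sketch (the function and its defaults are my own approximation of the scheme described above, not anyone's actual implementation):

```python
def retained(versions, dailies=1, weeklies=3, monthlies=2):
    """Model a daily/weekly/monthly retention scheme.

    `versions` is an iterable of datetime.date objects, one per stored
    version. Keeps the newest `dailies` versions outright, then the
    newest version in each of the most recent `weeklies` ISO weeks,
    then the newest in each of the most recent `monthlies` months.
    """
    versions = sorted(set(versions), reverse=True)
    keep = list(versions[:dailies])
    seen_weeks, seen_months = set(), set()
    for v in versions:
        wk = v.isocalendar()[:2]          # (ISO year, ISO week)
        if wk not in seen_weeks and len(seen_weeks) < weeklies:
            seen_weeks.add(wk)
            if v not in keep:
                keep.append(v)
        mo = (v.year, v.month)
        if mo not in seen_months and len(seen_months) < monthlies:
            seen_months.add(mo)
            if v not in keep:
                keep.append(v)
    return sorted(keep)
```

The point the footnote makes falls straight out of the model: with these defaults the oldest surviving version is only a month or two back, so an infection older than that has outlived every clean copy.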
Column Sixteen years ago, British mathematician Clive Humby came up with the aphorism "data is the new oil".
Rather than something that needed to be managed, Humby argued data could be prospected, mined, refined, productized, and on-sold – essentially the core activities of 21st century IT. Yet while data has become a source of endless bounty, its intrinsic value remains difficult to define.
That's a problem, because what cannot be valued cannot be insured. A decade ago, insurers started looking at offering policies to insure data against loss. But in the absence of any methodology for valuing that data, the idea quickly landed in the "too hard" basket.
If claims hold true, AMD has been targeted by the extortion group RansomHouse, which says it is sitting on a trove of data stolen from the processor designer following an alleged security breach earlier this year.
RansomHouse says it obtained the files from an intrusion into AMD's network on January 5, 2022, and that this isn't material from a previous leak of its intellectual property.
This relatively new crew also says it doesn't breach the security of systems itself, nor develop or use ransomware. Instead, it acts as a "mediator" between attackers and victims to ensure payment is made for purloined data.
Feature US and European cops, prosecutors, and NGOs recently convened a two-day workshop in The Hague to discuss how to respond to the growing scourge of ransomware.
"Only by working together with key law enforcement and prosecutorial partners in the EU can we effectively combat the threat that ransomware poses to our society," said US assistant attorney general Kenneth Polite, Jr, in a canned statement.
Earlier this month, at the annual RSA Conference, this same topic was on cybersecurity professionals' minds – and lips.
A state-sponsored Chinese threat actor has used ransomware as a distraction to help it conduct electronic espionage, according to security software vendor Secureworks.
The China-backed group, which Secureworks labels Bronze Starlight, has been active since mid-2021. It uses the HUI Loader to install ransomware such as LockFile, AtomSilo, Rook, Night Sky, and Pandora. But the firm asserts that the ransomware is probably just a distraction from the true intent: cyber espionage.
"The ransomware could distract incident responders from identifying the threat actors' true intent and reduce the likelihood of attributing the malicious activity to a government-sponsored Chinese threat group," the company argues.
In brief Google on Friday pledged to update its location history system so that visits to medical clinics and similarly sensitive places are automatically deleted.
In this post-Roe era of America, there is concern that cops and other law enforcement will demand the web giant hand over information about its users if they are suspected of breaking the law by seeking an abortion.
Google keeps a log of its users' whereabouts via its Location History functionality, and provides some controls to delete all or part of those records, or to switch it off. Now, seemingly in response to the above concerns and a certain US Supreme Court decision, we're told Google is going to auto-delete some entries.
Matt Ramberg is the vice president of information security at Sanmina, a sprawling electronics manufacturer with close to 60 facilities in 20 countries on six continents and some 35,000 employees spread across the world.
Like most enterprises, Sanmina, a big name in contract manufacturing, is also adapting to a new IT environment. The 42-year-old Fortune 500 company, with fiscal year 2021 revenue of more than $6.76 billion, was an early and enthusiastic adopter of the cloud, taking its first step into Google Cloud in 2009.
With manufacturing sites around the globe, it also is seeing its technology demands stretch out to the edge.
A cyberattack on a software company almost a week ago continues to ripple through labor and workforce agencies in a number of US states, cutting off people from such services as unemployment benefits and job-seeking programs.
Labor departments and related agencies in at least nine states have been impacted. According to a statement this week from the Louisiana Workforce Commission, Geographic Solutions (GSI) was forced to shut down state labor exchanges and unemployment claims systems, and as many as 40 states and Washington DC, all of which rely on GSI's services, could be affected.
In a statement to media organizations, GSI President Paul Toomey said the Palm Harbor, Florida-based company "identified anomalous activity on our network," and took its services offline. Toomey didn't say whether GSI was hit with ransomware or some other type of malware.
China's government has outlined its vision for digital services, expected behavior standards at China's big tech companies, and how China will put data to work everywhere – with president Xi Jinping putting his imprimatur to some of the policies.
Xi's remarks were made in his role as director of China’s Central Comprehensively Deepening Reforms Commission, which met earlier this week. The subsequent communiqué states that at the meeting Xi called for "financial technology platform enterprises to return to their core business" and "support platform enterprises in playing a bigger role in serving the real economy and smoothing positive interplay between domestic and international economic flows."
The remarks outline an attempt to balance Big Tech's desire to create disruptive financial products that challenge monopolies, against efforts to ensure that only licensed and regulated entities offer financial services.
Carnival Cruise Lines will cough up more than $6 million to end two separate lawsuits filed by 46 states in the US after sensitive, personal information on customers and employees was accessed in a string of cyberattacks.
A couple of years ago, as the coronavirus pandemic was taking hold, the Miami-based biz revealed intruders had not only encrypted some of its data but also downloaded a collection of names and addresses; Social Security info, driver's license, and passport numbers; and health and payment information of thousands of people in almost every American state.
It all started to go wrong more than a year prior, as the cruise line became aware of suspicious activity in May 2019. This apparently wasn't disclosed until 10 months later, in March 2020.
In brief A Japanese contractor working in the city of Amagasaki, near Osaka, reportedly mislaid a USB drive containing personal data on the metropolis's 460,000 residents.