
All backups wiped
To everyone who asks me when I'll give up duplicating to tape, I would hereby like to say "HAH!". And I'm not even going to mention offsite copies... oh... never mind.
Thousands of Australian websites have irretrievably lost their data and email files following a malicious security hack on Australian domain registrar and web host Distribute.IT. The company has been scrambling to save data and get customers back online or moved to safe servers since the security breach occurred over a week …
why no tape backups.. (or any offline mechanism)
seems like a beginner's mistake not taking your data somewhere disconnected
i assume they had offsite backup servers, in the event of natural disaster, and just left the links on and up... not the best, not the worst
at least all their clients can just restore from their backups to a new provider, since they did do regular backups of their code and databases, right, right..?
painful lessons learnt, i hope some people reading this are going into their hosted site's panel and getting copies of the db and what not, or urging their customers to pay up for a regular backup service :P
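if it helps anyone, here's roughly the kind of thing you can run from a cron job at home to keep your own copy; a minimal Python sketch, assuming you have SSH access to the host, mysqldump available there with credentials in ~/.my.cnf, and every hostname, path and db name below is made up:

```python
#!/usr/bin/env python3
"""Rough sketch: pull a dated copy of a hosted site's DB and files to a local disk.
Assumes SSH access to the host and mysqldump installed there (credentials in the
host's ~/.my.cnf); every name below (host, user, paths, db name) is a placeholder."""

import subprocess
from datetime import date
from pathlib import Path

HOST = "user@www.example-host.com"         # hypothetical SSH account at the host
REMOTE_DOCROOT = "/home/user/public_html"  # hypothetical web root on the server
DB_NAME = "mysite"                         # hypothetical database name
LOCAL_DIR = Path.home() / "site-backups" / date.today().isoformat()

def main() -> None:
    LOCAL_DIR.mkdir(parents=True, exist_ok=True)

    # Dump the database over SSH and write it straight to a local file.
    with open(LOCAL_DIR / f"{DB_NAME}.sql", "wb") as dump:
        subprocess.run(
            ["ssh", HOST, f"mysqldump --single-transaction {DB_NAME}"],
            stdout=dump, check=True,
        )

    # Mirror the document root (code, uploads, etc.) with rsync over SSH.
    subprocess.run(
        ["rsync", "-az", f"{HOST}:{REMOTE_DOCROOT}/", str(LOCAL_DIR / "files")],
        check=True,
    )

if __name__ == "__main__":
    main()
```

the whole point is just that a dated copy ends up on a disk the hosting company can't touch; if the host evaporates tomorrow, you still have yesterday.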
From another report of this mess... "I think I'm in shock ... I have lost everything .... I couldnt possibly replicate all those years of work again ... my whole lifes work is gone down the drain," wrote one.
How does someone entrust another party with their life’s work, with no copies of it themselves?
Epic, epic FAIL.
Where the effin' heck are their offsite (or at least offline) backups (I know, I know, huge files tons of data blah blah blah... still...).
Of course, maybe, just maybe, you the customer should have a backup of what you upload??
So, friends, do you STILL want to outsource your enterprise? How is that hopey-cloudy thing working out, eh?
We have a team of Malaysian students who meticulously copy all our data down on reams of paper in binary format, and then photocopy those pages, and store them in climate-controlled rooms on two separate sites, so if we are ever hacked and lose our data we can reconstruct it.
Of course, the team are currently 200-strong and about 3 years behind with the transcription process, but it's still a lot better than this newfangled fancy-dancy "cloud" rubbish.
Hum...
So, how many of these websites (especially the commercial ones) keep their primary copy of their data on the customer's local servers, where it is also fully backed up, including off-line copies?
Then this copy is used to regularly synchronise the online servers at the ISP, so that the ISP-provided machines and accounts are merely an easily replaced conduit for traffic (a rough push-sync sketch follows below).
Now, hands up everyone who simply relies on their ISP, however cheapo, reliably hosting their data for ever and a day with no loss whatsoever...
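For what it's worth, the "replaceable conduit" arrangement above can be as dumb as a one-way push from the local master; a rough Python sketch, with the host account and paths made up:

```python
#!/usr/bin/env python3
"""Rough sketch of the 'ISP box as a replaceable conduit' idea: the local server
holds the master copy and pushes it out, and nothing is ever pulled back from
the host.  The host account and paths are placeholders."""

import subprocess
from pathlib import Path

LOCAL_MASTER = Path("/srv/www/mysite")                  # hypothetical local primary copy
REMOTE = "deploy@www.example-isp.com:/var/www/mysite"   # hypothetical ISP account

def push() -> None:
    # One-way sync: --delete makes the remote an exact mirror of the local master.
    subprocess.run(
        ["rsync", "-az", "--delete", f"{LOCAL_MASTER}/", REMOTE],
        check=True,
    )

if __name__ == "__main__":
    push()
```

Since nothing ever flows back from the ISP box, a compromised host can't contaminate the primary copy, and recovering from this sort of mess is just a matter of pointing DNS at a new box and running the push again.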
Something's wrong here. The hacker appears to have highlighted a big hole in the hoster's backup policy. That is unforgivable. It's very hard to keep a server safe, which is why backups are all the more important.
The worst a hacker should be able to achieve is wiping the server and possibly poisoning the last backup or two. That's why you should always archive backups: then you can work your way back to a safe position and minimise the loss.
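Something like this rotation is enough to keep that ladder of old archives around (a rough Python sketch; the paths and retention numbers are made up):

```python
#!/usr/bin/env python3
"""Rough sketch of 'archive your backups' rather than overwriting one copy:
each run writes a new dated archive, and pruning keeps a ladder of older ones,
so if the most recent backups turn out to be poisoned you can step back to an
earlier, clean one.  Paths and retention numbers are placeholders."""

import tarfile
from datetime import date
from pathlib import Path

SOURCE = Path("/srv/www/mysite")        # hypothetical data to protect
ARCHIVE_DIR = Path("/backups/mysite")   # ideally a disk the server itself can't write to
KEEP_DAILY = 14                         # recent history, day by day
KEEP_MONTHLY = 12                       # plus one archive per month, further back

def backup_and_prune() -> None:
    ARCHIVE_DIR.mkdir(parents=True, exist_ok=True)
    today = date.today().isoformat()

    # Write a new, dated archive; never overwrite an old one.
    with tarfile.open(ARCHIVE_DIR / f"mysite-{today}.tar.gz", "w:gz") as tar:
        tar.add(str(SOURCE), arcname="mysite")

    # Prune: keep the newest KEEP_DAILY archives, plus the newest archive in each
    # of the last KEEP_MONTHLY months; delete the rest.
    archives = sorted(ARCHIVE_DIR.glob("mysite-*.tar.gz"), reverse=True)
    keep = set(archives[:KEEP_DAILY])
    seen_months = set()
    for a in archives:
        month = a.stem[len("mysite-"):][:7]   # "YYYY-MM" from the filename
        if month not in seen_months and len(seen_months) < KEEP_MONTHLY:
            seen_months.add(month)
            keep.add(a)
    for a in archives:
        if a not in keep:
            a.unlink()

if __name__ == "__main__":
    backup_and_prune()
```

Each run adds a new dated archive, so even a poisoned week of backups still leaves clean older copies sitting behind it.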
BOFH descriptions of "industry best practice" call for constant, off-site, hot duplication of data.
So if I were to want to do something like this, perhaps I would come at the "problem" bass ackwards. After compromising the main system, I'd poison only the backups over enough time to "get" them all. And only then take down the main system.
so the hosting company had no disconnected/offline backup or offsite tapes. many of the customers had no personal local copies of their data. the data that so many people's livelihoods entirely depended on. are you freaking kidding me?! we're going to see more and more of this with budget "cloud" services appearing all over the place.