Literally a lorry load of data. Mind. Blown.
IBM and Google have each announced a competitor to Amazon Web Services’ “Snowball”. Amazon’s Snowball is a rugged box full of disk that it ships to users so they can fill it with tens of terabytes of data. Once full, the Snowball rolls all the way to an AWS bit barn, where its contents are transferred into Amazon’s cloud …
> I want pics
Interesting point in their FAQ:
> The Snowmobile comes with a removable connector cabinet that needs to be mounted on one of your data center racks where it can be connected directly to your high-speed network backbone. The connector rack provides multiple 40Gb/s interfaces that can transfer up to 1 Tb/s in aggregate.
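Worth noting what that 1 Tb/s actually buys you. A back-of-the-envelope sketch in Python, assuming the 100 PB capacity AWS advertises for Snowmobile (that figure isn't in the FAQ excerpt above) and a perfectly sustained link with no overhead:

```python
# How long does it take to fill a Snowmobile over the 1 Tb/s aggregate
# link the FAQ quotes? 100 PB is AWS's advertised Snowmobile capacity;
# everything else is unit conversion.
CAPACITY_PB = 100
LINK_TBPS = 1.0  # aggregate, i.e. roughly 25 of the 40 Gb/s interfaces

bits = CAPACITY_PB * 10**15 * 8          # 100 PB expressed in bits
seconds = bits / (LINK_TBPS * 10**12)    # at 1 Tb/s sustained
print(f"{seconds / 86400:.1f} days")     # -> 9.3 days, assuming no overhead
```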
Back in the day we finished a Windows PlateSpin migration with ~20 thumbdrives and one (large at the time) 240 GB portable HDD. They were shipped in a single box via UPS from Atlanta to Rochester, because the white collars didn't want to pay the extra cash for a decent network link.
When one of the folks went down to the DC door to pick them up and came back to our war room, we all heard him drop something as he crossed the doorway. A good ~45 seconds of silence followed, until he picked up the power supply for the arcane external disk enclosure and said "Now I'm worth 16 million dollars, eh?", as that was the price of the migration/hosting contract.
The cheapest 1TB drive I found on Amazon was $34.99
For 120TB, that's roughly $4,200.
Or, for $474.50, I could get the Snowball delivered to a warehouse on a trading estate a couple of hundred miles from where I live, stick it in a van, drive off and never be seen again, never mind ten days later.
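For what it's worth, the arithmetic above checks out. A quick sanity check using the thread's own figures (drive and service prices obviously drift over time):

```python
# The comment's two price points, side by side. The $34.99-per-1TB-drive
# and $474.50 figures are the commenters' own, not current pricing.
tb = 120
diy = tb * 34.99          # 120 consumer 1 TB drives
snowball_fee = 474.50     # quoted Snowball service fee
print(f"DIY drives: ${diy:,.2f}")        # -> DIY drives: $4,198.80
print(f"Snowball:   ${snowball_fee:,.2f}")
```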
I haven't checked out anything other than Snowball, but the biggest issue with it was that the protocol used to put data into a Snowball (CIFS/NFS) is different from the one used to access it in the cloud once it's uploaded (HTTP). That adds significant complexity to adapting the business processes that use the data to its new location. Yes, I realize this has improved in more recent versions, but there's usually still a bit of manual fiddling for a lot of workflows.

For the use case of capturing a bunch of data for a fixed period of time, importing it and doing some processing on it, the model works. For ongoing workflows where data is read and written more often than every few days, it's pretty complicated. Most importantly: where does new data go while the Snowball is in transit, and how do you sync up any changes made during that time?
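To make the mismatch concrete, a minimal sketch of the two halves of that pipeline. The mount point, file paths, and bucket name are all hypothetical; only the shape of the split (file I/O in, S3 object API out) is the point:

```python
# The protocol split the comment describes: data goes INTO a Snowball
# as files over NFS/CIFS, but comes back OUT of the cloud as S3 objects
# over HTTP. Paths and bucket names below are made up for illustration.
import shutil
import boto3  # AWS SDK for Python

# Before shipping: the Snowball appears as an ordinary network share,
# so the ingest side of the pipeline is plain file I/O.
shutil.copy("/data/readings/2016-11-30.csv",
            "/mnt/snowball/readings/2016-11-30.csv")

# After AWS imports the device: the same bytes are S3 objects, so every
# consumer has to speak the HTTP object API instead of opening files.
s3 = boto3.client("s3")
obj = s3.get_object(Bucket="imported-snowball-data",   # hypothetical bucket
                    Key="readings/2016-11-30.csv")
rows = obj["Body"].read().decode("utf-8")
```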
In the old days, USB sticks with a few dozen GB of sensitive data on them tended to get lost on trains, in cabs and such.
I think the fun really starts when a Snowball device with unencrypted mass data in the 100 TB range disappears in the street.
"Boss, I really don't know how this could happen, I just wanted to take some files with me to work on at home later today..."