HP doubles down with dedupe speed record

HP reckons it can claim the dedupe speed king crown, ingesting at 100TB/hour and spitting it out again at 40TB/hour, far faster than the dedupe dominator, Data Domain. The enhanced dedupe performance was announced at the HP Discover event in Las Vegas on Monday. HP gets to such giddy heights by combining its StoreOnce Catalyst …

COMMENTS

This topic is closed for new posts.
  1. unredeemed

    So BOOST and Catalyst are effectively the same in theory. But what dedupe levels did HP use to calculate 100TB/hr?

    Symantec does the same thing with their client-side dedupe, quoting that throughput assuming a 10-15X level of compression.

    Data Domain, on the other hand, is not client-direct (well, in small cases it is). How about vendors just quoting pure raw ingest numbers?
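(As a back-of-the-envelope aside: the sketch below shows how a quoted "effective" ingest figure relates to the physical data rate once a dedupe ratio is assumed. The helper function and the numbers are purely illustrative, not taken from HP's or Symantec's published figures.)

    # Illustrative only: relate a quoted logical ingest rate to the physical
    # data rate under an assumed dedupe ratio. Nothing here is vendor data.
    def physical_rate(quoted_logical_tb_per_hr, dedupe_ratio):
        """Physical data actually written per hour, given a quoted logical
        ingest rate and an assumed dedupe ratio (e.g. 10 means 10:1)."""
        return quoted_logical_tb_per_hr / dedupe_ratio

    # Example: a 100 TB/hr headline figure at assumed 10:1 and 15:1 ratios.
    for ratio in (10, 15):
        print(f"100 TB/hr logical at {ratio}:1 -> "
              f"{physical_rate(100, ratio):.1f} TB/hr physical")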

  2. P. Lee
    Coat

    Dedupe the answer to the wrong problem?

    Where are the billions spent on not pointlessly replicating data to start with?

    Have we given up on data normalisation and the correct use of caching?

    I know, not a real-world question, but I get antsy when someone says, "buy a server" and then "buy this to fix the problems from incorrect use of the server."

    1. Anonymous Coward

      Re: Dedupe the answer to the wrong problem?

      I don't think you understand what dedupe does. In a backup scenario, a lot of the data you back up is rightly duplicated: file headers for files of the same type, executables which appear on more than one server, emails which contain forwarded text, and so on. You aren't going to be able to prevent the duplication of these at source, and neither should you be able to.
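(A minimal sketch of the chunk-and-hash idea behind that point, purely illustrative: the fixed chunk size, SHA-256 hashing and in-memory store are assumptions for the example, not any particular vendor's design.)

    import hashlib

    CHUNK_SIZE = 4096  # fixed-size chunking for simplicity; real systems vary

    def ingest(stream, store):
        """Split a byte stream into chunks, add unseen chunks to the shared
        store, and return the list of chunk hashes (the backup's 'recipe')."""
        recipe = []
        for i in range(0, len(stream), CHUNK_SIZE):
            chunk = stream[i:i + CHUNK_SIZE]
            digest = hashlib.sha256(chunk).hexdigest()
            store.setdefault(digest, chunk)  # each unique chunk stored once
            recipe.append(digest)            # duplicates cost only a hash
        return recipe

    # Two "backups" sharing most of their content (say, the same executable
    # on two servers): the second one adds almost nothing to the store.
    shared = bytes(range(256)) * 64          # 16 KiB of identical content
    store = {}
    ingest(shared + b"server A config", store)
    after_a = len(store)
    ingest(shared + b"server B config", store)
    print(f"unique chunks after A: {after_a}, after A and B: {len(store)}")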

