'First ever' SHA-1 hash collision calculated. All it took were five clever brains... and 6,610 years of processor time

Alan W. Rateliff, II
Paris Hilton

Re: "Why does the size have to be identical? "

Just a few thoughts.

PDFs are compressed containers, right? That being the case, you can change quite a lot of the payload; all you have to do is ensure the compressed output of your manipulated file ends up with the same size and hash. The input to the hash function is already-compressed data, which effectively obfuscates the source document and gives you room to manipulate it.
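To see why a collision in a file's compressed prefix is so dangerous: SHA-1 is a Merkle–Damgård hash, so its digest depends only on its internal state. Once two different prefixes (like the SHAttered PDF headers) drive the hash into the same state, any common suffix appended to both keeps the digests identical. A minimal sketch of the mechanism, simulating the "same internal state" condition with hashlib's `copy()` since the actual colliding prefixes aren't reproduced here:

```python
import hashlib

# If two prefixes collide (same SHA-1 state, same length), any suffix
# appended to both preserves the collision. We simulate the shared
# state with .copy(): both branches snapshot one state, receive the
# same attacker-chosen suffix, and must produce identical digests.

prefix = b"stand-in-for-one-of-two-colliding-pdf-prefixes"
suffix = b"...rest of the document, freely chosen by the attacker"

state = hashlib.sha1(prefix)
branch_a = state.copy()   # stands in for colliding prefix A
branch_b = state.copy()   # stands in for colliding prefix B

branch_a.update(suffix)
branch_b.update(suffix)

assert branch_a.hexdigest() == branch_b.hexdigest()
print(branch_a.hexdigest())
```

This is exactly how the SHAttered PDFs work: the colliding blocks sit near the front, and the visibly different content lives in a shared suffix.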

Of course, that is a narrow application, but it still seems practical. Going a similar route, look at any compressed file you might download: .zip, .7z, .tgz, etc. For full confidence, a check should cover all the parts, not just the download as a whole: the hash of the archive, the hash of all parts combined, and the hash of each individual file, hunk, etc. Doing so gets heavy and resource-expensive rather quickly.
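A minimal sketch of that "hash everything" approach for a zip archive — the container as a whole plus each member inside it (using SHA-256 rather than the broken SHA-1; the in-memory archive and file names are just for illustration):

```python
import hashlib
import io
import zipfile

def hash_archive_and_members(zip_bytes: bytes) -> dict:
    """Hash a zip archive as a whole AND each member file inside it.

    A single hash over the download only covers the container; hashing
    every member as well catches a swapped file. As noted above, this
    gets expensive as the archive grows.
    """
    digests = {"<archive>": hashlib.sha256(zip_bytes).hexdigest()}
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        for name in zf.namelist():
            digests[name] = hashlib.sha256(zf.read(name)).hexdigest()
    return digests

# Build a tiny in-memory archive for demonstration.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("readme.txt", "hello")
    zf.writestr("data.bin", "payload")

for name, digest in hash_archive_and_members(buf.getvalue()).items():
    print(name, digest)
```

The same idea applies to any zipped container format, Open Document files included.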

More broadly, think of the Open Document formats which are zipped containers, among others.

But what of TLS sessions? Consider that a spook or a nefarious agency (arguably one and the same) has a packet capture of a TLS-encrypted session signed with a SHA-1 certificate. We already know sessions signed by a certificate generated with poor entropy (the debacle from a few years back) can be undone. $130k is nothing for such agencies to throw at this, maybe even enough to bring the GPU calculation requirements down to something more reasonable than a year (how much was paid for the San Bernardino iPhone hack?). Whatever was in that session is at risk, be it an email, web search, forum posting, or penis-enhancement purchase confirmation page.
