Re: Seriously, author?
There's a little more detail here.
The proposal is essentially that at the point at which a user "logs in" (or equivalent) to a website (it doesn't apply to unauthenticated sessions) the browser is prompted to generate a private key which is used to secure session cookies. That private key is then destroyed when the user "logs out" (or equivalent) or when the browser quits. Since the private key isn't reused, there shouldn't be a privacy issue - and since the key is tied to a "log in", you're not talking about an anonymous user in any case, so that's all really moot.
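To make the lifecycle concrete, here's a minimal sketch of that per-login binding. Caveat: the actual proposal uses an asymmetric key (ideally hardware-backed), but the Python stdlib has no ECDSA, so an HMAC secret stands in for the key here - which also means the "server" holds a copy of it, a deviation from the real scheme where only a public key would leave the browser. All names are illustrative, not the proposal's API.

```python
import hashlib
import hmac
import secrets


class BrowserSession:
    """Sketch of a device-bound session key, generated at login and
    destroyed at logout. HMAC stands in for an asymmetric keypair."""

    def __init__(self):
        self._key = None

    def login(self) -> bytes:
        # Key is generated fresh at login, so it can't be used to
        # correlate sessions or track an unauthenticated user.
        self._key = secrets.token_bytes(32)
        # In the real scheme only a *public* key would be returned here.
        return self._key

    def prove(self, challenge: bytes) -> bytes:
        # Proof of possession: answer the server's challenge with the key.
        return hmac.new(self._key, challenge, hashlib.sha256).digest()

    def logout(self):
        # Destroying the key ends the binding; nothing persists.
        self._key = None
```

A stolen session cookie is then useless on its own, because the thief can't answer the server's challenge:

```python
browser = BrowserSession()
server_copy = browser.login()  # stand-in for registering a public key
challenge = secrets.token_bytes(16)
proof = browser.prove(challenge)
expected = hmac.new(server_copy, challenge, hashlib.sha256).digest()
assert hmac.compare_digest(proof, expected)
```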
There's nothing that inherently requires a TPM - it would clearly have some potential benefit simply using a software key vault of some kind, but with greater risk of malware exfiltrating the keys and transparently continuing the session on another device. Though TPMs are not quite as inflexible as might be supposed by the time the OS has built an environment around them - assuming you have one to hand.
I think the big problem with the Google proposal is that it's essentially a hack: it requires the browser to periodically refresh its credentials (to prove it still has the private key) by making an HTTP request to a specific URL associated with the application. Contrast that with, for example, Token Binding, a proposed extension to TLS that is transparent in the sense that it doesn't depend on upper-layer application behaviour. The somewhat questionable justification for Google's approach is that in typical web application infrastructures, the security context (the TLS connection) has been stripped by some border device and is unavailable to the web application code. And justifying your security hack on the basis that you junked all the security data you had at the earliest possible moment is very typical of modern practice, but I can't help feeling a better solution might be available.
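For what it's worth, the refresh step the proposal leans on amounts to something like the following server-side sketch: challenge the browser at a dedicated URL, and only rotate the short-lived cookie if the key proof checks out. Again, HMAC stands in for the asymmetric signature and every name here is illustrative, not Google's actual endpoint design.

```python
import hashlib
import hmac
import secrets
import time


def refresh(session_store: dict, session_id: str, proof_fn):
    """Sketch of the periodic credential refresh: a valid proof of key
    possession buys a new short-lived cookie; a stolen cookie alone
    ages out because its thief can't answer the challenge."""
    record = session_store[session_id]
    challenge = secrets.token_bytes(16)
    proof = proof_fn(challenge)  # the browser signs with its session key
    expected = hmac.new(record["key"], challenge, hashlib.sha256).digest()
    if not hmac.compare_digest(proof, expected):
        raise PermissionError("key proof failed; session not refreshed")
    record["cookie"] = secrets.token_urlsafe(16)
    record["expires"] = time.time() + 600  # short lifetime limits exfiltrated cookies
    return record["cookie"]
```

Note this is exactly the application-layer behaviour I'm objecting to: the proof rides over an ordinary HTTP exchange the application has to implement, rather than living in the transport where the key material arguably belongs.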