Protect the Privates!
Wouldn't the path forward be to hash the data, then match on the hashes? The learning still takes place and the system still gets very good at matching, but data privacy is preserved. Or am I missing something here?
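For what it's worth, here's a minimal sketch of what that could look like. One caveat the question glosses over: a plain unsalted hash of low-entropy data (emails, phone numbers) can be reversed by brute force, so a keyed hash with a shared secret is the usual fix. All names and the key below are my own illustration, not anything from the article:

```python
import hmac
import hashlib

# Hypothetical shared secret agreed on by the matching parties.
# Without a key, anyone can hash every plausible email/phone number
# and recover the originals by lookup.
SECRET_KEY = b"shared-secret-for-illustration"

def blind(record: str) -> str:
    """Normalize a record and return its keyed hash."""
    normalized = record.strip().lower()
    return hmac.new(SECRET_KEY, normalized.encode(), hashlib.sha256).hexdigest()

# Each party hashes its own records locally...
party_a = {blind(r) for r in ["alice@example.com", "bob@example.com"]}
party_b = {blind(r) for r in ["Bob@Example.com ", "carol@example.com"]}

# ...and only the hashes are compared, never the raw data.
overlap = party_a & party_b
print(len(overlap))  # bob matches despite the case/whitespace differences
```

The normalization step matters: hashing destroys similarity, so exact matching only works if both sides canonicalize the data the same way first, and anything fuzzier (typos, nicknames) is exactly what hashing breaks.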