* Posts by Remote Wipe

1 publicly visible post • joined 9 Jul 2018

Google weeps as its home state of California passes its own GDPR

Remote Wipe

Quid pro quo

You really need to read the actual bill: https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201720180AB375

A few things stand out to me:

1) The fines and penalties are weak:

(a) Consumers can bring civil class actions, but the statutory damages are only $100 - $750 (or ACTUAL damages) per consumer per incident. Proving actual damages is not easy, and in the case of a data breach it can be very difficult unless you can show the breach caused your identity to be stolen and used to harm you.

(b) In order to be liable for the $7,500 penalty, the business must INTENTIONALLY violate the law - good luck proving that... Negligence doesn't even factor into it!

2) This entire effort was a rush job to get a ballot measure off the ballot. If the measure had gone forward and Californians had voted it into law, legislators would have been hamstrung, because measures the public passes at the ballot box are very difficult for the legislature to change afterwards. The campaign at https://www.caprivacy.org/ was pushing for exactly such a ballot measure, and the giant Silly Valley companies pressured legislators to broker a deal: pass something fast and the measure gets withdrawn. The new law, which doesn't even take effect until 2020, has an explicit section, 1798.198(b), requiring the ballot measure be dropped.

3) Government, NPOs, political campaigns, and the like are completely exempt... Hmm, I wonder why that is. Essentially, if you are using this data to get re-elected or to manipulate voters, you are not governed by this shat law.

This bill has all sorts of shortcomings and does little to actually protect the consumer. In fact, if you opt out of data collection, the company providing the service can charge you for it based on what it could have made selling your data. There are also all sorts of carve-outs if you "pseudonymize" or de-identify data, and keep in mind that is exactly where the tech companies are heading. If they separate your "personal" data from behavioral data and make it difficult for anyone without deep knowledge of (and access to) the data schema to tie the elements back together, they are good to go and protected.

Sure, that helps in the event the data is breached, but I think the real issue is how we are being manipulated or influenced into decisions we would otherwise not make. The goal should be to be fully informed with full transparency, not fully duped and deceived. How can we trust that the results we see in a search are not manipulated if we don't know the algorithm used to rank them? Why can't we control what data is or isn't collected about our use of devices, applications, and the internet? How do I gain better control of my information and usage patterns without transparency?
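
To make the pseudonymization point concrete, here is a rough sketch in Python of how that separation typically works. This is purely illustrative: the field names, table layout, and keyed-hash approach are my own assumptions, not any company's actual pipeline. The idea is that behavioral records are keyed by an opaque token, while the token-to-person mapping and the secret key are held separately, so only someone with access to both stores (and the schema) can re-link them.

    import hmac
    import hashlib
    import secrets

    # Illustrative sketch: split a raw event into (a) a behavioral row keyed by an
    # opaque token and (b) a separately stored identity row mapping token -> person.
    # The behavioral data alone no longer directly identifies anyone, which is
    # roughly what the "pseudonymized"/de-identified carve-out rewards.

    SECRET_KEY = secrets.token_bytes(32)  # kept apart from the behavioral data store

    def pseudonym(user_id: str) -> str:
        """Derive a stable opaque token from a direct identifier using a keyed hash."""
        return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()

    def split_record(event: dict):
        """Separate direct identifiers from behavioral attributes."""
        token = pseudonym(event["email"])
        identity_row = {"token": token, "email": event["email"], "name": event["name"]}
        behavior_row = {"token": token, "page": event["page"],
                        "timestamp": event["timestamp"], "ad_clicked": event["ad_clicked"]}
        return identity_row, behavior_row

    # Example: the behavior_row can be analysed or sold as "de-identified" data;
    # re-linking it to a person requires the identity table plus the key -- exactly
    # the deep schema knowledge and access described above.
    raw = {"email": "jane@example.com", "name": "Jane Doe",
           "page": "/deals", "timestamp": "2018-07-09T12:00:00Z", "ad_clicked": True}
    identity_row, behavior_row = split_record(raw)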

What do you think about the actual law? What should be required of businesses that watch our every move so they can sell "marketing" information? What obligation do they have to protect their endpoints and the various other data-compromise threat vectors? Should they be required to use solutions like https://drivestrike.com (BitLocker FDE, biometric security)?

Should they be required to tell us how the data could be used to harm us?