Welcome to 2019, where almost every product, service, and website tracks every bit of data it can about us and creates a giant profile it can use to make inferences and predict our every move and desire. Even if a company doesn’t sell our data, many businesses today rely on the mass aggregation of data to inform their marketing decisions.
These systems can be frighteningly precise. We’ve all heard about the time Target figured out that a high school girl was pregnant and began marketing maternity items to her before her parents knew, creating some…awkward discussions at home. As a white man of Jewish heritage in his 30s who likes the San Francisco Giants and The Shawshank Redemption, maybe I’m more likely to buy a Toyota that gets at least 40 MPG, or less likely to drink spiced rum. Somebody out there probably knows.
One of the most interesting but unpredictable parts of the California Consumer Privacy Act is the portion of the law that requires companies to share not just the information collected about consumers, but also the inferences they’ve made based on that data. This requirement could potentially implicate companies’ marketing strategies or even their trade secrets.
Section 1798.140(o)(1)(K) specifies that the law’s definition of “personal information” includes:
Inferences drawn from any of the information identified in this subdivision to create a profile about a consumer reflecting the consumer’s preferences, characteristics, psychological trends, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes.
Inferences are defined as “the derivation of information, data, assumptions, or conclusions from facts, evidence, or another source of information or data.”
This raises an interesting question: when a consumer makes a verified consumer request, a business is obligated to disclose, among other things, the “specific pieces of personal information the business has collected about that consumer.” This seems like it would include the inferences mentioned above.
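To make that concrete, here is a minimal sketch, in Python, of what a response to a verified consumer request might look like if inferences are treated as “specific pieces of personal information.” Everything here is hypothetical: the CCPA does not prescribe a format, and the field names, labels, and values are invented purely for illustration.

```python
import json

# Hypothetical disclosure payload for a verified consumer request.
# The CCPA does not mandate any particular structure; this is only an
# illustration of raw data alongside the inferences derived from it.
disclosure = {
    "consumer_id": "c-102938",
    "collected_data": {
        "age_range": "30-39",
        "zip_code": "94110",
        "purchase_history": ["giants_cap", "shawshank_redemption_dvd"],
        "pages_viewed": ["/hybrid-sedans", "/fuel-economy-ratings"],
    },
    "inferences": [
        {
            "label": "likely_hybrid_car_buyer",
            "description": "Predicted interest in vehicles rated 40+ MPG",
            "derived_from": ["pages_viewed", "purchase_history"],
        },
        {
            "label": "low_spiced_rum_affinity",
            "description": "Predicted low likelihood of buying spiced rum",
            "derived_from": ["purchase_history", "age_range"],
        },
    ],
}

print(json.dumps(disclosure, indent=2))
```

The interesting part is the second half of that payload: the raw data alone is something consumers more or less expect, but the inferences section is where the marketing strategy starts to show.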
Disclosing inferences raises questions of both optics and practicality. Will companies object to this disclosure if it means giving up part of their marketing strategy, or maybe even trade secrets? Does the maker of a drug like Prozac or Zoloft want to admit that it crunched the numbers and inferred somebody was depressed before deciding to market to them? And for companies whose product is the inferences themselves, would disclosing both the raw data and the inferences threaten their proprietary algorithms and even their business model?
I don’t know. Maybe businesses will try to get around it. Maybe they will push hard to resurrect SB 753, which would have exempted the exchange of online identifiers often used in online advertising. I’m also not sure how companies would go about implementing this, as I’m sure the inferences are not written out in plain English in a database somewhere.
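To illustrate that implementation problem, here is another purely hypothetical sketch: internally, inferences often exist only as numeric propensity scores or opaque audience-segment IDs, so a business would need some kind of mapping layer before it could disclose them in plain English. The segment names, scores, and threshold below are all invented for illustration.

```python
# Hypothetical mapping from opaque internal segment IDs to
# human-readable descriptions a consumer could actually understand.
SEGMENT_DESCRIPTIONS = {
    "seg_4417": "Expecting a child in the next six months",
    "seg_0902": "In-market for a fuel-efficient vehicle",
    "seg_0130": "Likely to respond to antidepressant advertising",
}

def disclosable_inferences(consumer_scores, threshold=0.7):
    """Translate internal propensity scores into plain-English
    inference statements suitable for a consumer disclosure."""
    disclosures = []
    for segment_id, score in consumer_scores.items():
        if score >= threshold:
            description = SEGMENT_DESCRIPTIONS.get(
                segment_id, f"Undocumented segment {segment_id}"
            )
            disclosures.append(
                {"inference": description, "confidence": round(score, 2)}
            )
    return disclosures

# One consumer's internal scores (invented numbers).
scores = {"seg_4417": 0.12, "seg_0902": 0.86, "seg_0130": 0.74}
print(disclosable_inferences(scores))
```

Even a toy mapping like this shows the tension: writing out those descriptions is exactly the kind of disclosure that reveals what the business is actually modeling about people.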
It should be noted that the inclusion of inferences as a protected class of information under the CCPA is an additional level of protection that the CCPA has over the EU General Data Protection Regulation (GDPR). According to Sandra Wachter, a professor at Oxford University and research fellow at the Alan Turing Institute, the absence of “inferences” in the GDPR compromises its effectiveness as an actual protector of privacy.
“GDPR focuses too much on the input stage, meaning when data is collected, but not enough on how it is assessed. Once the data is lawfully obtained we have very little control or understanding of what inferences can be drawn,” Wachter says.
(Source: https://www.forbes.com/sites/soorajshah/2019/01/30/this-lawyer-believes-gdpr-is-failing-to-protect-you-heres-what-she-would-change/#72d7f70e6fc4)
According to Forbes (paraphrasing Wachter), “[the] major internet platforms are behind many of the highest profile examples where GDPR does not do enough, with Facebook potentially able to infer protected attributes such as sexual orientation and race, as well as political opinions and how likely a user is to attempt suicide, while third parties have used Facebook data to decide on the eligibility for loans and infer political stances on abortion.”
In other words, it makes no sense to protect information about, for example, somebody’s sexual orientation if the company is allowed to reach the same conclusion through inferences without the same protections. Seen that way, the inferences language in the CCPA makes a bit more sense. Now we just need to see how the law will actually be followed and enforced.
Disclaimer: This information is given for legal education only. This post is not legal advice and does not create an attorney-client relationship. Please contact an attorney for legal advice.
Daniel Zarchy is a civil litigator and privacy attorney in San Francisco, California. Daniel is also a Certified Information Privacy Professional (CIPP/US). The views and opinions expressed herein are solely those of the author and do not necessarily reflect the views or opinions of any other party or law firm.