California’s attorney general, Xavier Becerra, filed a petition in early November that marks an investigation by California into Facebook’s privacy practices. Meanwhile, privacy legislation is still in motion. In September, a statewide ballot measure was submitted proposing further changes to CCPA. The plan would give Californians new data rights, place new requirements on companies, create opt-out provisions and add restrictions on access to information about consumers under 16 years of age. Notably, the submitted plan includes the creation of a data protection agency for Californians with the power to enforce the law and issue new regulations.
I believe the decision to establish an enforcement agency will provide a much-needed infrastructure to ensure CCPA compliance. It is also a sign that lawmakers are grappling with the serious undertaking that CCPA enforcement will be.
Enforcement aside, these additional measures once again place businesses in somewhat of a state of limbo. The new ballot measure and the steady stream of regulatory changes leave companies on unsteady ground as they prepare for CCPA, which goes into effect on January 1, 2020.
I believe CCPA is in need of Marie Kondoing, but the train is still rolling toward the station, and all of these rules will supposedly be finalized and in effect in less than two months. It’s clear to me that most are unprepared — including those who are establishing the rules. After speaking with my colleagues, it’s become evident to me that there is a significant amount of uncertainty and a lot of confusion among businesses, not only on the adtech data side of the house, but among agencies, advertisers and those analyzing others’ data.
Because we know that CCPA today may not be CCPA tomorrow, we’re hanging on the edge of a void. I personally cannot wait for national legislation to finally come into effect. But until then, here are some questions about CCPA that I have heard and what they mean for businesses.
Companies should make certain that they’re giving notice to consumers about their data and provide consumers with the opportunity to tell them four things: “Yes, you can use my data,” “Show me what you have on me,” “What you have on me is not correct; I want to change it or amend it,” and “Don’t sell my data.”
We know that these disclosures and options need to be on our websites — visible and in plain sight — but I have yet to see any guidance on how big the disclosures need to be or what precisely they need to say.
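To make the four options concrete, here is a minimal sketch of how a company might model them as intake categories. All names here (`CcpaRequestType`, `route_request`, the workflow strings) are hypothetical illustrations, not anything prescribed by the law:

```python
from enum import Enum

class CcpaRequestType(Enum):
    """The four consumer choices described above, as a hypothetical intake model."""
    CONSENT = "yes, you can use my data"
    ACCESS = "show me what you have on me"
    CORRECTION = "what you have on me is not correct; change it"
    DO_NOT_SELL = "don't sell my data"

def route_request(request: CcpaRequestType) -> str:
    # Hypothetical routing: each request type triggers a different internal workflow.
    handlers = {
        CcpaRequestType.CONSENT: "record opt-in",
        CcpaRequestType.ACCESS: "compile data report",
        CcpaRequestType.CORRECTION: "open amendment ticket",
        CcpaRequestType.DO_NOT_SELL: "flag record as do-not-sell",
    }
    return handlers[request]
```

The point of separating the four types up front is that each carries different deadlines and validation burdens, so conflating them in a single inbox makes compliance harder to track.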
How To Validate That A Person Is Who They Say They Are
Because of GDPR, many companies already have mechanisms in place that allow removal requests. But I haven’t seen any guidance yet on how consumers are supposed to get in touch with you, or on how companies are to correctly validate that the person saying, “Destroy my data,” or “Don’t sell my data,” is who they say they are.
There’s no system set up for that just yet. So someone could hypothetically say, “I’m Jeff Greenfield, I’m a California resident, and you have data on me. Show me what you have.” How is a mere website owner supposed to know that this person actually is a California resident without asking for additional information? For that matter, how can the owner know the requester is who they claim to be? Will consumers need to provide a driver’s license to prove their identity?
That has not yet been made clear to the community at large.
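In the absence of official guidance, one approach a company might improvise is matching the identifiers a requester submits against what is already on file, and requiring some minimum number of matches before honoring the request. This is purely a sketch of that idea; the threshold and field names are assumptions, not anything the regulations specify:

```python
def verify_requester(submitted: dict, on_file: dict, required_matches: int = 3) -> bool:
    """Count how many submitted identifiers match the record on file.

    The match threshold is an assumption chosen for illustration; at the
    time of writing there was no official guidance on verification.
    """
    matches = sum(
        1
        for key, value in submitted.items()
        # Case-insensitive comparison so "Jeff" matches "jeff".
        if key in on_file and on_file[key].strip().lower() == value.strip().lower()
    )
    return matches >= required_matches
```

For example, a requester who can supply a matching name, ZIP code and email address would pass a three-match threshold, while a name alone would not. The obvious tension, noted above, is that verifying someone may require collecting more personal data than you started with.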
Provisions For Kids And Teens
We know that the new CCPA proposal would require marketers and companies to obtain opt-in permission before collecting data from consumers younger than 16. We also know that the new proposal would require a company to obtain parent or guardian permission to collect data from consumers who are younger than 13. But how are companies supposed to know who is the actual parent or guardian of a minor under 13? And how do we validate that the consumer is younger than 16?
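The two age thresholds described above imply a simple decision rule, even if the validation questions remain open. A minimal sketch, assuming age has somehow been validated (the function name and labels are illustrative):

```python
def required_consent(age: int) -> str:
    """Map a validated age to the consent level the CCPA proposal would require."""
    if age < 13:
        # Under 13: a parent or guardian must opt in on the child's behalf.
        return "parent-or-guardian opt-in"
    if age < 16:
        # 13 to 15: the consumer must opt in themselves before data collection.
        return "consumer opt-in"
    # 16 and older: default collection with an opt-out right.
    return "opt-out only"
```

The rule itself is trivial; the hard part, as the questions above suggest, is everything the function takes for granted: that the age is accurate and that the person granting parental consent really is the parent.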
There’s a lot of meandering because, even assuming validation systems will be put in place, consumers should have the option to say, “Don’t sell my data.” However, key definitions remain unclear. What, exactly, counts as “selling data”? Does it, however unlikely, include paying an analytics provider to analyze that data?
Yes, it seems like quite a mess, but history tells us that being privacy-compliant will become less complex and more straightforward — eventually. Legislation often first emerges at the state level, is then adopted by several states, and is then followed by national regulation. For example, several local lawmakers have enacted or proposed regulations around the use of facial recognition technology by law enforcement. Earlier this year California passed the Body Camera Accountability Act, which bans law enforcement from using facial recognition software for the next three years, and only recently was a new federal bill announced to restrict federal police’s use of facial recognition across the United States.
I believe state-by-state privacy legislation is a mere Band-Aid solution. The only viable approach for the advertising technology business is likely federally mandated protection at the national level, akin to the sweeping GDPR regulations. The technology economy is simply not set up to operate state by state. Until then, even Marie Kondo can’t help us make sense of this.
This article was originally published in Forbes.