A typical upbringing probably drilled into our heads that we should get permission before using someone else’s things. “Don’t touch dad’s tools, don’t play with your sister’s toys.” “Ask permission.”

As simple and fundamental as that lesson is to civil society, it seems to have gotten lost when it comes to information about us, particularly in the digital world. That you would have any rights to data about you has never been the first conclusion of business. Not in banking, not in healthcare and certainly not on the internet. In the first two instances, laws have been worked and reworked to give us a modicum of ownership and privacy. But each new industry that discovers the value of personal information also spurs the need for new laws to reaffirm what seems morally straightforward.

For media companies, it’s become a best practice to do more with customer data than the customer intended when providing the information. It started with reselling subscriber data through list rentals and has grown increasingly sophisticated. What began as keeping data on users to improve their browsing experience and understand their content preferences quickly grew into using an individual’s information to tailor the ads that publishers display, something most users never intended when they handed over their information. Browser cookies were originally meant to let sites keep some state information on users to improve their experience, but cookies are also a great way to record preferences and habits that can be used to hone marketing messages. All of these practices are now under scrutiny and will be regulated to some extent. As lawmakers define an individual’s right to know about and control how their data is used, media companies will need to rework their business practices to stay within the new rules. For those who have played fast and loose with customer data, their days might be numbered.
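
To make the mechanics concrete, here is a minimal sketch of how cookies carry state between visits, using Python’s standard library (the cookie names are hypothetical):

```python
from http.cookies import SimpleCookie

# A site issues cookies to remember state between visits. The names
# here are invented; the mechanism is the standard Set-Cookie header.
cookie = SimpleCookie()
cookie["session_id"] = "abc123"            # benign: keeps you logged in
cookie["content_prefs"] = "tech,media"     # personalization...
cookie["recent_articles"] = "gdpr,adtech"  # ...but also a browsing trail
print(cookie.output())  # the Set-Cookie headers the server would send

# On the next request, the server parses the Cookie header it gets back.
returned = SimpleCookie()
returned.load("session_id=abc123; content_prefs=tech,media")
print(returned["content_prefs"].value)  # 'tech,media'
```

The same few lines that keep a user logged in can just as easily accumulate the preference and habit data that marketers prize; the technology itself doesn’t distinguish between the two uses.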

Look to the States

Data privacy in the digital realm will be argued, defined and defended to varying degrees over the next few years, but the outlook for those who make their living selling access to personal information doesn’t look good. And yet, if you’re looking to Congress to codify a set of rules for the US, don’t. From the late Ted Stevens’ blathering description of the internet as a set of tubes back in 2006 (he was concerned that net neutrality would mean his important messages would be hopelessly blocked by someone streaming Sunset Boulevard for the 100th time and clogging those ‘tubes’) to a smug Orrin Hatch asking Mark Zuckerberg in 2018 how Facebook could possibly make money when it gives its services away (the answer came after a pregnant pause: “Senator, we run ads,” accompanied by an even smugger Zuckerberg smirk), Congress is neither equipped nor inclined to do anything meaningful.

Instead, it’ll be the EU and other countries, along with states like California and Massachusetts where lawmakers are surrounded by the technology sector, that will ascribe ownership of data to individuals and prescribe the rights that come with that ownership. It’s been this way for more than a decade now, with no new meaningful laws coming from Congress since 2002.

Data privacy policies are a good guide: there’s no federal requirement for them. Instead, it’s the California Online Privacy Protection Act (CalOPPA) that requires the policies. The 2004 law applies to any commercial website collecting personally identifiable information (PII) on a citizen of California. A 2012 update extended it to specifically cover downloaded apps as well as websites. Purveyors of apps without a policy can be fined as much as $2,500 per download.

The FTC is the only broad-scope federal agency to get in on the act. It can go after any company that doesn’t comply with its own stated privacy policy, because that’s considered a deceptive practice, something the FTC is required to police.

In 2014, the California Attorney General had to issue guidance saying the law required actually following stated privacy policies. The law wasn’t explicit on that point, and companies tried to claim there was no requirement that they comply with their own policies. The clarification gave the FTC the footing it needed to police non-compliance as a deceptive practice.

The Law so Far

The European Union is one of the first entities to tackle regulating the mass of behavioral data that Google and Facebook (and by extension every other media company) are gathering. The EU’s General Data Protection Regulation, or GDPR, is now fully implemented; similar rules exist in other countries to varying degrees and are now law in California, Massachusetts and other US jurisdictions, with more to follow.

The impact of these rules will be far reaching, and the EU has already begun handing out warnings and fines. Google received the largest fine so far, some €50 million from French authorities, who say Google failed to properly show users how it collects data across services such as search, maps and YouTube. But Google isn’t alone. The EU has also gone after some of the sloppier violators and sleazier data aggregators.

The tenets of these regulations are similar. Consumers have specific, enumerated rights when it comes to third parties storing or processing information about them:

  • The right to know what data is being collected. Individuals can ask companies to disclose exactly what data the company has on them.
  • The right to request that collected data be deleted.
  • The right to request that companies not sell an individual’s data (in effect, the right to opt out of allowing access to the data in the first place; depending on the jurisdiction, companies may instead need an express opt-in before selling).
  • The right to be forgotten.
  • The right to sue if your data is stolen while kept in a non-encrypted format.

These rights seem straightforward enough, but almost no publisher’s system was built to make it easy for consumers to view the data collected about them, let alone delete it or restrict its use.
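
What would supporting those rights look like in code? A minimal sketch, assuming a simple keyed data store (all names are hypothetical, and a real publisher’s data is scattered across many systems):

```python
# Sketch of the consumer-rights interface the new laws imply.
class ConsumerDataStore:
    def __init__(self):
        self._records = {}     # consumer_id -> data collected on them
        self._no_sale = set()  # consumers who opted out of data sales

    def disclose(self, consumer_id):
        """Right to know: return everything held on this consumer."""
        return dict(self._records.get(consumer_id, {}))

    def delete(self, consumer_id):
        """Right to deletion / right to be forgotten."""
        self._records.pop(consumer_id, None)

    def opt_out_of_sale(self, consumer_id):
        """Right to say 'do not sell my data.'"""
        self._no_sale.add(consumer_id)

    def may_sell(self, consumer_id):
        """Gate every downstream sale or share on the opt-out flag."""
        return consumer_id not in self._no_sale
```

The hard part isn’t the interface; it’s that the data behind it typically lives in a CRM, an email platform, an ad server and a data warehouse, none of which were designed to answer these requests.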

Google Maps describes data use. Source: Google Maps

The GDPR rules also dictate that collected data should not be used in ways not obviously intended at the time the user provided it (or let it be collected). This is where apps like Google Maps got in trouble with French authorities, since Google uses your location information for more than just showing you directions to a destination and doesn’t tell you about the other uses.

Google shouldn’t, for example, use your location data to sell ads to retailers with stores you frequent. You might like a lunch spot that’s near a Walmart. If Google helps Walmart market to you based on that, it certainly wasn’t part of your intention when you used Google Maps to find that lunch spot. EU authorities want that use of your data disclosed when you use Maps, and more likely they want Google to stop the practice altogether unless you explicitly opt in to that use of your location information.

Use Data Only for Its Intended Purpose

In the B2B media world, if a customer signs up for a webinar and it’s made clear that the webinar’s sponsors will get attendee business card data, then the company collecting the data should be in the clear, as long as the user was given the option not to have their data shared. If attendees give permission to share their data, that permission extends only to that one webinar and its enumerated sponsors. New opt-in/opt-out permission should be collected for each event (like a sponsored webinar) that the data-collecting company produces.
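
One way to model that event-scoped permission is a consent record that names the event and its sponsors explicitly. A rough sketch (the field names are hypothetical):

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Consent is scoped to one event and its enumerated sponsors;
# it never carries over to the next webinar.
@dataclass(frozen=True)
class ConsentRecord:
    attendee_email: str
    event_id: str               # one specific sponsored webinar
    sponsors: tuple             # the sponsors named at signup
    shared_with_sponsors: bool  # the attendee's opt-in/opt-out choice
    recorded_at: datetime

consent = ConsentRecord(
    attendee_email="jane@example.com",
    event_id="webinar-2019-q3-cloud",
    sponsors=("Sponsor A", "Sponsor B"),
    shared_with_sponsors=True,
    recorded_at=datetime.now(timezone.utc),
)
# The next event needs a brand-new ConsentRecord; this one doesn't apply.
```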

The content-creating (data-collecting/controlling) company itself does have broader leeway with data use, and can, for example, continue to market similar webinars and other products to each registrant, as long as it’s easy for the registrant to opt out. This is the same opt-out that’s been required under anti-spam laws.

GDPR treats data controllers differently than it treats data processors. In practice, most media companies are both controller and processor. Controllers must meet requirements for storing the data, including timely deletion, anonymization or encryption whenever possible and appropriate. Data controllers must also designate a data protection officer within the company who will be responsible for following GDPR rules.

As a data processor, a company is required to keep track of how data was processed and what the outcome of that processing was. Google and Facebook will have a tough time with this, since each does a lot of processing with your PII. Media companies that use data for ad targeting could have a similarly tough time. For B2B companies, the GDPR rules say that processing can only be done when “necessary for the purposes of the legitimate interests pursued by the controller or by a third party,” as long as that processing doesn’t violate the rights of individuals. The language means that companies can use data for legitimate marketing purposes but cannot process it for unrelated purposes, like helping Walmart know who drives by the store regularly. The rule has been interpreted as “do what’s needed to get the right marketing message to the right audience, but do no more.”
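
That record-keeping obligation lends itself to an append-only audit log: one entry per processing run, recording the purpose relied on and the outcome. A minimal sketch (the field names and log format are assumptions, not anything GDPR prescribes):

```python
import json
from datetime import datetime, timezone

def log_processing(log_path, purpose, records_processed, outcome):
    """Append one audit entry per processing run."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "purpose": purpose,  # the 'legitimate interest' being relied on
        "records_processed": records_processed,
        "outcome": outcome,
    }
    with open(log_path, "a") as f:
        f.write(json.dumps(entry) + "\n")

log_processing(
    "processing_audit.jsonl",
    purpose="segment webinar registrants for follow-up marketing",
    records_processed=1842,
    outcome="3 audience segments created; no data left the controller",
)
```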

The Limitations of Permission

Under the new laws, the assumed right to use a consumer’s data stops with the company providing the content and the sponsors enumerated at the time of collection. The sponsors cannot turn around and sell the collected data without informing the user that they’re doing it, nor without getting a new opt-in or at least offering an opt-out, depending on the jurisdiction. Likewise, data cannot be sold or given to a company residing in a country not governed by GDPR (including the US) unless that company specifically agrees to abide by GDPR. If enforced rigorously, these requirements could put some companies out of business and drive up the cost of leads, particularly in the B2B space. Whether that’s an intended consequence will only become clear as the EU and the adopting states in the US show how they enforce the law. The EU has gone after the more flagrant violators, but as of this writing has issued only 91 fines. The California law doesn’t go into effect until January 1, 2020, and the state’s attorney general has said he’ll give companies another six months to comply.

In the time since the law passed, data privacy advocates have tried to toughen California’s law. Consumers have the right to sue over data breaches, but they don’t have the right to sue if they believe a company is using their data in ways they haven’t approved. Giving consumers the right to sue for data misuse was proposed and defeated, which means enforcement lies solely with the state’s attorney general. If that enforcement isn’t as rigorous as privacy advocates favor, they’ll be back with similar calls to let consumers litigate.

In the B2C market, it’s clear that legislators are after the likes of Google and Facebook, and the California attorney general’s office will be watching these companies closely to see how they comply by 2020. The B2B market will also be closely watched, but it is unclear if enforcement in this market will be a priority. Nonetheless, publicly traded companies that sell both consumer and business customer data will be scrambling to meet the 2020 deadline as shareholders will undoubtedly see limited compliance efforts as a risk to the stock value.

In other states, consumers do or will have the right to sue. The Illinois Biometric Information Privacy Act of 2008 (BIPA) is specifically intended to let consumers go after data aggregators, including employers who sell their biometric data to third parties without appropriate opt-ins; this covers DNA, fingerprints and similar data. Massachusetts’ Bill SD.341, “an Act relative to consumer data privacy,” builds on both the California and Illinois laws and allows consumers to sue in certain instances; the law has provisions for both biometric and personally identifiable information.

The Massachusetts law also provides guidance on fines. The law states:

“Whenever the attorney general has reason to believe that any business, service provider, or other person is in violation of this chapter, and that proceedings would be in the public interest, the attorney general may bring an action in the name of the commonwealth against such person to restrain such violation by temporary restraining order or preliminary or permanent injunction. In addition, the attorney general, in an action in the name of the commonwealth, may seek a civil penalty of not more than $2,500 for each violation or $7,500 for each intentional violation.”

In this case, violations are tallied on a “per consumer” basis, so if millions of consumers are involved, companies of any size could face business-altering fines.
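
The arithmetic is sobering. A quick back-of-the-envelope calculation, assuming a hypothetical incident touching one million consumers:

```python
# Massachusetts-style per-consumer penalties (SD.341 figures).
PER_VIOLATION = 2_500    # dollars per violation
PER_INTENTIONAL = 7_500  # dollars per intentional violation

consumers_affected = 1_000_000  # hypothetical incident

print(f"${consumers_affected * PER_VIOLATION:,}")    # $2,500,000,000
print(f"${consumers_affected * PER_INTENTIONAL:,}")  # $7,500,000,000
```

Even if a court never levied the full amount, the statutory exposure alone is enough to change behavior.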

Wait, THAT’S What You Meant?!?

If taken far enough, these laws could affect many ways of collecting data that aren’t typically considered PII. For instance, ad retargeting has become both commonplace and a bit creepy. Most people have experienced visiting a product website and then seeing ads for that product presented, well... everywhere. Your browsing history is collected and used on other sites to determine the ads you see. Retargeting is a huge business in the digital world, so at some point the question will come: is the fact that you visited a product page enough to imply consent to use your browsing data to show you ads for that product elsewhere?
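
Stripped to its essentials, the mechanism looks something like the toy model below: an ad network’s pixel records a visit on the product site, then reads that record back on every other site it serves (the names are invented; real networks trade in opaque IDs rather than product names):

```python
# Toy model of ad retargeting across sites.
ad_network_cookies = {}  # the browser's cookie jar for the ad network

def visit_product_page(browser_id, product):
    """The ad network's pixel on the product page records the visit."""
    ad_network_cookies.setdefault(browser_id, []).append(product)

def choose_ad(browser_id):
    """On an unrelated site, the same network reads its cookie back."""
    seen = ad_network_cookies.get(browser_id, [])
    return f"Ad for {seen[-1]}" if seen else "Generic ad"

visit_product_page("browser-42", "hiking boots")
print(choose_ad("browser-42"))  # 'Ad for hiking boots' ... everywhere
```

Nothing in that exchange ever asked the user whether a product-page visit could follow them around the web.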

Ad retargeting on The New York Times website. Source: New York Times

If lawmakers decide consent is needed before programmatic algorithms can use browsing data to serve ads, how can that be done in a reasonable way, and how would consumers know who has the data so they can manage it? Even the simple notion that consumers should manage their own data is fraught, given the sheer volume of data they’d need to consider.

In the EU, these issues were first addressed in 2002 by the ePrivacy Directive (in the EU, directives are requirements for member states to enact a particular thing, whereas EU regulations are immediately enforceable without further action by member states). The result of the 2002 directive was the requirement for cookie banners on websites; because it is a directive, member states took a while to coalesce around the cookie banner solution. From the outset, EU lawmakers intended to require site and app makers to notify users and gain consent before storing any data on an individual’s device.

Balancing Business Models and Individual Privacy

The EU has been working on a law to supersede the 2002 directive for years now. The original notion was that both GDPR and the new ePrivacy law would be issued at the same time, but member states can’t agree on the lengths to which the law should go. Some favor notification and consent for virtually any tracking mechanism, including the use of smart pixels that allow companies to know when a particular part of a web page or email has been seen—this is often the basis for ad retargeting. Requiring opt-ins for use of such technologies could have a substantial effect on some business models, something some member states are looking to avoid. Opt-out mechanisms would have less effect on business, but also put the onus on individuals to find and use opt-out capabilities for non-invasive tracking technologies. That’s something other member states find to be a bridge too far.

Current speculation is that a compromise is inching closer with a new draft, but a new ePrivacy rule won’t take effect before 2021. In the interim, member states could act unilaterally, or states like California or Massachusetts could address the matter sooner. In all cases, the regulating bodies are well aware of the business implications of their actions and are working to balance business needs with an individual’s rights.

An example of a cookie policy pop-up on Ikea’s homepage. Source: Ikea

One thing that’s certain is that most lawmakers view the existing “cookie rules” as overdone. The constant need to grant consent is of limited use and is generally annoying. Future rules will likely be aimed at browser and mobile device makers, with the goal of giving users a straightforward mechanism to set their own policy for cookies and other tracking technologies. Whitelists are the mechanism currently prescribed in the EU’s latest draft, and the rules will likely extend to apps that track user behavior. Finally, the new rules will likely allow cookies without consent as long as they don’t collect PII but instead let web-based apps track user state to improve the user’s experience on the site. Apps have this ability inherently; users grant apps the right to store data locally at installation.
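
In practice, a browser-side whitelist could be as simple as the sketch below: the user sets one policy, and the browser applies it everywhere rather than every site asking via a banner (the function and flag names are hypothetical; the actual mechanism is still being drafted):

```python
# Sketch of a user-controlled cookie whitelist in the browser.
user_whitelist = {"example-news.com", "example-bank.com"}

def allow_cookie(site, state_only):
    """State-only cookies (session, preferences; no PII) pass through;
    anything else requires the site to be on the user's whitelist."""
    return state_only or site in user_whitelist

print(allow_cookie("example-news.com", state_only=False))  # True
print(allow_cookie("tracker.example", state_only=False))   # False
print(allow_cookie("tracker.example", state_only=True))    # True
```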

Be Transparent and Honest

All of the governing bodies have two goals: to increase the transparency of data collection and to let individuals decide how much data companies can have about them. Data privacy is a clear right in the EU and in California (one of the few states whose constitution calls out an inalienable right to privacy). Elsewhere in the US, that right is less clear. It isn’t explicitly enumerated in the US Constitution; the closest the framers came is the Fourth Amendment, but there the intent was clearly to keep government out of your personal effects, and the word privacy is never used. Extending that protection to business is fuzzier.

In the end, content creators will do what they’ve always done and morph their business to a model that makes money and passes regulatory requirements. For the short run, it will be important to watch how privacy laws are enforced. Right now, that enforcement is left almost exclusively to government entities (though the Illinois law has resulted in a lot of individual litigation).

Observing enforcement of HIPAA in the healthcare industry is a starting point. Because of limited funding for enforcers, fewer fines are being levied than originally expected, and investigations are reactive, not proactive. HHS and its Office for Civil Rights say that 98% of complaints are investigated, but HIPAA enforcement could certainly be much more aggressive. Egregious violations by the biggest players get most of the attention, while corner cases are left with warnings. And good intent in HIPAA compliance goes a long way.

The bottom line is that publishers collecting data on individuals will be regulated, and those regulations will result in rules on what companies can do—whether it be selling data to third parties or applying machine learning algorithms to anticipate consumer behavior—and what consumers can do to control their digital presence and protect their data privacy.

The goal of these laws is to create a more honest and open relationship between consumers and those who keep data on them, and media companies will be impacted greatly.