ADPPA And Why This Is A Blockbuster Of A Bill… Draft, That Is

Just about 10 years ago, on May 9, 2012, a class-action lawsuit was quietly settled between an individual and a massive video rental conglomerate, Blockbuster.

The company was collecting users’ video rental history along with credit card information. At the time, this violated the Video Privacy Protection Act (VPPA), which prohibits movie rental services from disclosing information about the movies people watch without their consent.

Six years later, in October 2018, streaming TV OEMs faced a similar claim. In that case, though, the manufacturers (Samsung, Sony, and LG) reigned supreme and avoided a similar fate. The court ruled in favor of Samsung, LG, and Sony, finding that the companies did not “disclose information that would, with little or no extra effort, permit an ordinary recipient to identify a particular person’s video-watching habits,” and that under the VPPA an IP address on its own was not sufficient to personally identify an individual.

Since then, the California Consumer Privacy Act (CCPA) and the soon-to-be-deployed California Privacy Rights Act (CPRA) have brought additional clarity, expanding the definition of PII to include IP addresses and any data used to “reasonably” identify an individual or household. Under these laws, businesses have an obligation to develop programs to manage and comply with the new requirements, and the updated CCPA regulations now include a “purpose limitation” clause, meaning separate consent is needed for any use beyond the stated purpose.

How is PII leveraged today?

To better digest this, we’ll use the connected television (CTV) industry, which is entering a golden era of addressable monetization, as an example. With the breakneck growth of subscription video-on-demand (SVOD), advertising-based video-on-demand (AVOD), and free ad-supported streaming TV (FAST) networks, the CTV advertising marketplace has become the hottest screen on which to deliver performance-based marketing and targeted campaigns (literally, down to the individuals in one household). Most of this is predicated on the ability to “match” PII at the household, device, and individual level using identifiers such as IP addresses (the identifier of choice for a CTV), the identifier for advertisers (IDFA), mobile advertising IDs (MAIDs), and email addresses.

Data clean room and “matching” providers then take this PII and run both probabilistic and deterministic models to determine whether an individual dwells within the household represented by the matched IP address. If it all matches up, the individual receives a targeted advertisement on their CTV from a brand. Another example occurs when an individual shops at a big-box retailer and joins its loyalty program, which combines the customer’s email address with their purchase/transaction history. That information is then run through a “clean room” matching process during an ad campaign, matching the customer’s hashed email to an IP address so that customized, relevant ads can be served on a FAST network.
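To make the matching step concrete, here is a minimal, illustrative Python sketch of a deterministic “hashed email to IP” join of the kind a clean room might run. Every record and field in it (the loyalty file, the CTV identity graph, the IP and device IDs) is hypothetical and stands in for the mechanics described above, not for any particular vendor’s implementation.

```python
import hashlib


def hash_email(email: str) -> str:
    """Normalize and hash an email the way many matching pipelines do (SHA-256 of the lowercased address)."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()


# Hypothetical retailer loyalty records: email plus purchase history (PII held by the brand).
loyalty_records = [
    {"email": "jane.doe@example.com", "purchases": ["dog food", "running shoes"]},
    {"email": "sam.smith@example.com", "purchases": ["coffee maker"]},
]

# Hypothetical CTV identity graph: hashed email -> household IP and device IDs,
# the kind of linkage a matching provider would hold on the other side of the clean room.
ctv_identity_graph = {
    hash_email("jane.doe@example.com"): {"ip": "203.0.113.42", "device_ids": ["ctv-8841"]},
}


def clean_room_match(records, identity_graph):
    """Deterministic match: hash the brand's emails and look them up in the CTV graph."""
    matches = []
    for record in records:
        key = hash_email(record["email"])
        if key in identity_graph:
            matches.append(
                {
                    "hashed_email": key,
                    "household_ip": identity_graph[key]["ip"],
                    "device_ids": identity_graph[key]["device_ids"],
                    "segment_hint": record["purchases"],
                }
            )
    return matches


# The resulting list is the audience a campaign would then target on the FAST network.
print(clean_room_match(loyalty_records, ctv_identity_graph))
```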

What could change under the new draft ADPPA?

On June 3, 2022, the federal government made public a draft of the American Data Privacy and Protection Act (ADPPA), which is headed to the House of Representatives. It reveals a few tell-tale signs of where the government and the FTC are heading. As it relates to the TV/video viewing and streaming marketplace, it could mean a monumental shift in how every video platform and video/CTV AdTech company thinks about the notions of “consent” and “sensitive data.”

There are a few key sections that clear up the confusion around the use of IP addresses and what constitutes “sensitive data.”

Sensitive Data: information revealing an individual’s access to or viewing of TV, cable, or streaming media services.

Any other covered data collected, processed, or transferred for the purpose of identifying sensitive covered data is also considered sensitive.

This means that when a consumer’s CTV viewing history (an IP address with session-level viewing insights) is combined with ANY other “covered data” such as a device ID, email, or home address, those data points become “sensitive data.” And if any of that data can be reasonably used to re-identify the consumer or their household, then it is NOT considered de-identified data either (sorry, hashed emails and synthetic IDs).
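To see why hashed emails get no pass here, consider this small, hypothetical Python sketch: any party holding a list of known email addresses (a CRM export, a leaked list) can hash its own list with the same algorithm and match it against the supposedly de-identified record, recovering exactly whose viewing data it is.

```python
import hashlib


def hash_email(email: str) -> str:
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()


# A "de-identified" record as it might appear in an ad log: no raw email, just the hash plus viewing data.
ad_log_record = {
    "hashed_email": hash_email("jane.doe@example.com"),
    "ip": "203.0.113.42",
    "shows_watched": ["Cooking Show", "True Crime Doc"],
}

# Hashing a list of known emails and comparing is all it takes to re-link the record to a person.
known_emails = ["sam.smith@example.com", "jane.doe@example.com"]
lookup = {hash_email(e): e for e in known_emails}

re_identified = lookup.get(ad_log_record["hashed_email"])
print(re_identified)  # -> jane.doe@example.com: the hash was a stable pseudonym, not true de-identification
```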

Also, according to the draft of the ADPPA: 

The Right to Consent and Object is defined as follows: Sensitive covered data may not be collected, processed, or transferred to a third party without the express affirmative consent of the individual to whom it pertains. Individuals must be provided the means to provide and withdraw consent… Covered entities engaged in targeted advertising must provide individuals with clear and conspicuous means to opt out prior to any targeted advertising and at all times afterward.

To translate this further: it essentially cleans up any misconception about whether an IP address (or any other identifier that could “reasonably” re-identify a household or individual) leveraged by video platforms and providers qualifies as PII. And if the draft ADPPA were law today, that data would be classified as “sensitive data” requiring “express affirmative consent” before it could be used.
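As a rough illustration of what that could mean operationally, here is a short Python sketch of an ad-decisioning check that only allows targeting when a still-valid, purpose-specific consent record exists. This is not language from the bill; the record structure and field names are assumptions made for the example.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional


@dataclass
class ConsentRecord:
    """Hypothetical consent record an ad platform might keep per device or household."""
    device_id: str
    purpose: str                      # e.g. "targeted_advertising"
    granted: bool
    granted_at: Optional[datetime]
    withdrawn_at: Optional[datetime]  # set when the individual withdraws consent


def may_target(record: Optional[ConsentRecord], purpose: str) -> bool:
    """Serve a targeted ad only if there is affirmative, un-withdrawn consent for this exact purpose."""
    if record is None:
        return False  # no record means no consent; fall back to a non-targeted ad
    if record.purpose != purpose or not record.granted:
        return False  # consent must be for the stated purpose (purpose limitation)
    if record.withdrawn_at is not None:
        return False  # individuals can withdraw consent at any time
    return True


consent = ConsentRecord("ctv-8841", "targeted_advertising", True, datetime(2022, 6, 10), None)
print(may_target(consent, "targeted_advertising"))  # True
print(may_target(consent, "analytics"))             # False: a different purpose needs its own consent
```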

According to the FTC, “affirmative express consent” means that:

  • Prior to the initial operation of any covered software, it shall be clearly and conspicuously disclosed, separate and apart from any “end user license agreement,” “privacy policy,” “terms of use” page, or similar document, the following:

    • For any covered software that displays advertising:

      • the fact that the covered software will display advertisements, including any pop-up advertisements; and

      • the frequency and circumstances under which such advertisements are displayed to the consumer; and

    • For any covered software that transmits, or causes to be transmitted, covered information to a person or entity other than the consumer:

      • the fact that the software will transmit, or cause to be transmitted, the covered information to a person or entity other than the consumer;

      • the types of covered information that will be transmitted to a person or entity other than the consumer;

      • the types of covered information that the receiving person or entity will share with third parties, which does not include an entity with common corporate ownership and branding of Respondent or the software provider, a third-party service provider, or any person or entity otherwise excluded.

While we now see opt-in prompts when we first set up our Samsung, VIZIO, Sony, or LG connected TVs, do those Notice & Disclosure prompts truly qualify as “Express Affirmative Consent”? Let’s keep in mind that I may have made a consent election when I initially set up my TVs, two or three years ago. I haven’t been shown any other Notice, Disclosure, or Consent prompt since. My election on one connected TV in the household doesn’t carry over to, and isn’t interoperable with, any other connected TV brand or IoT device in my household. So, opting OUT of data sharing on my VIZIO didn’t carry over to my Samsung TV.

As SafeGuard Privacy’s Co-Founder Wayne Matus states,

What’s conveyed here is that consent is temporal. It is for the current disclosed purpose, not permission for the life of a TV set. Perhaps this should happen each time the TV set is turned on, which might address the issue of consent coming from the actual person giving the consent. And the new draft federal law embodies this concept too. All of this flows from the GDPR’s concept that consent needs to be meaningful. Is consent really meaningful if given once, for the life of a TV, by one person, for whomever else uses it?

Beyond this, the average consumer doesn’t understand that individual streaming apps may have a different privacy policy and consent practice from the connected TV itself. A specific streaming app may not even request consent, and therefore targets the consumer in their household even though they thought they had opted out when they set their privacy elections while setting up their connected TV. This is a broken consumer experience and a broken consent model.

Is your head spinning yet? 

With all the state laws coming about and every industry clamoring for a sweeping federal law, consumers have reason to be optimistic. The stakes couldn’t be higher for companies grappling with changing consumer demands and data privacy acts; no industry is immune to these rules. Recently, the FTC made it even more real by fining Twitter $150M for violating data privacy rules. The company collected consumer data for security and compliance purposes, which is entirely legal and acceptable with consumer consent. However, it then dropped all of that data into its advertising machinery without first-person consent to do so, which is neither legal nor acceptable. This is one of the first shots fired, and it’s the reason every marketer, advertiser, brand leader, security/compliance officer, and C-suite executive is running full speed to examine current practices and change their approach.

And, if there’s one thing that has become abundantly clear, it’s that the notions of “Consent” and “Express Affirmative Consent” should be top of mind for every publisher, brand, video streamer, and advertiser as they think about how to compliantly collect qualified and valid consent from their consumers.