ADPPA's data minimization and duty of loyalty: a deep dive
Work in progress!
One of ADPPA's strong points is its focus on data minimization. This principle has been a bedrock of privacy law as one of the Fair Information Practices since the 1970s, and is also included in Europe’s GDPR and California’s CPRA.
ADPPA complements data minimization with a “duty of loyalty”, a powerful and relatively recent innovation in privacy law. As privacy scholars Neil Richards and Woodrow Hartzog write in We’re so close to getting data loyalty right:
Done correctly, duties of loyalty would change a company’s business incentives away from manipulative and exploitative practices toward long-term, sustainable and mutually beneficial information relationships between people and companies.
Of course, as widely-respected privacy scholar Prof. Daniel Solove says in Further Thoughts on ADPPA, the Federal Comprehensive Privacy Bill, “many parts of privacy laws have pretty-sounding rhetoric but ultimately are not any deeper”. Virginia’s and Utah’s privacy laws – which even industry lobbyists describe as “weak and loophole-ridden” – are also based on data minimization principles, but the wording is so weak that there’s virtually no protection.
So let’s dive more deeply into these relevant sections of ADPPA. I’ve included some of the key language from the current version of the bill, but if you want to follow along in the text, section numbers (§) refer to the July 19 ADPPA version as amended by the six amendments that passed. The redlined version from IAPP and Future of Privacy Forum, including all the amendments and highlighting changes from the subcommittee's version, is also very useful.
Data minimization and permissible purposes
A covered entity may not collect, process, or transfer covered data unless the collection, processing, or transfer is limited to what is reasonably necessary and proportionate to—
(1) provide, or maintain a specific product or service requested by the individual to whom the data pertains; or
(2) effect a purpose expressly permitted under subsection (b).
– American Data Privacy and Protection Act (ADPPA), §101(a)
ADPPA’s data minimization rules apply to everything that the businesses and non-profits it regulates do with any data it covers. There are actually some big exemptions hidden behind these definitions,* but put that to the side for now and focus on the entities and data that it does cover.
Once a company collects data for a specific purpose, ADPPA's data minimization rules say they can only process or transfer it for that purpose and for the other permissible purposes listed in §101(b).
Why have any permissible purposes beyond providing the individual with the product or service the individual requests? Well, imagine if companies had to get consent from everybody every time they analyzed an email to see whether it should go into a spam folder. Spammers wouldn’t give consent, so good luck with spam detection. And for the rest of us, you can imagine how this would be even more annoying than cookie dialogs.
We’ll go through the specific permissible purposes below, but first let’s talk about how the ADPPA’s “duty of loyalty” interacts with the data minimization requirements.
“Duty of loyalty”
As Richards and Hartzog discuss, ADPPA’s “duty of loyalty” (§102) only covers data minimization – just one piece of what a duty of loyalty really needs to cover, and the reason I put air quotes around it. Still, it’s certainly an important aspect, and ADPPA’s “duty of loyalty” adds some significant protections for sensitive data (defined in §2(28)).
Sensitive data is a fairly broad category under ADPPA, including health data; biometric and genetic information; “precise geolocation information”; credit card numbers and financial information; driver's license and passport numbers; photos, videos, and private communications including emails, voicemails and texts (unless they’re on employer-issued machines); and quite a few other categories. There are some important exceptions; for example, sex, sexual orientation, and immigration status aren’t considered sensitive data, and, as Californians for Consumer Privacy discusses, the definition of “precise geolocation information” (§2(24)) excludes location information from surveillance cameras and photos people take. Once again, though, let’s put these aside for now.
For sensitive data, ADPPA’s “duty of loyalty” adds several important protections to the basic data minimization rules:
- Prohibiting transferring (selling or sharing) sensitive data to third parties** without consent, with a few exceptions (listed in §102(3))
- A heightened standard of strictly necessary and proportionate (as opposed to just reasonably necessary and proportionate)
- Prohibiting collecting and processing sensitive data for targeted advertising or as part of a merger or acquisition***
- Allowing individuals to sue under the private right of action if their sensitive data is misused****
Looking at it differently, ADPPA’s “duty of loyalty” says that businesses and non-profits can collect or process even sensitive data, without asking consent, as long as it’s strictly necessary for permissible purposes (1)-(15).
And it's not just that they don't have to get your consent. There’s an exception to ADPPA’s opt-out rights (§204(b)(2)) which says you can’t even opt out of having your data collected, processed, and transferred for these purposes.
What could possibly go wrong?
The example permissible purpose I talked about above, spam detection, seems innocuous enough. Others, though, could potentially open up big loopholes. For example, §101(b)(2)(C) allows companies to collect and process data “to conduct internal research or analysis to improve a product or service for which such data was collected.” This also sounds innocuous, but here’s what Washington’s Attorney General Ferguson had to say in June about the effects of a slightly-differently-worded version of this in the June ADPPA discussion draft:
This broad exemption … may be used by technology companies to maintain all data indefinitely. For example, a company may deny all requests to delete biometric data or retain user photos (including of children) because the data is used to improve their photo tagging technology. Technology companies and their teams of corporate lawyers will defend their data processes, no matter how harmful, as internal research intended to improve the company’s products or services. In short, if Congress passes legislation with this exemption, it will undermine the entire purpose of data privacy legislation.
And remember, opt-out doesn’t apply. So no matter why a company’s collected your data, you can’t stop them from using it without your consent for “internal research or analysis” as long as they can claim the way they’re using it is reasonably necessary and proportionate to improving their product or service. If it’s not sensitive data, they can transfer it to third parties without your consent as well.
Ferguson’s comments were on an earlier draft of ADPPA and the current draft has a slightly narrower definition, but the lawyers I checked with didn't think the changes had fully addressed this issue. It’s a good example of how an innocuous-sounding permissible purpose could open up a major loophole.
Fraud and illegal activity
With that as background, let’s move on to a pair of permissible purposes that EFF expressed concern about in their June 14 letter (which, like AG Ferguson’s letter, applied to an earlier draft).
§101(b)(6) allows collecting and processing data (without consent or the ability to opt out) to prevent, detect, protect against or respond to fraud. In Privacy bill triggers lobbying surge by data brokers, Alfred Ng quotes a data broker’s deputy general counsel as saying that their lobbying was to help “ensure that fraud prevention products can continue providing meaningful consumer protections.” Ng also quoted staffers as saying the lobbying hasn't had any effect, but I'm not so sure: the latest version added new exceptions for fraud in the “duty of loyalty” and several other places.*****
Think about the kinds of companies that would want to take advantage of this exception. Data broker LexisNexis is currently being sued by immigration advocates for collecting, combining, and selling personal data without consent. Surveillance technology company Clearview AI was banned in May from selling facial recognition technology in the US after secretly taking facial recognition data from people without their consent, and just two weeks later introduced the laughably-named Clearview Consent, which "is all about making everyday consumers feel more secure in a world that is rife with crime and fraud." Facebook, Google, and other ad tech companies are concerned with advertising fraud. Companies in the connected vehicle data ecosystem focus on insurance fraud. The list goes on.
How much non-consensual data collection and processing will these companies be able to do as a result of this exception?
§101(b)(6) also allows collecting and processing data to prevent, detect, protect against or respond to illegal activity, which is defined as a felony or misdemeanor that causes harm.
In states which have criminalized abortion, does this mean that pregnant people's data can be collected and processed (without consent or the ability to opt out) if it's "strictly necessary" to prevent abortions?
What about states which have criminalized gender-affirming health care?
Trespass and public safety incidents
§101(b)(5) allows collecting and processing data (without consent or the ability to opt out) to prevent, detect, protect against or respond to security incidents. In the latest version,
- the definition of security incident in §101(b)(5) was broadened to add trespass.
- a new permissible purpose §101(b)(15) was added, allowing government entities to require contractors to process data to prevent, detect, protect against, or respond to public safety incidents, including trespass.
- and a new "duty of loyalty" exception, §102(3)(D), allows government contractors to transfer an individual’s sensitive data to third parties without consent if the transfer is necessary to prevent, detect, protect against, or respond to a public safety incident, including trespass, natural disaster, or national security incident.
What risk does this create in cities and states that use “trespass” laws to target unhoused people?
What about law enforcement agencies surveilling activists (who after all might cause a protest that escalates into a public safety incident)?
But wait, there’s more!
This version of ADPPA has 17 permissible purposes, up from 12 in the previous version. Most of them raise at least some questions. For example:
- Does §101(b)(7) (which allows companies to use data without consent “to investigate, establish, prepare for, exercise, or defend legal claims involving the covered entity or service provider”) mean that Google can use whatever you’ve got in your email and Google Docs to prepare for potential future legal claims against them?
- Anti-abortion “crisis pregnancy centers” have a “good faith” belief in imminent risk of serious harm caused by abortions. Does §101(b)(8) allow them to process any data they’ve obtained for other purposes to prevent or detect people who might be about to get an abortion – or health care providers who might be performing, or helping with, abortions? What about gender-affirming care? Does the similarly-worded “duty of loyalty” exception §102(3)(C) allow transferring (sharing and selling) this data?
- How broad are the exceptions for “public interest research” (§101(b)(10)), especially after the latest amendment adding an exemption for research that’s excluded from criteria of institutional review boards?
- §101(b)(13) lets companies transfer data in the context of mergers, acquisitions, and bankruptcies. While they do have to provide notice and a “reasonable opportunity” to request deletion, ADPPA's deletion rights have significant exceptions – for example, companies can ignore requests that interfere with "investigations, or reasonable efforts to guard against, detect, prevent, or investigate fraudulent, malicious, or unlawful activity." (§203(e)(3)(A)(vii)). What kinds of loopholes does this open up?
- §101(b)(16) covers first-party advertising and marketing, and §101(b)(17) covers third-party targeted advertising. Unlike the other permissible purposes, the “duty of loyalty” doesn’t currently let companies or non-profits use sensitive data for these purposes. Still, because they're permissible purposes, companies can use non-sensitive data for advertising without consent (although like CPRA, ADPPA does provide an opt-out for targeted advertising (§204(c))). What are the implications?
Cynicism is justified
Maybe I’m just being cynical to think that industries that make money by exploiting people’s data are trying to get loopholes into ADPPA that let them legally exploit as much data as possible without asking for consent. Perhaps data brokers and surveillance-industrial complex companies won’t try to take advantage of the exceptions for “internal research”, “trespass,” “fraud,” “illegal activity” and government contractors. Or maybe ADPPA’s standards of “strictly necessary” for sensitive data, and “reasonably necessary” for non-sensitive data are strong enough to prevent shenanigans.
But then again, as What Microsoft, IBM and others won as the privacy bill evolved and Privacy bill triggers lobbying surge by data brokers discuss, there’s been a heckuva lot of lobbying to weaken ADPPA since it was first introduced, with a fair amount of success – and the next version of ADPPA may be even weaker. Especially if you look at the past behavior of the entities this bill is trying to regulate – and the way big tech has pushed weak privacy legislation across the country – it’s clear that cynicism is justified.
So it would be great to see some more detailed legal analysis of ADPPA’s data minimization sections. If there are some issues, there’s still time to amend ADPPA; there may well be existing language in other bills worth looking at. For example, California’s CPRA limits the “serious harm” exception to sharing data with law enforcement (as opposed to ADPPA, which also allows sharing data with vigilantes), and puts in some process requirements for emergency requests. HIPAA similarly only allows sharing with law enforcement, and then only if there’s a warrant, subpoena, or summons. These all seem like potential improvements to ADPPA’s current protections.
Of course there’s also the political aspect: even if it turns out that improvements are needed to better protect people at risk, are the votes there? It’s hard to know. Stay tuned!
* For example, ADPPA exempts “de-identified” data, employee data including benefits information, publicly available information, and inferences from publicly available information (§2(9)). It also exempts government agencies, banks, airlines, and other businesses not regulated by the FTC Act (§2(9)). And, when a covered entity is collecting, processing, or transferring data on behalf of a government agency or another covered entity, they’re not considered a covered entity; they’re a service provider instead.
** Note that ADPPA's definition of "third party" specifically excludes affiliates. Also, when a service provider collects, processes, or transfers data on behalf of a business, non-profit, or government agency, they’re also not considered a third party. These are both potential cans of worms on their own – EFF has concerns that an amendment to the latest version gives service providers much more leeway than it should. But this writeup is already long and complex enough that I’ll save this discussion for a potential future post.
*** The “duty of loyalty” only allows collection and processing for permissible purposes (1)-(12) and (14), so these other purposes are excluded.
**** The private right of action doesn't apply to violations of data minimization (§209(e)(1)), but does apply to the “duty of loyalty”.
***** §102(1), §203(e)(1), §203(e)(3)(A)(vi), and §209(b)(2). The redlined version from IAPP and Future of Privacy Forum is very useful for looking at how the bill is changing over time!