And, here is this week's free edition of the Wavelength. Please consider joining those who have already subscribed.
The modified 143-page “American Data Privacy and Protection Act” (ADPPA) (H.R. 8152) requires multiple editions of the Wavelength. This one covers Titles I and II.
As the sponsors of ADPPA, three of the four so-called corners of the Commerce Committees in Congress, modify the package to keep stakeholders happy and onside, the bill has grown. Numerous changes and tweaks have altered many of the details even if the basic framework remains the same. In some places the bill’s protection of people in the United States (U.S.) is strengthened, while in others it is weakened.
The House Energy and Commerce Committee marked up and reported out a new version of ADPPA on 20 July. This article will begin the process of comparing the amended bill sent to the House with the version of ADPPA used at the 23 June subcommittee markup. There are significant changes, as one would expect from wide-reaching legislation, as the sponsors try to accommodate Members and external stakeholders. And, for the moment let us put aside the significant political roadblocks and examine the bill.
As mentioned, there are some very significant changes to the underlying bill, and so my analysis will be broken up across a number of editions. Yesterday, I examined just the definitions, and today I will look at Titles I and II.
Title I details the “duties” covered entities (CE) and, in many cases, service providers (SP) will have with regard to collecting, processing, and transferring the covered data (CD) of people in the United States (U.S.).
In Section 101, which details the “duty” of data minimization, a provision from the 23 June version is cut from subsection (a). 101(a)(2) permitted CEs to “deliver a communication that is reasonably anticipated by the individual recipient within the context of the individual’s interactions with the covered entity.”
Section 101(b), which lists the permissible purposes for which CEs may collect, process, and transfer CD so long as these acts are reasonably necessary and proportionate to the purpose, has been changed. First, the last four digits of credit card numbers are removed from the permissible purpose of conducting a transaction.
Next, the exception in 101(b)(2) is changed on how CEs may use CD for what may be generally considered using collected data for their own purposes (e.g. systems maintenance, product and service improvement, market research, and the like). The previous version of the bill permitted CEs to use legally collected CD “to maintain a product or service for which such data was collected.” The revised language permits CD to be used “to develop, maintain, repair, or enhance a product or service for which such data was collected.” The latter is an enlargement of this exception and may be used by unscrupulous or uninformed CEs in ways contrary to the letter and spirit of ADPPA. Supposedly, CEs will only utilize this permissible purpose in ways that are “reasonably necessary and proportionate.” Of course, what constitutes “reasonably necessary and proportionate” is debatable and will undoubtedly be interpreted differently by CEs and regulators. Having said that, such possibilities are precisely why strong enforcement is necessary if ADPPA is not to be just high-minded words in the U.S. Code.
In what should be agreeable to all, a new permissible purpose was added: to “fulfill a product or service warranty.”
The permissible purpose “[t]o prevent, detect, protect against, or respond to a security incident” is expanded to include physical security, life safety, and trespass. As mentioned before, how those terms are defined will be crucial. Moreover, what sort of duties will CEs have in preventing, detecting, protecting against, or responding to threats to physical security and life safety? Is this an affirmative duty requiring, say, Facebook or Twitter to monitor online abuse and do their best to stop doxing? This is all to be determined.
In the permissible purpose regarding research, the requirement was added that this permissible purpose adhere to the regulations “for human subject research established under part 46 of title 45, Code of Federal Regulations.”
101(b)(11) is clarified to stipulate that CEs may not use the exception permitting communication with a person for advertising. However, this provision is also changed from limiting its use to communications initiated by a person to permitting communication “if the communication is reasonably anticipated by the individual within the context of the individual’s interactions with the CE.” As with so much of ADPPA, and to be fair legislation generally, what “reasonably anticipated” means will determine the scope of this purpose.
There are other new permissible purposes. 101(b)(13) pertains to transferring assets, including CD, in a merger, acquisition, bankruptcy, or other similar transaction. New 101(b)(14) allows the use of CD in reasonably necessary and proportionate ways to effectuate Section 208’s data security and protection requirements.
New 101(b)(15) adds this new permissible purpose for SPs:
With respect to covered data previously collected in accordance with this Act, a service provider acting at the direction of a government entity, or a service provided to a government entity by a covered entity, and only insofar as authorized by statute, to prevent, detect, protect against or respond to a public safety incident, including trespass, natural disaster, or national security incident. This paragraph does not permit, however, the transfer of covered data for payment or other valuable consideration to a government entity.
I suspect this exception was vetted by Members and stakeholders concerned about civil liberties and privacy given the U.S. government’s significant use of the so-called third party doctrine that allows it to acquire information on people it might not be able to get without a warrant under the Fourth Amendment. However, it appears this permissible purpose for SPs’ contracting with any government entity could allow these agencies to circumvent the Fourth Amendment in the same fashion government agencies already are. Nonetheless, SPs may act in the event of “public safety incidents,” “natural disasters,” or “national security incidents.” Does this latter term include events like the 6 January insurrection and all the planning that occurred on social media platforms before that day? Does the first term include demonstrations and marches like those in 2020 for the Black Lives Matter movement?
New 101(b)(17) permits the use of CD for “targeted advertising” so long as the collection, processing, and transferring of CD is in accordance with ADPPA, especially Section 204(c) (i.e. the right to opt out of targeted advertising).
The Section 101(d) ban on deceptive marketing of products and services is changed, and now third parties are no longer subject to this prohibition.
A new 101(e) was added that makes clear that nothing in ADPPA can be construed to limit or diminish the First Amendment. This is unnecessary, for Congress cannot affect the Constitution through legislation.
Moving on to Section 102 (i.e. Loyalty Duties), (3)(D) is replaced with new, unrelated language. The former text pertained to the use of sensitive CD, specifically the transfer of biometric information, to facilitate data security or authentication. The new language mirrors the new language in Section 101(b)(15) (see above) regarding when an SP acting under orders from or on behalf of a governmental agency may transfer the sensitive CD of a person. There is a nominal bar on such transfers, but 102(3) details when this may happen. The new language gives SPs and governments wide latitude to transfer sensitive CD without a person’s consent or providing notice.
New Section 102(4) bars cable and satellite television providers and streaming services from transferring CD to unaffiliated third parties that “reveals the video content or services requested or selected by an individual from such service, except with the affirmative express consent of the individual or pursuant to one of the permissible purposes enumerated in [Section 101’s permissible purposes.]” And so, these entities could ask for consent or use one of the broad exceptions such as “to develop, maintain, repair, or enhance a product or service for which such data was collected.” In this latter scenario, Netflix or Comcast could transfer my CD about what I watch to a third party in order to improve their products or services.
It bears noting that old Section 102(4) prohibited the collection, processing, or transfer of “an individual’s aggregated internet search or browsing history” except with someone’s consent or under a permissible purpose. However, as noted in yesterday’s edition, “[i]nformation identifying an individual’s online activities over time and across third party websites or online services” has been added to the definition of sensitive CD. And yet, as discussed above, there are circumstances under which sensitive CD may be collected, processed, and transferred under an exception to the prohibitions on collecting or processing and transferring this type of information.
As mentioned yesterday, the Castor/Walberg amendment modifies Section 103 (i.e. Privacy By Design) with respect to the newly added class of people (covered minors, meaning those 16 and younger). The new language requires CEs and SPs to have practices and policies to mitigate privacy risks related to covered minors, with lesser responsibilities being placed on CEs that qualify as small businesses.
In the next two subsections, (a)(3) and (a)(4) (mitigating privacy risks of products and services, and training and safeguards to promote compliance with all privacy laws), there is new qualifying language that seems to massage the responsibilities some CEs and SPs would otherwise have to carry out: “taking into account the role of the covered entity or service provider and the information available to it.” This suggests a lesser level of responsibility may be possible, and how a regulator or a court makes this determination is likely to be fact-intensive.
The prohibition is strengthened regarding CEs using loyalty programs to coerce people not to exercise the rights bestowed by ADPPA:
A covered entity may not retaliate against an individual for exercising any of the rights guaranteed by the Act, or any regulations promulgated under this Act, including denying goods or services, charging different prices or rates for goods or services, or providing a different level of quality of goods or services.
The 23 June language barred conditioning products and services on people waiving rights. While some of the bill’s sponsors claimed CEs would not be able to charge different prices or provide a product or service if a person did not waive their rights, my read of the bill did not lead to this conclusion. CEs would have been able to do just that. However, with the revised language, that will prove harder but not impossible. A new section, 104(b)(6), permits CEs to decline to provide a service or product if it is necessary to collect and process CD. I expect many CEs will try to argue collection and processing is necessary, and present this claim to people as a fait accompli. How would a person argue against this without insight into the CE’s operations?
The revised Section 104 has a definition of “bona fide loyalty program” whereas the former bill did not: the term includes, but is apparently not limited to, “rewards, premium features, discount or club card programs.” Presumably this definition would work against CEs trying to shoehorn a loyalty program into Section 104 so they can take advantage of its exceptions.
Moving on to Title II, in Section 202, there is a clarification that large data holders (LDH) need not include in their logs of material changes to their privacy policies any such policies that predate enactment. In 202(f), there is a new requirement that a LDH’s short-form notice on CD practices not be misleading.
A requirement in Section 203(a) regarding how CEs must interact with SPs and Third Parties (TP) with respect to a person’s verified requests has been changed. In the 23 June bill, CEs were given the duty to “correct any verifiably material inaccuracy or materially incomplete information with respect to the covered data of the individual that is processed by the” CE. Moreover, in that draft, if instructed by the requester, CEs had to notify any SPs and TPs to which the CD was transferred of the corrected information. The new language substitutes “makes reasonable efforts to notify” for “notify,” meaning CEs have a lessened responsibility to make SPs and TPs aware of corrected information. If a person is looking to correct a material inaccuracy, this lessened standard will make doing so harder in the event a CE’s “reasonable efforts” do not result in SPs or TPs being actually notified. In 203(a)(3) (i.e. the right to delete) there is identical language reducing the duty from notifying to making reasonable efforts to notify, again making the privacy-minded person’s task that much harder.
In Section 203(c), the subsection on the timeline for CEs to respond to requests, there is new language that will likely result in more requests not being acted upon. LDHs, CEs, and CEs that are small businesses each have specified timelines under which they must complete requests, but there is a clause added to each subsection: “unless it is demonstrably impracticable or impracticably costly to verify such individual.” It is easy to imagine some CEs claiming that in many cases granting requests to exercise rights is either impracticable or “impracticably costly to verify” the person’s identity. This has been seen with respect to similar requests made under the “California Consumer Privacy Act” (AB 375). However, the FTC may be able to head off some of these issues through the notice and comment rulemaking it must undertake to effectuate Section 203.
Section 203(e) was changed to provide CEs another exception they may use to deny a person’s request to exercise their rights. The new 203(e)(1)(E) allows a CE to turn down requests if it “reasonably believes that the request is made to further fraud, support criminal activity, or the exercise of the right presents a data security threat.” It is not hard to foresee some CEs using this and other exceptions to create as much friction as possible to defeat people who want to exercise the rights ADPPA grants them.
However, there is new language in Section 203(e)(3) that may curtail abuse of the exceptions that allow CEs to turn down requests to exercise rights. This revised subsection mandates that CEs provide an “adequate explanation to the individual” if a request to use a right is wholly or partially denied. Yet another change, in Section 203(e)(3)(A)(ii), allows more wriggle room for CEs: requests can now be denied if they are “prohibitively costly” to comply with. In (e)(3)(C), it is stated that “the receipt of a large number of verified requests, on its own, may not be considered to render compliance with a request demonstrably impracticable,” one of the other grounds a CE may use to deny a request. It is notable this subsection does not address compliance with requests that are “prohibitively costly,” making this restriction on how CEs handle requests a limited one.
New 203(f) requires LDHs to track and publicly release metrics on how they handled requests to exercise rights during the previous year. Such information will give privacy advocates inside and outside the U.S. government valuable information that could inform regulation and enforcement, provided the data are accurate.
The direction to the FTC to promulgate regulations to implement Section 203 now requires that the agency consult with the U.S. National Institute of Standards and Technology (NIST) “for ensuring the deletion of covered data...where appropriate.”
Moving on to Section 204 (Right To Consent And Object), there are slight changes in subsection (b) regarding a person’s right to object to the transfer of her CD to a TP, but the substance remains the same. The same is true of (b)(2), which is an exception to the general rule that people can opt out of transfers of CD to TPs. There is new language clarifying that a person’s use of the “Do Not Collect” mechanism at the FTC regarding data brokers is not something CEs can use the exception to disregard. As matters stand, the (b)(2) exception moots the right to opt out of CD transfers so long as the CE is acting pursuant to a permissible purpose in Section 101(b).
The right to opt out of targeted advertising is modified, too. In 204(c), the language is changed from “[a] CE that engages in targeted advertising shall...” to “[a] CE or SP that directly delivers a targeted advertisement shall....” The former is broader with respect to CEs while the latter obviously sweeps in SPs. However, the potentially consequential part is the change from engaging in targeted advertising to directly delivering advertising. Under the previous language, a CE like the Gap would have responsibilities regarding all its targeted advertising. Under the new language, only targeted advertising the Gap itself directly delivers triggers the Section 204(c) duties. Hence, if the Gap pays an ad exchange to place an ad that targets my daughter on a search engine page she is using, then presumably the search engine is the CE and not the Gap. This seems to add players and steps to the process of opting out of targeted advertising. It would, of course, be simpler to add SPs to the 23 June language and remove some of the players. It would be simpler still to make targeted advertising opt in, but advertising and other stakeholders probably lobbied hard to make being shown targeted ads the default, placing the onus on people to change this dynamic. Nonetheless, new and revised language in (c) requires CEs to inform SPs of a person’s opting out, and the same is true of CEs or SPs directly delivering the targeted advertising with respect to alerting the entity doing the advertising.
Section 205(b)(2) creates an exception to the bar on CEs transferring the CD of covered minors to TPs if the entity is submitting information on potential child victimization to law enforcement or certain non-profits.
In 205(c), the start date for the new Youth Privacy and Marketing Division in the FTC is pushed back from 12 months to 24.
In the provisions on third-party collecting entities (TPCE) (aka data brokers), there is new language requiring that the notice of their nature on their websites not be misleading and be readily accessible. The previous version merely required that it be clear and conspicuous.
A new provision on the to-be-established Do Not Collect function hosted by the FTC would allow a TPCE to deny requests “from an individual who it has actual knowledge has been convicted of a crime related to the abduction or sexual exploitation of a child, and the data the entity is collecting is necessary to effectuate the purposes of a national or State-run sex offender registry.” The reasons for this change are obvious, and confining this discretion to an actual knowledge standard presumably reduces the chances of people being erroneously labeled as convicted of those crimes.
Section 206(c)’s penalties for TPCEs that do not register with the FTC or post notice of their status are increased from $50 to $100 a day, which still strikes me as an inadequate incentive for recalcitrant TPCEs to comply. And the cap of $10,000 is still laughably low, and some TPCEs might just pay the fines save for new language in (c)(2) making clear that the penalty language does not limit the FTC’s other authority granted in ADPPA. Presumably the agency could seek injunctive relief against non-complying TPCEs.
The constraints on LDHs’ use of algorithms that may cause harm are loosened in Section 207. In (c)(1)(A), regarding impact assessments of covered algorithms, the 23 June bill provided that “a large data holder that uses an algorithm that may cause potential harm to an individual, and uses such algorithm solely or in part, to collect, process or transfer covered data must conduct an impact assessment.” Now the trigger for LDHs to conduct impact assessments is the use of “a covered algorithm in a manner that poses a consequential risk of harm to an individual or group of individuals.” On the one hand, the harm threshold that requires an impact assessment is widened to include groups of individuals. On the other hand, “may cause potential harm” is broader than “a manner that poses a consequential risk of harm,” for the latter includes only harm of consequence (whatever this means) while the former included all “potential harm.”
In terms of the scope of impact assessments, LDHs would no longer need to explain why the algorithm they deploy is superior to others. The potential harms a LDH must mitigate have been expanded to include “disparate impact on the basis of individuals’ political party registration.” One can surmise this was added at the insistence of some Republicans who repeated an unproven claim that so-called Big Tech is biased against them and their viewpoints.
In terms of a CE or SP’s evaluation of algorithm design and the CD it causes to be collected, processed, or transferred, the criteria have been narrowed. Where previously these entities needed to evaluate the algorithm to reduce potential harms before deployment, now they would need to do so only with those algorithms designed to make “consequential decisions.” And so, any harm short of that related to consequential decisions would no longer need to be evaluated and reduced.
CEs and SPs that submit their impact assessments and evaluations under Section 207 would now be allowed to redact and segregate trade secrets and confidential and proprietary information with respect to public release. Moreover, the FTC has to treat these submissions the same way it treats information obtained from entities under Section 6 of the FTC Act, which requires the agency to keep this sort of information confidential. However, the FTC’s power to use these submissions for enforcement is expanded to allow for their use in enforcing consent orders.
There is a significant tweak to the data security and protection requirements in Section 208 that limits a CE or SP’s responsibilities to its “own system or systems.” In the same subsection, a new provision requires LDHs to reasonably investigate unsolicited reports of vulnerabilities. In (b)(2) there is a contraction of a CE or SP’s responsibilities for preventative and corrective action. The previous language required mitigation of reasonably foreseeable risks while the new language requires mitigation of the same “consistent with...the entity’s role in collecting, processing, and transferring the data.” This could allow for lesser data security and protection by a smaller player in a data ecosystem even if it is handling the same CD as entities that have larger roles. This could create weak spots in chains of CD that subsequently are easier to exfiltrate or access.
208(b)(4) adds a new requirement for SPs to establish practices to delete or return CD once services have been rendered.
There are two significant changes to the provisions on small businesses. First, a small business that is a CE is exempted from the requirement in Section 301(c) to appoint data security and privacy officers. Second, the second criterion for determining which CEs are small businesses is relaxed considerably: now, collecting and processing CD for “the purpose of initiating, rendering, billing for, finalizing, completing, or otherwise collecting payment for a requested service or product” no longer counts towards the 200,000 threshold.
In Section 210’s language on unified opt-outs, first-party advertising and marketing under 101(b)(16) are now exempted from a person exercising her rights in this fashion. There is a new 210(b) that spells out the requirements for a centralized opt-out mechanism: among other features, it must inform people about their opt-out choice, need not be the default choice but may be if people are given the opportunity to consent, must be easy to use, and must allow a CE or SP to have an authentication process.
The United Kingdom’s Competition and Markets Authority (CMA) announced it is proceeding with its investigation of the Microsoft/Activision Blizzard deal because it “is concerned that Microsoft’s anticipated purchase of Activision Blizzard could substantially lessen competition in gaming consoles, multi-game subscription services, and cloud gaming services (game streaming).”
The United States (U.S.) Department of the Treasury's Office of Foreign Assets Control (OFAC) amended “the Cyber-Related Sanctions Regulations and reissuing them in their entirety to further implement an April 1, 2015 cyber-related Executive order, as amended by a December 28, 2016 cyber-related Executive order, as well as certain provisions of the Countering America's Adversaries Through Sanctions Act.”
The United States (U.S.) Senate Judiciary Committee started and then stopped its markup of the “Journalism Competition and Preservation Act of 2021” (S. 673) when an amendment offered by Senator Ted Cruz (R-TX) was adopted that would limit news organizations in negotiations with companies like Meta or Google if they moderate content.
Five United States (U.S.) Democratic Senators wrote “the Consumer Financial Protection Bureau (CFPB), pushing the agency to better protect users of peer-to-peer payment applications (P2P apps) from scams.”
Australia’s eSafety Commissioner is asking for feedback “on draft industry codes intended to reduce the risk of illegal and harmful online content.”
Connecticut Attorney General William Tong “announced a settlement with Frontier Communications worth over $60 million to dramatically expand access to high-speed internet for Frontier customers in economically distressed communities, end a hidden monthly $6.99 internet surcharge, and force significant improvements in Frontier’s marketing and customer service.”
Indiana Attorney General Todd Rokita made public “a $15 million settlement with Frontier Communications that will ensure that Hoosiers receive the services for which they have paid.”
Washington Attorney General Bob Ferguson announced that “a King County Superior Court judge ruled that Facebook parent company Meta repeatedly violated Washington’s campaign finance transparency law.”
The United Kingdom’s (UK) Competition and Markets Authority (CMA) updated its “Music and streaming market study,” released a research paper titled “Online Choice Architecture: How digital design can harm competition and consumers,” and published its completed “market study into mobile ecosystems.”
United States (U.S.) President Joe Biden announced his “intent to appoint highly qualified and diverse industry and government leaders as members of the President’s National Infrastructure Advisory Council (NIAC), which advises the White House on how to reduce physical and cyber risks and improve the security and resilience of the nation’s critical infrastructure sectors” including a new chair and vice chair.
The United States (U.S.) Federal Communications Commission announced that its “Broadband Data Task Force (Task Force), together with the Wireline Competition Bureau (WCB) and Office of Economics and Analytics (OEA), announce that as of September 12, 2022, state, local, and Tribal governments, service providers, and other entities can begin to file bulk challenges to data in the Broadband Serviceable Location Fabric (Fabric), which serves as the foundation for the Broadband Data Collection (BDC) fixed availability maps.”
The White House’s Science and Technology Policy Office (OSTP) is requesting “information on how Federal agencies can better support collaboration with other levels of government, civil society, and the research community around the production and use of equitable data.”
The United States (U.S.) National Science Foundation and the Department of Defense “selected 16 multidisciplinary teams for the Convergence Accelerator program 2022 cohort for the research topic — Track G: Securely Operating Through 5G Infrastructure” per their statement.
Tweet of the Day
“Revealed: jailed Saudi woman was convicted of ‘spreading lies through tweets’” By Stephanie Kirchgaessner — The Guardian
“The T-Mobile / Sprint merger hasn’t created jobs — it’s cut thousands” By Jasmine Hicks — The Verge
“Amazon Care is dead, but the tech giant’s health-care ambitions live on” By Caroline O'Donovan — Washington Post
“Washington state judge rules Facebook violated campaign finance rules” By Naomi Nix — Washington Post
“These are the top 5 states with the best – and worst – fixed internet coverage” By Diana Goovaerts — Fierce Telecom
“Russia’s War on Ukraine Deepens International Cyber-Defense Cooperation” By Dustin Volz — Wall Street Journal
“No more snow days? NYC schools say remote learning eliminates the need.” By Julian Mark — Washington Post
“Twitter Ramps Up Fact-Checking Project Ahead of US Midterms” By Queenie Wong — C|Net
“Facebook whistleblower is still pushing for change” By Ina Fried — Axios
“How police work with Google to obtain cellphone location data for criminal investigations” By Ramon Padilla and Javier Zarracina — USA Today
“A Smartphone That Lasts a Decade? Yes, It’s Possible.” By Brian X. Chen — New York Times
“The Rise of Mobile Gambling Is Leaving People Ruined and Unable to Quit” By Maxwell Strachan — Vice
“Antitrust regulators expand their global reach.” By Ephrat Livni — New York Times
“Facebook Engineers Admit They Don’t Know What They Do With Your Data” By Lorenzo Franceschi-Bicchierai — Vice
§ 8 September
o The United States (U.S.) Federal Trade Commission (FTC) is “hosting a public forum regarding its Advanced Notice of Proposed Rulemaking (ANPR) on commercial surveillance and data security practices.”
§ 12 September
§ 13 September
§ 19 September
o The President's National Infrastructure Advisory Council will hold a meeting.
§ 29 September
§ 10 October
o The European Data Protection Board will hold a plenary meeting.
§ 11 October
o The European Data Protection Board will hold a plenary meeting.
§ 19 October
o The United States (U.S.) Federal Trade Commission (FTC) will hold a virtual event “to examine how best to protect children from a growing array of manipulative marketing practices that make it difficult or impossible for children to distinguish ads from entertainment in digital media.”
§ 26 October
o The United States (U.S.) Information Security and Privacy Advisory Board (ISPAB) will hold a meeting.
§ 27 October
o The United States (U.S.) Information Security and Privacy Advisory Board (ISPAB) will hold a meeting.
§ 1 November
o The United States (U.S.) Federal Trade Commission (FTC) will hold PrivacyCon.