This is the free edition of The Wavelength. Subscribe to get content like this three other times a week.
This week, Connecticut’s Senate passed an amended version of a data privacy bill, and that bill now moves to the House for possible consideration. This bill is stronger than the recently enacted data privacy law in Utah and bears some resemblance to legislation passed this year to change Virginia’s data privacy act. Whether it retains the features that are more appealing to privacy and consumer rights advocates remains to be seen. As matters stand, the bill has been weakened since its introduction, and there is every reason to believe that proponents of industry-friendly data privacy legislation will continue to advocate for such a bill.
As noted, a few days ago, the Senate took up SB 6 and added an amendment in the nature of a substitute. The original legislation was introduced in early February and went through committee consideration last month. At that point, stakeholders submitted their views on the first version of the bill, and here is a sampling of those views:
§ Connecticut Attorney General William Tong supported the bill
§ Consumer Reports supported the bill but suggested modifications
§ The Digital Advertising Alliance opposed the bill
§ Comcast proposed modifications
§ The State Privacy and Security Coalition (SPSC) proposed extensive modifications
It is noteworthy that none of the tech giants or other large stakeholders testified on SB 6 during committee consideration. This is most likely because a trade organization, the SPSC, did so on their behalf. According to one media account, the SPSC “describes itself as a coalition of leading tech, telecom, media and retail companies…[and] [m]embers include AT&T, Apple, Google, Amazon and Meta” (however, Apple has quit the organization since that piece was written because of concerns about weakening state privacy laws). The same piece quoted a lawyer retained by the organization as saying “I really want to be upfront about this and my hope that a Utah model could be copied in other states.” This makes sense, since Utah’s recently enacted “Consumer Privacy Act” (SB227) is the most industry-favorable of the data privacy bills that are now law (see here for more detail and analysis). But the group has come under criticism for serving as a front for large companies to influence legislation without having their names associated directly with what some would call anti-privacy and anti-consumer legislation.
The aforementioned is relevant to an analysis of SB 6 because when one compares the bill as introduced to the version passed by the Senate, it seems quite likely the SPSC and like-minded entities prevailed in what they would consider improving the bill. Consumer rights and privacy advocates probably do not agree. The bill may undergo further changes if the House takes it up.
Nonetheless, SB 6 uses the familiar controller/processor/third party structure that many other bills use and provides people similar rights, with similar exceptions that will tempt controllers to read the bill in ways that limit or defeat those rights.
In terms of definitions, one change that jumps out immediately is how the definition of “biometric data” changed from the first bill to the passed bill. Notably, biometric data is defined as “data generated by automatic measurements of an individual's biological characteristics, such as a fingerprint, a voiceprint, eye retinas, irises or other unique biological patterns or characteristics that are used to identify a specific individual.” This is standard as far as data privacy bills go, but what about data generated by means other than automatic measurement? Setting that concern aside, a new sentence was added specifying that the definition
does not include (A) a digital or physical photograph, (B) an audio or video recording, or (C) any data generated from a digital or physical photograph, or an audio or video recording, unless such data is generated to identify a specific individual.
As a result, it would appear that many current data collection and processing practices may fall outside what counts as biometric data. For example, Clearview AI has a massive collection of photographs it has scraped from the internet. Could the company argue that no data has been generated since the company merely collects photos and allows law enforcement agencies and other clients to check a person’s photo against the database? Conceivably, the company could make a colorable argument that it is not generating any data that identifies a specific person. The same goes for what the European Data Protection Board calls “virtual voice assistants” (e.g., Siri, Alexa, etc.). Could Apple or Amazon claim they are collecting audio to improve these services and not to identify specific people? Moreover, what if the companies argue the voice recordings are used to identify households? This seems like a significant change.
Nonetheless, biometric data is one class of “sensitive data,” which is subject to heightened requirements and protection. Hence, data falling outside the definition, as in the above examples, would not be subject to these heightened requirements.
To understand what, besides biometric data, counts as “sensitive data,” here is the rest of the definition:
personal data that includes (A) data revealing racial or ethnic origin, religious beliefs, mental or physical health condition or diagnosis, sex life, sexual orientation or citizenship or immigration status, (B) the processing of genetic or biometric data for the purpose of uniquely identifying an individual, (C) personal data collected from a known child, or (D) precise geolocation data.
Sensitive data includes personal data a controller collects from a person it knows to be under the age of 13. However, even though parental consent is needed to use this type of sensitive data, SB 6 does not use the consent mechanism it lays out elsewhere, deferring instead to the mechanism used in the “Children's Online Privacy Protection Act of 1998, 15 USC 6501 et seq., and the regulations, rules, guidance and exemptions adopted pursuant to said act.” Specifically, SB 6 provides that “[c]ontrollers and processors that comply with the verifiable parental consent requirements of COPPA shall be deemed compliant with any obligation to obtain parental consent.” When one references COPPA, one finds:
The term “verifiable parental consent” means any reasonable effort (taking into consideration available technology), including a request for authorization for future collection, use, and disclosure described in the notice, to ensure that a parent of a child receives notice of the operator’s personal information collection, use, and disclosure practices, and authorizes the collection, use, and disclosure, as applicable, of personal information and the subsequent use of that information before that information is collected from that child.
As we will see when we get to the consent and opt out rights for using sensitive data and other personal data, COPPA’s verifiable parental consent is a lower standard to meet.
Precise geolocation data is basically information obtained through technology that “directly identifies the specific location of an individual with precision and accuracy within a radius of one thousand seven hundred fifty feet” (which is about a third of a mile). Hence, this class of location data allows one to get a fairly good idea of where a person is and considerable insight into what the person is doing.
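The bill’s radius figure can be sanity-checked with quick arithmetic (an illustrative conversion of mine, not anything from the bill itself):

```python
# SB 6's "precise geolocation data" threshold is a 1,750-foot radius.
FEET_PER_MILE = 5280

radius_ft = 1750
radius_miles = radius_ft / FEET_PER_MILE

# 1,750 feet works out to roughly a third of a mile, as noted above.
print(round(radius_miles, 2))  # 0.33
```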
The definition of "process" or "processing" is broad and seems to encompass all private sector activity with respect to personal data. This term means “any operation or set of operations performed, whether by manual or automated means, on personal data or on sets of personal data, such as the collection, use, storage, disclosure, analysis, deletion or modification of personal data.”
The definition of the "Sale of personal data" seems to encompass sharing and disclosing personal data. SB 6 defines the practice as “the exchange of personal data for monetary or other valuable consideration by the controller to a third party.” First, this is a broader definition than those in other state bills, many of which pertain strictly to personal data exchanged for monetary consideration. SB 6 includes “other valuable consideration,” which would seem to sweep up many of the other ways data flows throughout the economy that do not involve actual cash changing hands. It bears some emphasis that there is a lengthy list of exceptions to what is a sale of personal data.
The scope of companies covered by SB 6 was narrowed during the legislative process. Entities that could be controllers or processors were originally those controlling or processing the personal data of 65,000 or more Connecticut residents in a year. The bill sent to the House moved this figure up to 100,000. For either threshold, SB 6 excludes personal data used solely for payment transactions. The other threshold that makes an entity a controller or processor did not change, and it is still controlling or processing the personal data of 25,000 or more residents and deriving 25% or more of gross revenue from selling personal data. As noted earlier, a sale need not involve money and could be the exchange of personal data for “other valuable consideration,” such as other personal data. Hence, this may be a tricky threshold to define. Certainly one can expect entities to challenge being labeled this second class of controller or processor by arguing about how much gross revenue comes from trading or sharing personal data.
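The two applicability thresholds reduce to a simple check. The sketch below is purely illustrative (the function name and inputs are mine); the numbers come from the bill as passed, with payment-only data excluded from the count:

```python
def covered_by_sb6(consumers_processed: int, pct_revenue_from_sales: float) -> bool:
    """Illustrative sketch of SB 6's applicability thresholds (bill as passed).

    consumers_processed: Connecticut residents whose personal data the entity
    controls or processes in a year, excluding data used solely for payment
    transactions.
    """
    # Threshold 1: 100,000 or more residents (raised from 65,000 as introduced).
    if consumers_processed >= 100_000:
        return True
    # Threshold 2: 25,000 or more residents AND 25% or more of gross revenue
    # derived from the sale of personal data.
    if consumers_processed >= 25_000 and pct_revenue_from_sales >= 25.0:
        return True
    return False

print(covered_by_sb6(120_000, 0.0))   # True (threshold 1)
print(covered_by_sb6(30_000, 40.0))   # True (threshold 2)
print(covered_by_sb6(30_000, 10.0))   # False
```

Of course, as noted above, the second prong may be hard to compute in practice, since a “sale” includes exchanges for “other valuable consideration” that may never appear as revenue in any conventional sense.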
Like many other data privacy bills, SB 6 exempts vast swaths of the economy that are currently regulated with respect to data privacy to varying degrees under federal and state laws. For example, any “financial institution or data subject to Title V of the Gramm-Leach-Bliley Act” and many of the entities subject to the Health Insurance Portability and Accountability Act of 1996 (HIPAA) do not need to comply with the new data privacy law. Nor do non-profit organizations or institutions of higher education. There is a separate but related carveout for certain “information and data,” which includes, among other things, “(1) Protected health information under HIPAA; (2) patient-identifying information for purposes of 42 USC 290dd-2; (3) identifiable private information for purposes of the federal policy for the protection of human subjects under 45 CFR 46.” This latter carveout also includes the “Fair Credit Reporting Act,” the “Driver's Privacy Protection Act of 1994,” and the “Family Educational Rights and Privacy Act.” These are becoming fairly standard exceptions in state data privacy laws and will perpetuate fragmented regulation of data privacy in the U.S.
Moreover, employment matters are also exempted through a narrowing of how a “consumer” is defined, which excludes “an individual acting in a commercial or employment context or as an employee, owner, director, officer or contractor of a company, partnership, sole proprietorship, nonprofit or government agency whose communications or transactions with the controller occur solely within the context of that individual's role with the company, partnership, sole proprietorship, nonprofit or government agency.” And so, would a person in Connecticut using a work computer to shop or surf the internet be a consumer? It appears so.
The customary data rights U.S. legislation bestows are in SB 6, with a few wrinkles worth explaining. Thus, “[a] consumer shall have the right to:
1) Confirm whether or not a controller is processing the consumer's personal data and access such personal data, unless such confirmation or access would require the controller to reveal a trade secret;
2) correct inaccuracies in the consumer's personal data, taking into account the nature of the personal data and the purposes of the processing of the consumer's personal data;
3) delete personal data provided by, or obtained about, the consumer;
4) obtain a copy of the consumer's personal data processed by the controller, in a portable and, to the extent technically feasible, readily usable format that allows the consumer to transmit the data to another controller without hindrance, where the processing is carried out by automated means, provided such controller shall not be required to reveal any trade secret; and
5) opt out of the processing of the personal data for purposes of
o targeted advertising,
o the sale of personal data, except as provided in subsection (b) of section 6 of this act, or
o profiling in furtherance of solely automated decisions that produce legal or similarly significant effects concerning the consumer.
The language in these rights that changed has been bolded to distinguish the bill as introduced from the bill as passed. And so, residents will have a right to confirm and access personal data a controller is processing but only if trade secrets are not revealed. If the right to repair fight provides any insight into how controllers will construe what is a trade secret, then we can expect a significant number of companies, especially those employing algorithms in processing, to claim that they cannot grant confirmation or access requests because trade secrets would be revealed. Or, another possibility is that such controllers might deliver incomplete and redacted information, or even just the raw personal data collected as opposed to the processed data. These possibilities exist for the right to request a copy of one’s personal data, too.
Moreover, under SB 6, one may opt out of data processing related to targeted advertising, the sale of personal data subject to exceptions discussed below, and profiling in order to make decisions with legal or similarly significant effects. For this last category, the underlying language was changed to restrict this right to “solely automated” decisions, suggesting that hybrid decisions under which there is a modicum of human decision-making are outside this right, and as a result, residents of Connecticut could not opt out of this type of processing.
The contours of the procedure by which residents can exercise the above rights are familiar. People would submit their requests to a controller, and the controller would get 45 days to act unless it determines another 45 days is necessary due to the complexity or volume of the person’s requests, at which point the time period could be unilaterally extended so long as notice is provided. However, there is a provision that seems to limit the 45-day extension, for it is provided that if a controller declines a request, it must do so within 45 days and explain why to the requester.
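The response clock might be sketched as follows. This is my own illustration of the 45-day window and the single 45-day extension; the bill speaks in days, not in any particular date arithmetic:

```python
from datetime import date, timedelta

WINDOW = timedelta(days=45)

def response_deadline(received: date, extended: bool = False) -> date:
    """Illustrative: a controller has 45 days to act on a request, plus one
    45-day extension when reasonably necessary (with notice to the requester).
    A declined request must still be answered, with reasons, within the
    initial 45 days."""
    deadline = received + WINDOW
    if extended:
        deadline += WINDOW
    return deadline

print(response_deadline(date(2022, 7, 1)))        # 2022-08-15
print(response_deadline(date(2022, 7, 1), True))  # 2022-09-29
```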
Originally, SB 6 allowed residents to make two free requests a year, but the bill was changed and now only one free request a year is permitted. Therefore, a privacy-minded resident who wanted a controller to delete her information monthly would face having to pay for this privilege through the imposition of a “reasonable fee,” which the controller would have to justify by “demonstrating the manifestly unfounded, excessive or repetitive nature of the request.”
There is problematic language common to state data privacy bills allowing controllers to decline any of the above rights if they cannot verify the requester’s identity using “commercially reasonable efforts.” In California, this has happened a significant percentage of the time, if this study is close to accurate. This will likely prove a problem in other states, for the more steps and friction in a process, the lower the chance people will follow through. However, in the event a controller declines a request, SB 6 requires notice directing the requester to provide additional information to verify the request. This seems like another opportunity for a reluctant controller to make mischief through unreasonable or detailed demands.
Be that as it may, controllers cannot require people to authenticate opt out requests to stop the use of personal data for targeted advertising, sale of personal information, or profiling through automated decision-making. And yet, there is a significant exception, for “a controller may deny an opt-out request if the controller has a good faith, reasonable and documented belief that such request is fraudulent.” This seems like very strange grounds on which to allow controllers to deny opt out requests because in my extensive reading on data privacy, I have yet to come across a person committing fraud so that another person’s personal data will not be sold. This may be along the lines of companies like Apple and Microsoft claiming they cannot provide parts and manuals for third party repair shops to fix smartphones because this would weaken the devices’ security. Additionally, this language was added to the original bill, which may suggest that entities positioned to benefit from such a carveout convinced lawmakers to add it. I would be interested in knowing why any other stakeholder would add such language. And yet, controllers must notify a person if they decline the opt out request because of fraud. However, the bill does not seem to have language requiring controllers to allow requesters to dispel any such concerns about fraud.
Next we come upon language that was likely inserted at the behest of industry stakeholders, for there is language in SB 6 that is almost exactly the same as recently enacted legislation that changed the “Virginia Consumer Data Protection Act” (VCDPA) (HB 2307/SB 1392). SB 6 provides an exemption for deletion requests if it pertains to personal data the controller did not collect directly from the requester, which is, in other words, data they have bought or otherwise obtained. Specifically, this section states:
A controller that has obtained personal data about a consumer from a source other than the consumer shall be deemed in compliance with a consumer's request to delete such data pursuant to subdivision (3) of subsection (a) of this section by (A) retaining a record of the deletion request and the minimum data necessary for the purpose of ensuring the consumer's personal data remains deleted from the controller's records and not using such retained data for any other purpose pursuant to the provisions of sections 1 to 11, inclusive, of this act, or (B) opting the consumer out of the processing of such personal data for any purpose except for those exempted pursuant to the provisions of sections 1 to 11, inclusive, of this act.
As I wrote this week of the same language in a bill signed earlier this month to change the VCDPA:
And so, it appears that controllers will be able to treat differently personal data obtained from third parties in the event a person requests that the controller delete the personal data it has. There are two options for controllers. For the first option, it sounds very much like controllers can do something less than delete the personal data. If a controller has personal data obtained from any source other than the person requesting deletion, it does not need to delete the information. However, the language used in this new passage of the VCDPA is a bit confusing. May a controller keep the deletion request on file plus the “minimum data necessary for the purpose of ensuring the consumer's personal data remains deleted from the business's records?” Or does the controller get to keep a record of the deletion request and a record of “the minimum data necessary for the purpose of ensuring the consumer's personal data remains deleted from the business's records?” This matters because in the former scenario, a controller would get to keep a minimal amount of personal data instead of deleting it all. Come to think of it, even in the second scenario, the controller still gets to keep some of a person’s personal data despite the deletion request. Admittedly, the next clause of the first option bars controllers from using the retained information “for any other purpose pursuant to the provisions of this chapter.” This seems definitive, but may a controller use the retained data to comply with a federal statute? It would appear so.
Moving on to the second exception to the right to delete, with respect to personal data obtained in any way except directly from the requester, controllers can also comply by not processing the personal data except for the excepted purposes in the VCDPA. However, the VCDPA has numerous exceptions that allow for processing in any number of situations, some likely to be utilized in ways that negate many of the rights residents of Virginia will have. For example, as I wrote when the bill became law, the VCDPA contains a long list of exceptions, including compliance with federal and state law and court orders and warrants. Many of these are fairly standard, but there are some that may lend themselves to creative, expansive interpretations by controllers and processors looking to get out of complying with the act, such as:
§ Prevent, detect, protect against, or respond to security incidents, identity theft, fraud, harassment, malicious or deceptive activities, or any illegal activity; preserve the integrity or security of systems; or investigate, report, or prosecute those responsible for any such action;
§ Conduct internal research to develop, improve, or repair products, services, or technology;
§ Effectuate a product recall;
§ Identify and repair technical errors that impair existing or intended functionality; or
§ Perform internal operations that are reasonably aligned with the expectations of the consumer or reasonably anticipated based on the consumer's existing relationship with the controller or are otherwise compatible with processing data in furtherance of the provision of a product or service specifically requested by a consumer or the performance of a contract to which the consumer is a party.
And so, controllers using the second exception to the right to delete could keep and continue processing some of a person’s personal data. This seems contrary to the idea of deletion, and seems to eviscerate the right with respect to personal data obtained through means other than directly from the person. The original SB 6 did not have this language.
Moreover, this exception only limits a controller’s use of the personal data with respect to processing but not selling or sharing the personal data. And so, in the event that a controller grants a person’s request to delete personal data, it would only be obligated to delete the information it collected from the person and could keep using the personal information it received from other sources like data brokers or other controllers.
Everything written above about the deletion exceptions in the VCDPA holds true for SB 6, especially with respect to the exceptions to data processing a controller may conduct. Moreover, I am left to wonder about the policy rationale for limiting a deletion request with respect to personal data not obtained directly from the requester. Presumably a person who wants his data deleted will want all of it deleted and would be surprised to find that not all of it is, and that it may be used for other purposes.
Also, SB 6 has a substantially similar change that mirrors the other bill in Virginia enacted this month that expands the definition of non-profit to include entities organized under Sections 501(c)(4), 501(c)(6), and 501(c)(12) of the Internal Revenue Code. The original version of SB 6 exempted only 501(c)(3) entities.
Returning to the procedure of making requests, controllers must have an appeals process, and for appeals that are denied, residents can file claims with the attorney general’s office. However, given how stretched most state attorneys general’s offices are, even if Connecticut’s Attorney General is a vocal proponent of privacy rights, his office cannot possibly investigate all such claims. Consequently, some controllers may decide to turn down requests and deny appeals in light of the low odds of investigation.
Like California’s privacy laws, in Connecticut, a person could designate someone else as an agent to exercise their privacy rights, notably through a universal opt out mechanism like a browser extension. However, this right has the same weakness as the other means for exercising privacy rights, with a new twist: not only would controllers need to verify the requester’s identity, they would also need to verify the agent’s authority to act on behalf of the person.
SB 6 limits controllers to data collection that “is adequate, relevant and reasonably necessary in relation to the purposes for which such data is processed.” Of course, what is “adequate, relevant and reasonably necessary” to one person may be something completely different to another, and the bill does not provide guidance on this issue. Moreover, there are no provisions in the bill for guidance or rulemaking to clarify matters like these, meaning it will be left to a Connecticut court to figure this out.
Additionally, controllers cannot process “personal data for purposes that are neither reasonably necessary to, nor compatible with, the disclosed purposes for which such personal data is processed, as disclosed to the consumer, unless the controller obtains the consumer's consent.” It seems likely controllers will inundate residents with requests to consent to processing of personal data outside the scope of the original collection and processing.
SB 6 has the usual language requiring controllers to implement reasonable security measures:
[A controller shall] establish, implement and maintain reasonable administrative, technical and physical data security practices to protect the confidentiality, integrity and accessibility of personal data appropriate to the volume and nature of the personal data at issue.
The bill limits the processing but not the selling of sensitive data, for it provides a controller cannot “process sensitive data concerning a consumer without obtaining the consumer's consent.” Again, some controllers may seek to wear down residents through repeated consent requests to process sensitive data. Additionally, as noted earlier, processing the sensitive data of known children would be subject to COPPA and its implementing regulations and not SB 6. Generally speaking, COPPA is less restrictive than SB 6, and so this represents a significant loophole.
Controllers must “provide an effective mechanism for a consumer to revoke the consumer's consent…that is at least as easy as the mechanism by which the consumer provided the consumer's consent.” This seems straightforward, but let’s see how well this requirement is complied with.
SB 6 bars controllers from offering different services or products if a person exercises her rights:
A controller shall not discriminate against a consumer for exercising any of the consumer rights…including denying goods or services, charging different prices or rates for goods or services or providing a different level of quality of goods or services to the consumer.
However, this prohibition against discriminating in terms of price or different services and products does not apply to loyalty or rewards programs so long as “the offering is in connection with a consumer's voluntary participation in a bona fide loyalty, rewards, premium features, discounts or club card program.” Notably, a sentence in the first version of SB 6 that barred the sale of a person’s personal data as part of a rewards or loyalty program if the person had opted out of a sale of their personal information was struck. It would stand to reason that this practice is now permitted, especially since the legal discrimination in terms of prices or service is tied to a person’s voluntary participation in these programs.
Controllers must provide “reasonably accessible, clear and meaningful privacy notice[s]” that disclose the categories of personal data processed, the purposes for processing, how one can use their rights, categories of personal data shared with third parties, and an email address or online mechanism people can use to contact the controller. There must be additional disclosure if the controller sells personal data or uses it for targeted advertising. Controllers must provide people with safe and reliable means to exercise their rights, including “a clear and conspicuous link on the controller's Internet web site to an Internet web page that enables a consumer, or an agent of the consumer, to opt out of the targeted advertising or sale of the consumer's personal data.” Moreover, by 1 January 2025, controllers must have the means to accept a person’s use of global privacy controls.
Controllers and processors must operate under contracts that detail the latter’s responsibilities and how it can help the former meet its legal obligations under SB 6. As in other data privacy bills, “[d]etermining whether a person is acting as a controller or processor with respect to a specific processing of data is a fact-based determination that depends upon the context in which personal data is to be processed.” The bill then adds that an entity processing personal data beyond the scope of a controller’s instructions shall itself be considered a controller with respect to that processing.
Controllers would be required to conduct data protection assessments for each processing activity that “presents a heightened risk of harm to a consumer” and make them available if the Attorney General requests them and they are relevant to an investigation. Specifically, the kinds of processing controllers would need to conduct these data protection assessments include:
§ The processing of personal data for the purposes of targeted advertising;
§ the sale of personal data;
§ the processing of personal data for the purposes of profiling, where such profiling presents a reasonably foreseeable risk of
o unfair or deceptive treatment of, or unlawful disparate impact on, consumers,
o financial, physical or reputational injury to consumers,
o a physical or other intrusion upon the solitude or seclusion, or the private affairs or concerns, of consumers, where such intrusion would be offensive to a reasonable person, or
o other substantial injury to consumers; and
§ the processing of sensitive data.
The purpose of data protection assessments is for a controller to understand the benefits and costs of processing that presents a heightened risk to people. Moreover, this process should also serve to find mitigation strategies. However, because the Attorney General can only access data protection assessments if relevant to an investigation, the only enforcer of SB 6 will necessarily have circumscribed insight into the risks in data processing. Finally, data protection assessments will not be required until 1 July 2023 and will be prospective in nature.
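Taken together, the triggers for a data protection assessment reduce to a short predicate. The sketch below uses my own parameter names, not anything in the bill:

```python
def requires_assessment(targeted_advertising: bool = False,
                        sale_of_personal_data: bool = False,
                        profiling_with_foreseeable_risk: bool = False,
                        sensitive_data: bool = False) -> bool:
    """Illustrative: any one of SB 6's enumerated heightened-risk processing
    activities obligates the controller to conduct a data protection
    assessment. Profiling only triggers the obligation where it presents a
    reasonably foreseeable risk of the harms the bill lists."""
    return (targeted_advertising
            or sale_of_personal_data
            or profiling_with_foreseeable_risk
            or sensitive_data)

print(requires_assessment(sensitive_data=True))  # True
print(requires_assessment())                     # False
```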
The level of knowledge a controller must have of a processor’s intent to violate the new law was relaxed. If a controller does not have actual knowledge of a processor’s intent, then the controller cannot be held legally responsible. This will likely result in a “don’t ask, don’t tell” approach to controller and processor relations so that the former can plead ignorance about a processor’s violations.
Controllers holding de-identified data must take reasonable steps to ensure the data cannot be associated with someone, publicly pledge not to re-identify the data, and obligate any recipients of de-identified data to meet the requirements of SB 6. The bill exempts controllers from requests from people to exercise rights if the controller
§ Is not reasonably capable of associating the request with the personal data or it would be unreasonably burdensome for the controller to associate the request with the personal data;
§ does not use the personal data to recognize or respond to the specific consumer who is the subject of the personal data, or associate the personal data with other personal data about the same specific consumer; and
§ does not sell the personal data to any third party or otherwise voluntarily disclose the personal data to any third party other than a processor…
Pseudonymous data is also exempted from the consumer rights established in SB 6 “in cases where the controller is able to demonstrate that any information necessary to identify the consumer is kept separately and is subject to effective technical and organizational controls that prevent the controller from accessing such information.”
Next we encounter the myriad exceptions common to U.S. data privacy bills that allow controllers and processors to disregard a number of the bill’s requirements and people’s rights regarding their personal data. Among those exceptions that may prove problematic are those that allow entities to:
§ take immediate steps to protect an interest that is essential for the life or physical safety of the consumer or another individual, and where the processing cannot be manifestly based on another legal basis;
§ prevent, detect, protect against or respond to security incidents, identity theft, fraud, harassment, malicious or deceptive activities or any illegal activity, preserve the integrity or security of systems or investigate, report or prosecute those responsible for any such action;
There are additional exceptions under which a controller’s or processor’s “ability to collect, use or retain data for internal use” shall not be restricted by the obligations of SB 6. Some of these may also be interpreted in ways contrary to the intent of the legislation, as controllers and processors could circumvent the bill’s requirements by claiming to:
§ Conduct internal research to develop, improve or repair products, services or technology;
§ effectuate a product recall;
§ identify and repair technical errors that impair existing or intended functionality; or
§ perform internal operations that are reasonably aligned with the expectations of the consumer or reasonably anticipated based on the consumer's existing relationship with the controller, or are otherwise compatible with processing data in furtherance of the provision of a product or service specifically requested by a consumer or the performance of a contract to which the consumer is a party.
Like other data privacy regimes, SB 6 has a right to cure that is limited in scope and duration. As under the recently amended VCDPA, the Attorney General will need to determine whether alleged violations can be cured. If so, a controller or processor gets 60 days to do so. If the violation is not cured, or the Attorney General determines that it cannot be cured, then the Attorney General may enforce under the state’s unfair trade practices statute, which permits seeking up to $5,000 per violation for willful conduct, as well as injunctive relief.
A 60-day cure period is longer than most other states provide, but SB 6 will only include this feature for the first 18 months after enactment. Thereafter, “the Attorney General may, in determining whether to grant a controller or processor the opportunity to cure an alleged violation…consider: (1) The number of violations; (2) the size and complexity of the controller or processor; (3) the nature and extent of the controller's or processor's processing activities; (4) the substantial likelihood of injury to the public; (5) the safety of persons or property; and (6) whether such alleged violation was likely caused by human or technical error.” These factors are all discretionary, and the Attorney General could decide not to use them in deciding whether to let a controller or processor cure a violation.
The United States (U.S.) Cybersecurity and Infrastructure Security Agency, Federal Bureau of Investigation, National Security Agency, Australian Cyber Security Centre, Canadian Centre for Cyber Security, New Zealand's National Cyber Security Centre, the United Kingdom's National Cyber Security Centre, and the United Kingdom's National Crime Agency issued a joint Cybersecurity Advisory (CSA) “to warn organizations that Russia’s invasion of Ukraine could expose organizations both within and beyond the region to increased malicious cyber activity…[that] may occur as a response to the unprecedented economic costs imposed on Russia as well as materiel support provided by the United States and U.S. allies and partners.”
The United States (U.S.) Department of Commerce announced “the appointment of 27 experts to the National Artificial Intelligence Advisory Committee (NAIAC), which will advise the President and the National AI Initiative Office on a range of issues related to artificial intelligence (AI).”
The European Data Protection Supervisor (EDPS) published its Annual Report 2021 that “highlights the EDPS’ achievements regarding European Union institutions’ (EU institutions) compliance with the data protection framework…[and] underscores the EDPS’ increasing role in advocating for the respect of privacy and data protection in EU legislation.”
Canada, Japan, South Korea, the Philippines, Singapore, Taiwan, and the United States established “a Global Cross-Border Privacy Rules Forum to promote interoperability and help bridge different regulatory approaches to data protection and privacy” and released FAQs.
United States Representative Ted Lieu (D-CA) introduced legislation, the “Warrant for Metadata Act,” “to require governmental entities to obtain a warrant before requesting that an electronic communications provider disclose a customer’s metadata, often referred to as data that describes other data.”
Virginia Governor Glenn Youngkin (R) signed SB 741 that “[a]uthorizes local law-enforcement agencies, campus police departments, and the Department of State Police (the Department) to use facial recognition technology for certain authorized uses,” which the Virginia chapter of the American Civil Liberties Union (ACLU) characterized as “a controversial bill that would lift the ban on the use of facial recognition technology without a warrant by local law enforcement agencies.”
The Joint Committee of the European Supervisory Authorities (ESAs) – EBA, EIOPA and ESMA – published “its 2021 Annual Report, providing a detailed account of its joint work completed over the past year.” The ESAs stated that “[t]he main areas of cross-sectoral focus continued to be joint risk assessment, enhancement of consumer protection, development of the regulatory and supervisory frameworks for sustainable finance and securitisation…[and] monitoring and contributing to the digital finance developments, supporting FinTech scale up through innovation hubs and sandboxes as well as cyber security completed the work programme.”
The United States (U.S.) Cybersecurity and Infrastructure Security Agency announced “the expansion of the Joint Cyber Defense Collaborative (JCDC) to include Industrial Control Systems (ICS) experts—security vendors, integrators, and distributors—to further increase U.S. government focus on the cybersecurity and resilience of industrial control systems and operational technology (ICS/OT)…[including] Bechtel, Claroty, Dragos, GE, Honeywell, Nozomi Networks, Schneider Electric, Schweitzer Engineering Laboratories, Siemens, and Xylem, as well as several JCDC Alliance partners.”
The United States (U.S.) National Telecommunications and Information Administration “is requesting comments on competition in the mobile application ecosystem.” NTIA stated that “[t]he data gathered through this process will be used to inform the Biden-Harris Administration's competition agenda, including, but not limited to, the Department of Commerce's work developing a report to submit to the Chair of the White House Competition Council regarding the mobile application ecosystem.”
United States (U.S.) Representative Tony Cárdenas (D-CA) and U.S. Senators Ben Ray Luján (D-NM), Bob Menendez (D-NJ) and Amy Klobuchar (D-MN) “led 17 of their colleagues in sending a letter urging Mark Zuckerberg, CEO of Meta, formerly Facebook, to increase platform moderation of Spanish-language disinformation on the war in Ukraine from Russian-owned media outlets.”
“Apple to roll out child safety feature that scans messages for nudity to UK iPhones” By Alex Hern — The Guardian
“Meta’s Sheryl Sandberg Pressured Daily Mail to Drop Bobby Kotick Reporting” By Ben Fritz, Keach Hagey, Kirsten Grind, and Emily Glazer — Wall Street Journal
“The gig workers fighting back against the algorithms” By Karen Hao and Nadine Freischlad — MIT Technology Review
“Hackers Claim to Target Russian Institutions in Barrage of Cyberattacks and Leaks” By Kate Conger and David E. Sanger — New York Times
“American Phone-Tracking Firm Demo’d Surveillance Powers by Spying on CIA and NSA” By Sam Biddle and Jack Poulson — The Intercept
“Obama says tech companies have made democracy more vulnerable” By Elizabeth Dwoskin and Eugene Scott — Washington Post
“Companies lose your data and then nothing happens” By Emily Stewart — Vox
“Locked-down, Shanghai residents skirt censorship to vent online” By Pranshu Verma — Washington Post
“As Europe Approves New Tech Laws, the U.S. Falls Further Behind” By Cecilia Kang — New York Times
“US DOJ probes Google's $5.4b Mandiant acquisition” By Jeff Burt — The Register
“Corporate Repair Initiatives Don’t Replace the Need for Right-to-Repair Laws” By Matthew Gault — Vice
§ 21 April
o The United States (U.S.) Federal Communications Commission (FCC) will hold an open meeting with this tentative agenda:
§ Improving Receiver Performance. The Commission will consider a Notice of Inquiry to promote more efficient use of spectrum through improved receiver interference immunity performance, thereby facilitating the introduction of new and innovative services. (ET Docket No. 22-137)
§ Wireless Emergency Alerts. The Commission will consider a Further Notice of Proposed Rulemaking seeking comment on proposals to strengthen the effectiveness of Wireless Emergency Alerts, including through public reporting on the reliability, speed, and accuracy of these alerts. (PS Docket Nos. 15-91, 15-94)
§ Restricted Adjudicatory Matter. The Commission will consider a restricted adjudicatory matter.
§ Restricted Adjudicatory Matter. The Commission will consider a restricted adjudicatory matter.
§ Enforcement Bureau. The Commission will consider an enforcement action.
§ 27 April
o The United States (U.S.) Federal Trade Commission (FTC) and U.S. Department of Justice will hold a listening forum on firsthand effects of mergers and acquisitions: media and entertainment.
§ 12 May
o The United States (U.S.) Federal Trade Commission (FTC) and U.S. Department of Justice will hold a listening forum on firsthand effects of mergers and acquisitions: technology.
§ 15-16 May
o The United States-European Union Trade and Technology Council will reportedly meet in France.
§ 16-17 June
o The European Data Protection Supervisor will hold a conference titled “The future of data protection: effective enforcement in the digital world.”
 "Sale of personal data" does not include (A) the disclosure of personal data to a processor that processes the personal data on behalf of the controller, (B) the disclosure of personal data to a third party for purposes of providing a product or service requested by the consumer, (C) the disclosure or transfer of personal data to an affiliate of the controller, (D) the disclosure of personal data where the consumer directs the controller to disclose the personal data or intentionally uses the controller to interact with a third party, (E) the disclosure of personal data that the consumer (i) intentionally made available to the general public via a channel of mass media, and (ii) did not restrict to a specific audience, or (F) the disclosure or transfer of personal data to a third party as an asset that is part of a merger, acquisition, bankruptcy or other transaction, or a proposed merger, acquisition, bankruptcy or other transaction, in which the third party assumes control of all or part of the controller's assets.