Iowa’s Data Privacy Bill Mirrors Utah’s


The Wavelength is moving! This week, you’ll get the same great content, but delivered from the Ghost platform via this email address. The move will allow for a price decrease because my costs will be lower.

And, it bears mention that content on technology policy, politics, and law that preceded the Wavelength can be found on my blog.

It appears that the data privacy bill the Iowa House of Representatives overwhelmingly voted to send to the state Senate is dead. Supposedly, because a Senate committee did not report out the bill before 18 March, it cannot proceed to Senate consideration before the Iowa legislature ends its current session in mid-April. Nonetheless, I have also read there are still ways to move legislation in the Senate. Consequently, it is not clear whether Iowa’s data privacy bill is still alive.

What is clear, however, is that Iowa’s bill is very similar to Utah’s recently passed data privacy bill, which is awaiting action by Governor Spencer Cox (R), in that both are very industry friendly. As with Utah’s bill, the bill sent to the Iowa Senate provides fewer rights than most of the other data privacy bills enacted in states or introduced in Congress, and certainly fewer than California’s operative and forthcoming laws. The passage of Utah’s bill and the advancement of Iowa’s bill seem to be part of a concerted effort by industry stakeholders to enact laws they favor in the states while Congress continues to deadlock on data privacy. Should industry stakeholders succeed in more states, they may change their position on a single national law to replace all state laws, or even abandon efforts to get a national law altogether. The odds of this scenario occurring increase if the new state laws are broadly similar, as is the case with the Utah and Iowa legislation.

A few of the definitions warrant review. First, a consumer is defined as “a natural person who is a resident of the state acting only in an individual or household context and excluding a natural person acting in a commercial or employment context” (emphasis added). This is language I have not seen before in a privacy bill, which is very curious. Does this mean that if a person in Iowa is using their work computer or device for work, they are not considered a consumer and are therefore not able to use the modest rights bestowed by the bill? That seems like a fair reading given that the term “consumer” is used throughout. If so, this represents a massive loophole that would allow the collection, processing, sale, and distribution of one’s personal data during work hours on work devices. On the other hand, the carveout for employment contexts itself is not new.

Like many others, Iowa’s bill uses the controller/processor framework and stipulates that whether an entity is one or the other will be a fact-intensive, case-by-case determination.

What counts as “personal data” deserves some scrutiny. The term “means any information that is linked or reasonably linkable to an identified or identifiable natural person” that is not “de-identified or aggregate data or publicly available information.” First, “pseudonymous data” would be considered personal data unless one somehow shoehorns that definition into aggregate data, which seems like a stretch. And, speaking of aggregate data, the definition is “information that relates to a group or category of consumers, from which individual consumer identities have been removed, that is not linked or reasonably linkable to any consumer.” The first thing to note is that this definition pertains to consumers and not “identified or identifiable natural person[s]” as the personal data definition does. Also, this is “information” that relates to a group of consumers from which “individual consumer identities have been removed.” The bill provides no guidance on the processes by which individual identities would be removed or even which data qualifies. Is it just names? Social Security numbers? Moreover, what “reasonably linkable” comes to mean will shape what “individual consumer identities” means.

Sensitive data is “a category of personal data that includes the following:

§  Racial or ethnic origin, religious beliefs, mental or physical health diagnosis, sexual orientation, or citizenship or immigration status, except to the extent such data is used in order to avoid discrimination on the basis of a protected class that would violate a federal or state anti-discrimination law.

§  Genetic or biometric data that is processed for the purpose of uniquely identifying a natural person.

§  The personal data collected from a known child.

§  Precise geolocation data.

This category of personal data cannot be processed without clear notice and an opportunity to opt out, which, of course, means that all remaining personal data can be processed without a person’s consent.

Those entities covered by Iowa’s law are people “conducting business in the state or producing products or services that are targeted to consumers who are residents of the state and that during a calendar year does either of the following:

§  Controls or processes personal data of at least one hundred thousand consumers.

§  Controls or processes personal data of at least twenty-five thousand consumers and derives over fifty percent of gross revenue from the sale of personal data.

Like most data privacy bills, there is a two-tiered threshold for potential controllers or processors to qualify once it has been established an entity conducts business in the state or is targeting products and services at Iowa’s residents. Under the first tier, controlling or processing the data of at least 100,000 residents makes an entity subject to the new law. Under the second tier, there is a lower threshold in terms of numbers, but the entity must derive more than 50% of its gross revenue from selling personal data. As a result, the largest tech companies in the U.S. would be subject to the law, as would many other large multinationals.
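To make the two-tier applicability test concrete, here is a minimal sketch in Python. The function name, parameters, and structure are my own illustration, not anything in the bill; it simply encodes the thresholds quoted above.

```python
def covered_by_iowa_bill(conducts_business_in_iowa: bool,
                         iowa_consumers_processed: int,
                         revenue_share_from_data_sales: float) -> bool:
    """Illustrative sketch of the bill's two-tier applicability test.

    revenue_share_from_data_sales is the fraction of gross revenue
    derived from the sale of personal data (0.0 to 1.0).
    """
    # Threshold question: does the entity conduct business in Iowa or
    # target products/services at Iowa residents?
    if not conducts_business_in_iowa:
        return False
    # Tier 1: controls or processes personal data of at least
    # 100,000 consumers.
    if iowa_consumers_processed >= 100_000:
        return True
    # Tier 2: at least 25,000 consumers AND more than 50% of gross
    # revenue derived from the sale of personal data.
    return (iowa_consumers_processed >= 25_000
            and revenue_share_from_data_sales > 0.5)
```

Note how a data broker processing the data of only 30,000 Iowans would still be covered under the second tier if most of its revenue comes from selling that data, while a retailer with the same number of consumers would not.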

This is a good spot to examine the muddy definition of “sale of personal data.” Initially, it seems quite clear, as it is defined as “the exchange of personal data for monetary consideration by the controller to a third party.” Like Utah’s bill, this would seem to rule out merely sharing or trading personal data and to focus on what has come to be seen as the more insidious practice of collecting, processing, and then selling personal data. But the bill also stipulates what the term does not include, which I will quote in full:

§ The disclosure of personal data to a processor that processes the personal data on behalf of the controller.

§ The disclosure of personal data to a third party for purposes of providing a product or service requested by the consumer or a parent of a child.

§ The disclosure or transfer of personal data to an affiliate of the controller.

§ The disclosure of information that the consumer intentionally made available to the general public via a channel of mass media and did not restrict to a specific audience.

§ The disclosure or transfer of personal data when a consumer uses or directs a controller to intentionally disclose personal data or intentionally interact with one or more third parties.

§  The disclosure or transfer of personal data to a third party as an asset that is part of a proposed or actual merger, acquisition, bankruptcy, or other transaction in which the third party assumes control of all or part of the controller’s assets.

Based on how the bill positively defines a sale of personal data, some of these exceptions seem unnecessary. For example, information a person has made widely available via mass media is already outside the definition of personal data, and hence a sale of information one scraped off of a public Twitter account would not qualify even without this exception.

As with many state and federal privacy bills, the Iowa legislation exempts a range of entities subject to existing federal laws such as the Health Insurance Portability and Accountability Act of 1996 (HIPAA), Gramm-Leach-Bliley, the Fair Credit Reporting Act, the Family Educational Rights and Privacy Act, and others. Consequently, most health care entities, financial services companies, and credit reporting agencies will be excluded from the bill along with a number of other entities.

Again, like Utah, the right to correct the personal data a controller is holding would not be among the rights consumers would get. Until now, this right has typically been included along with the rights to access, delete, or obtain personal data. This is not the case with Iowa’s bill. Also, as alluded to above, these rights are only for consumers, a term that excludes the commercial activities of Iowa residents. And so, if my reading is correct, one could not request that a controller delete personal data collected from a work phone. This is a huge loophole that will inevitably be exploited to the greatest extent possible, especially with the other right consumers would get: to opt out of targeted advertising or the sale of personal data.

Like Utah’s bill, Iowa’s bill establishes a process that consumers can use to exercise these rights, but the process is likely to be used in ways that defeat those rights. For example, while entities have 45 days to respond to or act on requests, with the entity being allowed to unilaterally extend this period by another 45 days, these entities may merely decline to take action so long as the reasons are explained. The bill is silent on what some of those reasons might conceivably be, but the only recourse for consumers is an appeals process each entity must provide and then a complaint to the attorney general. If the office of the attorney general is not bothered or lacks the resources, then the consumer is out of luck, for there is no private right of action. Moreover, an easier way for entities to defeat requests to exercise one’s rights is through the authentication process. The bill provides that “[i]f a controller is unable to authenticate a request using commercially reasonable efforts, the controller shall not be required to comply with a request to initiate an action under this section.” As has been the case in California under the California Consumer Privacy Act, research has shown that some companies are making it close to impossible to exercise some of the rights granted in that state. It is foreseeable that entities in Iowa will seek to exclude as many requests as possible on the grounds they cannot be authenticated. Moreover, if an entity makes this decision, it is not obligated to give a consumer a chance to provide additional information. Again, a consumer would be left with an appeals process and complaining to the Iowa attorney general.

Moreover, there are additional grounds upon which controllers and processors can reject the requests of consumers to exercise rights. Even if a controller or processor authenticates a request, under the following circumstances, it may decline to comply:

§ The controller is not reasonably capable of associating the request with the personal data or it would be unreasonably burdensome for the controller to associate the request with the personal data.

§ The controller does not use the personal data to recognize or respond to the specific consumer who is the subject of the personal data, or associate the personal data with other personal data about the same specific consumer.

§  The controller does not sell the personal data to any third party or otherwise voluntarily disclose the personal data to any third party other than a processor, except as otherwise permitted in this chapter.

I fear the first two exceptions will be offered as reasons why requests cannot be met, as they are potentially low-risk means of defeating the exercise of these rights. The worst-case scenario for an entity willing to push non-compliance within limits would be written notice from the attorney general of possible violations the entity could cure through a written statement of compliance within 30 days. The third category would seem designed to encourage companies not to sell or disclose personal data to third parties. However, it seems inexplicable that such entities would be rewarded with the privilege of being allowed to decline requests to delete or access personal data.

There is the usual language on data security. Controllers must “adopt and implement reasonable administrative, technical, and physical data security practices to protect the confidentiality, integrity, and accessibility of personal data….appropriate to the volume and nature of the personal data at issue.” However, there is a caveat that:

A controller shall not process sensitive data concerning a consumer or a nonexempt purpose without the consumer having been presented with clear notice and an opportunity to opt out of such processing, or, in the case of the processing of sensitive data concerning a known child, without processing such data in accordance with the federal Children’s Online Privacy Protection Act…

And so, consumers must be given clear notice and the chance to opt out of the processing of sensitive personal data. This is another right given to the residents of Iowa outside their commercial activities. However, it is not clear what is meant by “[a] controller shall not process sensitive data concerning…a nonexempt purpose.” Presumably, this means that before a controller may process sensitive personal data for some purpose not exempted from the bill (and Iowa’s bill has the usual exemptions), the entity must give consumers the opportunity to opt out. I think this could be drafted more clearly.

Controllers are barred from processing personal data in ways that violate state or federal anti-discrimination laws. Additionally, entities controlling personal data cannot “discriminate against a consumer for exercising any of the consumer rights…, including denying goods or services, charging different prices or rates for goods or services, or providing a different level of quality of goods and services to the consumer.” But, controllers are not required “to provide a product or service that requires the personal data of a consumer that the controller does not collect or maintain.” This is a puzzling exception, and I am hard pressed to think of reasons why this language is included. I would have to guess an entity or group of entities lobbied for its inclusion.

The bill also makes clear that the new privacy regime does not “prohibit a controller from offering a different price, rate, level, quality, or selection of goods or services to a consumer, including offering goods or services for no fee, if the consumer has exercised the consumer’s right to opt out [of the sale of personal data or targeted advertising].” This sounds like once a consumer has opted out of the sale of their personal data or targeted advertising entities can try and induce them to opt back in for a reduced price or no price. Likewise, entities may discriminate in terms of price, product, or service if it is related to “a bona fide loyalty, rewards, premium features, discounts, or club card program.”

Naturally, controllers must inform consumers of their privacy policies. The bill provides that “[a] controller shall provide consumers with a reasonably accessible, clear, and meaningful privacy notice that includes the following:

§ The categories of personal data processed by the controller.

§ The purpose for processing personal data.

§ How consumers may exercise their consumer rights pursuant to section 715D.3, including how a consumer may appeal a controller’s decision with regard to the consumer’s request.

§ The categories of personal data that the controller shares with third parties, if any.

§ The categories of third parties, if any, with whom the controller shares personal data.

The bill spells out the duties of a processor, which include the following insofar as they are “reasonably practicable:”

§ To fulfill the controller’s obligation to respond to consumer rights requests…

§  To meet the controller’s obligations in relation to the security of processing the personal data and in relation to the notification of a security breach of the processor…

Processors and controllers must operate under a contract to “govern the processor’s data processing procedures with respect to processing performed on behalf of the controller.” Moreover, “[t]he contract shall also include requirements that the processor shall do all of the following:

§ Ensure that each person processing personal data is subject to a duty of confidentiality with respect to the data.

§ At the controller’s direction, delete or return all personal data to the controller as requested at the end of the provision of services, unless retention of the personal data is required by law.

§ Upon the reasonable request of the controller, make available to the controller all information in the processor’s possession necessary to demonstrate the processor’s compliance with the obligations in this chapter.

§  Engage any subcontractor or agent pursuant to a written contract in accordance with this section that requires the subcontractor to meet the duties of the processor with respect to the personal data.

The bill has a long list of exemptions from its requirements that is fairly standard by now, including the following which I think are flexible enough to tempt controllers and processors into using them to circumvent the requirements of the bill:

§ Take immediate steps to protect an interest that is essential for the life or physical safety of the consumer or of another natural person, and where the processing cannot be manifestly based on another legal basis.

§ Prevent, detect, protect against, or respond to security incidents, identity theft, fraud, harassment, malicious or deceptive activities, or any illegal activity.

§ Preserve the integrity or security of systems.

§  Investigate, report, or prosecute those responsible for any such action.

The bill has additional exemptions from its obligations that will also likely be used to circumvent its protections and requirements. It is stated that “[t]he obligations imposed on a controller or processor under this chapter shall not restrict a controller’s or processor’s ability to collect, use, or retain data as follows:

§ To conduct internal research to develop, improve, or repair products, services, or technology.

§ To identify and repair technical errors that impair existing or intended functionality.

§ To perform internal operations that are reasonably aligned with the expectations of the consumer or reasonably anticipated based on the consumer’s existing relationship with the controller or are otherwise compatible with processing data in furtherance of the provision of a product or service specifically requested by a consumer or parent or guardian of a child or the performance of a contract to which the consumer or parent or guardian of a child is a party.

The last category seems especially ripe for use, for controllers and processors have a monetary incentive to read as broadly as they can “the expectations of the consumer or reasonably anticipated based on the consumer’s existing relationship with the controller.”

However, there do seem to be limits on how personal data are processed. The bill provides that a controller’s data processing activities must be:

§ Reasonably necessary and proportionate to the purposes listed in this section.

§  Adequate, relevant, and limited to what is necessary in relation to the specific purposes listed in this section.

There is no private right of action. The bill says this in two different ways, one explicit and one implicit. The state attorney general is the only entity capable of enforcing the new requirements, and he or she must give a covered entity thirty days to cure violations before enforcement can commence. Specifically, the attorney general must give written notice of the specific provisions being violated, and if the covered entity “provides the attorney general an express written statement that the alleged violations have been cured and that no further such violations shall occur” then the attorney general cannot act. However, if the covered entity violates the express written statement, then the attorney general can proceed to prosecute, and the same is true if the entity never cured the violations to begin with. It bears mention the attorney general can issue civil investigative demands to entities if he or she reasonably believes there has been a past violation, is a current violation, or may be a future violation. The attorney general can seek injunctive relief and up to $7,500 in civil penalties per violation.

Other Developments


As part of its antitrust suit, the United States Department of Justice filed a motion to sanction Google and compel disclosure of materials the company is allegedly claiming are privileged as attorney-client communications.

Gina Cass-Gottlieb has begun her term as chair of the Australian Competition and Consumer Commission (ACCC) and has replaced outgoing chair Rod Sims.

The United States (U.S.) Federal Communications Commission announced “the launch of a mapping tool that can be used to help assess whether and to what extent there is unassigned 2.5 GHz spectrum available in any U.S. county.”

Seventeen Democratic members of the United States (U.S.) House Energy and Commerce Committee’s Communications and Technology Subcommittee wrote to National Telecommunications and Information Administration (NTIA) head Alan Davidson, “outlining their recommendations and priorities as the agency implements the broadband programs in the Bipartisan Infrastructure Law.”

The United States (U.S.) Federal Communications Commission Enforcement Bureau “warned three more voice service providers that are apparently transmitting illegal robocalls on their networks that they have 48 hours to stop facilitating this traffic or face all their traffic being blocked by other providers.”

Automattic, Jodel, Twitter, and Vimeo launched the Open Internet Alliance, “an informal group of companies that advocates for fair and progressive regulation in Europe and globally,” because “[w]ith the Digital Services Act and the Digital Markets Act, the EU has an opportunity to create laws that make the internet a safer and healthier place and a more competitive environment for companies of all sizes.” The companies said “[w]e’re concerned that these laws are shaping up to make the largest and most powerful companies even larger and more powerful.”

Germany’s Bundesamt für Sicherheit in der Informationstechnik (BSI) recommended “that consumers always use 2FA if the respective online service allows it.”

Australia’s Attorney-General named Leo Hardiman as the country’s new Freedom of Information Commissioner.

The United States (U.S.) Federal Communications Commission (FCC) announced the next 5G mid-band spectrum auction, set for 29 July.

Further Reading


“Keeping the Wrong Secrets” By Oona A. Hathaway — Foreign Affairs

“Lawsuit Highlights How Little Control Brokers Have Over Location Data” By Jon Keegan and Alfred Ng — The Markup

“Elon Musk’s Starlink is keeping Ukrainians online when traditional Internet fails” By Rachel Lerman and Cat Zakrzewski — Washington Post

“Lawsuit says Google discriminates against Black workers” By Barbara Ortutay — Associated Press

“YouTube Is a Huge Classroom Distraction. Teachers Are Reluctant to Banish It.” By Julie Jargon — Wall Street Journal

“Justice Department accuses Google of hiding business communications” By Ashley Gold — Axios

“Meta's antitrust defense: Blizzard of subpoenas” By Margaret Harding McGill — Axios

“China’s Big Tech Firms Are Axing Thousands of Workers” By Yoko Kubota — Wall Street Journal

“'This Is Really, Really Bad': Lapsus$ Gang Claims Okta Hack” By Lily Hay Newman — WIRED

“The Latecomer’s Guide to Crypto” By Kevin Roose — New York Times

“‘Kill more’: Facebook fails to detect hate against Rohingya” By Victoria Milko and Barbara Ortutay — Associated Press

“Big Tech boosts lobbying spending in Brussels” By Pietro Lombardi — Politico EU

“How two years of working from home changed workers around the world” By Najwa Jamal and Cengiz Yar — Rest of the World

Coming Events


§  22 March

o   The United Kingdom’s (UK) House of Lords Science and Technology Committee will hold a formal meeting (oral evidence session) as part of its inquiry into the Government’s plans to deliver a UK science and technology strategy.

o   The United Kingdom’s House of Commons General Committee will hold two formal meetings on the “Product Security and Telecommunications Infrastructure Bill,” “A Bill to make provision about the security of internet-connectable products and products capable of connecting to such products; to make provision about electronic communications infrastructure; and for connected purposes.”

o The United States (U.S.) Senate Commerce, Science, and Transportation Committee will mark up a number of bills:

§  The “Martha Wright-Reed Just and Reasonable Communications Act of 2021” (S. 1541) “To amend the Communications Act of 1934 to require the Federal Communications Commission to ensure just and reasonable charges for telephone and advanced communications services in correctional and detention facilities.”

§  The “Next Generation Telecommunications Act” (S. 3014) “To establish the Next Generation Telecommunications Council, and for other purposes.”

§  “Reese’s Law” (S. 3278) “To protect children and other consumers against hazards associated with the accidental ingestion of button cell or coin batteries by requiring the Consumer Product Safety Commission to promulgate a consumer product safety standard to require child-resistant closures on consumer products that use such batteries, and for other purposes.”

§  The “Low Power Protection Act” (S. 3405) “To require the Federal Communications Commission to issue a rule providing that certain low power television stations may be accorded primary status as Class A television licensees, and for other purposes.”

§  23 March

o   The United Kingdom’s House of Commons’ Science and Technology Committee will hold a formal meeting (oral evidence session) in its inquiry on “The right to privacy: digital data.”

o   The United States (U.S.) Senate Commerce, Science, and Transportation Committee will hold a hearing that “will examine the correlation between American competitiveness and semiconductors; the impact of vulnerabilities in our semiconductor supply chains; and the importance of CHIPS legislation within the U.S. Innovation and Competition Act (USICA) of 2021 and the America COMPETES Act of 2022.”

§  24 March

o   The United Kingdom’s (UK) House of Lords Fraud Act 2006 and Digital Fraud Committee will hold a formal meeting (oral evidence session) regarding “what measures should be taken to tackle the increase in cases of fraud.”

o   The United Kingdom’s House of Commons General Committee will hold two formal meetings on the “Product Security and Telecommunications Infrastructure Bill,” “A Bill to make provision about the security of internet-connectable products and products capable of connecting to such products; to make provision about electronic communications infrastructure; and for connected purposes.”

§  29-30 March

o The California Privacy Protection Agency Board will be holding “public informational sessions.”

§  31 March

o   The United Kingdom’s (UK) House of Lords Fraud Act 2006 and Digital Fraud Committee will hold a formal meeting (oral evidence session) regarding “what measures should be taken to tackle the increase in cases of fraud.”

§  6 April

o   The European Data Protection Board will hold a plenary meeting.

§  15-16 May

o   The United States-European Union Trade and Technology Council will reportedly meet in France.

§  16-17 June

o   The European Data Protection Supervisor will hold a conference titled “The future of data protection: effective enforcement in the digital world.”