EDPB Proposes Guidance On Dark Patterns In Social Media


You can find older editions of Wavelength here and before that on my blog.

Years ago, a friend and I were walking in Washington, DC when we saw a grand mansion likely built in the late 19th century. And while the one-time owner was associated with a famous cult or religion, depending on one's perspective, what drew our eyes was the promise of a tour. So, we entered and were told tours started after guests watched a short video. Well, after 45 minutes of watching a very hard sell of said cult/religion, I was no longer interested in a tour. When the film ended, I was out of my seat and heading toward the door with my friend in tow. One of the disciples opened the door and was surprised to see me already there. I brushed past this person and started asking where the exit was, but none of the 5-7 disciples assembled to bring us into the cult/religion would answer. So, I looked around, saw an open door, and led my friend out of the building quite without any help from the people inside.

You may be asking why I’m starting today’s edition of the Wavelength with this seeming non-sequitur. The reason is that reading the European Data Protection Board’s (EDPB) new guidelines on dark patterns in social media under the General Data Protection Regulation (GDPR) reminds me of this experience in a number of ways. One can count on misdirection, deception, and hiding the ball when one wanders into social media. Recently, we decided to end one of our evergreen subscriptions with a major United States (U.S.) tech giant, and, in contrast to the ease of signing up, we had to wander through page after page of content that was more confusing than helpful before we found out how to pull the plug on the service.

Of course, the use of dark patterns online is not new, but regulators are paying more mind to how they can frustrate users and deny them their rights. Last week, an edition of the Wavelength examined a bill in California based on the United Kingdom’s age-appropriate design code that addresses, in part, the use of dark patterns. And so it is in the European Union, too, with the EDPB issuing its “Guidelines 3/2022 on Dark patterns in social media platform interfaces: How to recognise and avoid them” for public consultation. Additionally, this is not the first time the EDPB has issued guidelines specific to social media companies.

In terms of the guidelines’ structure, the Board organizes dark pattern tactics into categories and explains them, cites and discusses the relevant GDPR provisions, provides examples of how dark patterns can be used throughout the life cycle of a social media account, makes best practice recommendations, and offers a checklist for social media platform designers.

While the EDPB’s guidelines pertain directly to social media platforms, it is not hard to see their applicability to any controller subject to the GDPR, for dark patterns are widely used. Moreover, the EDPB is also thinking about the future of how people interact with technology and the evolving online world by including “voice-controlled interfaces (e.g. used for smart speakers) or gesture-based interfaces (e.g. used in virtual reality)” in how it defines “interface.” Regardless of the means, the Board asserted that many features of the “attention economy” in which people in many nations now live can exceed the bounds of the GDPR through their use of dark patterns.

The EDPB offers a working definition of “dark patterns”: “interfaces and user experiences implemented on social media platforms that lead users into making unintended, unwilling and potentially harmful decisions in regards of their personal data.” The EDPB noted that dark patterns can interfere with and adversely influence how a person grants consent. Moreover, the EDPB asserted that not only can dark patterns violate data protection law, but they may also run afoul of consumer protection laws, for the two realms often overlap.

As mentioned, the EDPB divides dark patterns into the following categories “according to their effects on users’ behaviour”:

§ Overloading: users are confronted with an avalanche/large quantity of requests, information, options or possibilities in order to prompt them to share more data or unintentionally allow personal data processing against the expectations of the data subject.

§ Skipping: designing the interface or user experience in a way that the users forget or do not think about all or some of the data protection aspects.

§ Stirring: affects the choice users would make by appealing to their emotions or using visual nudges.

§ Hindering: an obstruction or blocking of users in their process of getting informed or managing their data by making the action hard or impossible to achieve.

§ Fickle: the design of the interface is inconsistent and not clear, making it hard for users to navigate the different data protection control tools and to understand the purpose of the processing.

§ Left in the dark: an interface is designed in a way to hide information or data protection control tools or to leave users unsure of how their data is processed and what kind of control they might have over it regarding the exercise of their rights.

The EDPB continued: “dark patterns can also be divided into content-based patterns and interface-based patterns to more specifically address aspects of the user interface or user experience:

§  Content-based patterns refer to the actual content and therefore also to the wording and context of the sentences and information components. In addition, however, there are also components that have a direct influence on the perception of these factors.

§  Interface-based patterns are related to the ways of displaying the content, navigating through it or interacting with it.”
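To make the taxonomy a bit more concrete for those building interfaces, here is a minimal TypeScript sketch of how a designer might self-audit a consent dialog against several of these categories. The data model, the threshold, and the mapping from design traits to categories are my own illustration and appear nowhere in the guidelines.

```typescript
// Hypothetical model of a consent dialog's design choices. The field names
// and the mapping to the EDPB's categories are illustrative only.
interface ConsentDialogDesign {
  optionCount: number;                   // how many toggles/choices the user faces at once
  declineRequiresExtraSteps: boolean;    // "decline" buried behind additional clicks
  usesEmotionalFraming: boolean;         // e.g. guilt-laden copy on the refuse button
  settingsLabelledConsistently: boolean; // same wording for the same control everywhere
  processingPurposesShown: boolean;      // purposes of processing visible before choosing
}

// Flag traits that plausibly fall under the Board's categories.
function flagPotentialDarkPatterns(d: ConsentDialogDesign): string[] {
  const flags: string[] = [];
  if (d.optionCount > 10) flags.push("Overloading: too many simultaneous choices");
  if (d.declineRequiresExtraSteps) flags.push("Hindering: refusing is harder than accepting");
  if (d.usesEmotionalFraming) flags.push("Stirring: emotional nudges steer the choice");
  if (!d.settingsLabelledConsistently) flags.push("Fickle: inconsistent, unclear controls");
  if (!d.processingPurposesShown) flags.push("Left in the dark: purposes of processing hidden");
  return flags;
}

// Example: a dialog that buries "decline" and hides the purposes of processing.
console.log(flagPotentialDarkPatterns({
  optionCount: 24,
  declineRequiresExtraSteps: true,
  usesEmotionalFraming: false,
  settingsLabelledConsistently: true,
  processingPurposesShown: false,
}));
```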

And, lest it be forgotten, the EDPB stressed that social media companies have heightened responsibilities under the GDPR regarding children. The Board stated “[i]t is essential to keep in mind that dark patterns raise additional concerns regarding potential impact on children” registering with a social media provider, namely the requirement that clear and plain language be used with children. Some dark patterns do not follow this dictate and may provide additional grounds for data protection authorities (DPA) to pursue enforcement actions.

In turning to how dark patterns may violate the GDPR, the EDPB looks to the regulation itself and also to its own guidelines construing the GDPR. The Board starts by noting Article 5(1)(a)[1] and its fairness requirement, which “requires that personal data shall not be processed in a way that is detrimental, discriminatory, unexpected or misleading to the data subject.” According to the EDPB, interface-based dark patterns can potentially be unfair if they have “insufficient or misleading information for the user.”

The EDPB next discusses how dark patterns may well violate other foundational principles of the GDPR: accountability, transparency, and the obligation of data protection by design. It is at this point that the EDPB mentions, and then declines to analyze in depth, the argument that dark patterns violate the consent requirements of the GDPR. It seems strange that the Board would not go deeper than a few sentences, and I can only speculate as to why. It may be that getting into consent gives rise to a colorable argument that sophisticated adults can rarely be fooled by dark patterns, and so consent given under those circumstances is still valid under the GDPR. It may also be that the EDPB does not feel it necessary to reach consent at all because dark patterns violate even more basic principles of the GDPR like fairness, accountability, and transparency.

The EDPB takes an interesting route in its discussion of accountability in noting that controllers must be able to show compliance with the GDPR. The Board proceeds to suggest ways a social media platform might demonstrate accountability in its practices and prove they are not dark patterns. For example, the EDPB recommends that “[t]he user interface and user journey can be used as a documentation tool to demonstrate that users, during their actions on the social media platform, have read and taken into account data protection information, have freely given their consent, have easily exercised their rights, etc.” The EDPB does much the same regarding transparency, stating that “making documentation on processing accessible or recordable could help provide accountability.” Finally, with respect to data protection by design and default, the EDPB relied on its construction of Article 25 of the GDPR:

In the context of the Guidelines 4/2019 on Article 25 Data Protection by Design and by Default, there are some key elements that controllers and processors have to take into account when implementing data protection by design regarding a social media platform. One of them is that with regard to the principle of fairness, the data processing information and options should be provided in an objective and neutral way, avoiding any deceptive or manipulative language or design.
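The Board’s suggestion that the user interface and user journey can themselves serve as documentation invites a simple implementation. Below is a minimal sketch of an append-only record of consent-related events that a controller could later produce to reconstruct a user’s journey; the class, event names, and fields are invented for illustration and are not prescribed by the guidelines.

```typescript
// A minimal sketch of the "user journey as documentation" idea: an append-only
// record of consent-related events. Event names and fields are hypothetical.
type ConsentEvent = {
  userId: string;
  action: "notice_displayed" | "consent_given" | "consent_withdrawn" | "right_exercised";
  detail: string;    // e.g. which purpose was consented to, or which right was invoked
  timestamp: string; // ISO 8601, so the sequence of the journey is reconstructable
};

class ConsentJourneyLog {
  private events: ConsentEvent[] = [];

  record(userId: string, action: ConsentEvent["action"], detail: string): void {
    this.events.push({ userId, action, detail, timestamp: new Date().toISOString() });
  }

  // Reconstruct one user's journey, e.g. in response to a supervisory authority's inquiry.
  journeyFor(userId: string): ConsentEvent[] {
    return this.events.filter((e) => e.userId === userId);
  }
}

// Example: demonstrate that the notice was displayed before consent was given.
const log = new ConsentJourneyLog();
log.record("user-42", "notice_displayed", "layered privacy notice, layer 1");
log.record("user-42", "consent_given", "personalised advertising");
console.log(log.journeyFor("user-42"));
```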

In terms of advice or best practices, the EDPB offered the following at the beginning of its discussion about dark patterns and the opening of social media accounts:

Social media providers need to make sure that they implement the principles under Article 5 GDPR properly when designing their interfaces. While transparency towards the data subjects is always essential, this is especially the case at the stage of creating an account with a social media platform. Due to their position as controller or processor, social media platforms should provide the information to users when signing up efficiently and succinctly, as well as clearly differentiated from other non-data protection related information. Part of the transparency obligations of the controllers is to inform users about their rights, one of which is to withdraw their consent at any time if consent is the applicable legal basis.

The Board offered this advice in the context of “staying informed on social media”:

The relationships just outlined become clear on the basis of Article 5 GDPR. Transparency and fairness are already systematically mentioned side by side in Article 5 (1) (a) GDPR, as one component determines the other. The fact that not only external but also internal transparency must exist is also made clear by the accountability requirement in Article 5 (2) GDPR. The most important part of internal transparency is the requirement to keep a record of processing activities under Article 30 GDPR. For external transparency, social media providers can provide a layered privacy notice to users, among other means of information. This need for comprehensibility and fair processing also results in the requirements of Article 12 (1) GDPR, which state that any information referred to in Articles 13 and 14 shall be provided in a concise, transparent, intelligible and easily accessible form, using clear and plain language. Consequently, the information content must be made available without obstacles. If the requirements of Article 12 are not met, there is no valid information within the meaning of Articles 13 and 14 GDPR. Thus, for effective control, controllers and processors can be held accountable, leading to effectiveness of the GDPR requirements in practice.
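The layered privacy notice the Board mentions as one means of external transparency is straightforward to model. The sketch below, with invented headings and body text, shows one way to structure notice layers so that the first layer stays concise and plain, in the spirit of Article 12(1), while fuller Article 13/14 information sits a click away.

```typescript
// One way to model a layered privacy notice of the kind the Board mentions:
// a short first layer with the essentials, expanding into fuller layers on
// demand. The structure and all text here are my own illustration.
interface NoticeLayer {
  heading: string;
  body: string;            // concise, plain-language text (Article 12(1) GDPR)
  nextLayer?: NoticeLayer; // progressively more detailed information
}

const privacyNotice: NoticeLayer = {
  heading: "How we use your data",
  body: "We process your profile data to show you content and ads. You can withdraw consent at any time.",
  nextLayer: {
    heading: "Purposes and legal bases",
    body: "Personalised advertising (consent, Art. 6(1)(a)); service provision (contract, Art. 6(1)(b)).",
    nextLayer: {
      heading: "Full privacy policy",
      body: "Complete Article 13/14 information: recipients, retention periods, your rights, and how to complain.",
    },
  },
};

// Walk the layers in order, as a user expanding "learn more" would see them.
for (let layer: NoticeLayer | undefined = privacyNotice; layer; layer = layer.nextLayer) {
  console.log(`${layer.heading}: ${layer.body}`);
}
```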

The Board explained how consent should work under the GDPR when a person is using a social media platform:

Social media platform users need to provide their respective consent during different parts of data processing activities, for example before receiving personalized advertisement. As already outlined in the EDPB Guidelines on Targeting of Social Media Users, consent can only be an appropriate legal basis if a data subject is offered control and genuine choice. In addition, according to Article 4 (11) GDPR, consent must be specific, informed and unambiguous. It is important to underline that the requirements for valid consent under the GDPR do not constitute an additional obligation, but are preconditions for lawful processing of users’ personal data. Moreover, when online marketing or online tracking methods are concerned, Directive 2002/58/EC (e-Privacy Directive) is applicable. However, the prerequisites for valid consent under the e-Privacy Directive are identical to the provisions related to consent in GDPR.

The EDPB said that not all of the seven rights under the GDPR (i.e., access, rectification, erasure, temporary restriction of processing, portability, objection to processing, and not being subject to decisions based on automated processing) will apply to every social media platform:

The EDPB underlines that not all of these rights will apply to every social media platform, depending on its legal basis and purposes of processing of personal data and types of services provided. The differences should be explained by the controller in accordance with Article 12 GDPR. This means that the information on applicable rights should be concise and clear to users, including why certain rights do not apply. Such an explanation could limit the amount of communication with users when they are trying to exercise some of them. The exercise of the right should be easy and accessible in accordance with Article 12 (2) GDPR and the reply should be given without undue delay as required per Article 12 (3). Similarly, the social media platform should explain why certain requests cannot be fulfilled and inform on the possibility to lodge a complaint to a designated supervisory authority as per Article 12 (4) GDPR. Thus, the following dark patterns may not be applicable to all of the rights mentioned above. The right to erasure is discussed in detail in the next chapter.
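The Board’s expectation that platforms explain which rights apply, and why others do not, could be captured in something as simple as a rights-to-explanation mapping. In the sketch below, the list of rights comes from the passage above, while the platform and the reasons given are hypothetical placeholders.

```typescript
// A sketch of the Board's point that a platform should tell users which GDPR
// rights apply and clearly explain why others do not (Article 12 GDPR).
const RIGHTS = [
  "access", "rectification", "erasure", "restriction of processing",
  "portability", "objection", "no solely automated decisions",
] as const;
type Right = (typeof RIGHTS)[number];

// For each right: either it applies on this hypothetical platform, or a
// plain-language reason why not. The reasons here are invented examples.
const rightsExplainer: Record<Right, true | string> = {
  access: true,
  rectification: true,
  erasure: true,
  "restriction of processing": true,
  portability: true,
  objection: "Not applicable: we do not process your data on the basis of legitimate interests.",
  "no solely automated decisions": "Not applicable: no decisions with legal or similar effects are fully automated.",
};

for (const right of RIGHTS) {
  const status = rightsExplainer[right];
  console.log(status === true ? `${right}: applies — see your settings page` : `${right}: ${status}`);
}
```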

Other Developments


United States President Joe Biden renewed warnings of possible Russian cyber attacks on U.S. entities “based on evolving intelligence that the Russian Government is exploring options for potential cyberattacks” and issued a factsheet titled “Act Now to Protect Against Potential Cyberattacks.”

The Supreme Court of the United States ruled that the “state secrets” privilege trumps a provision in the Foreign Intelligence Surveillance Act “that provides a procedure under which a trial-level court or other authority may consider the legality of electronic surveillance conducted under FISA and may thereafter order specified forms of relief.”

The Australian Communications and Media Authority (ACMA) issued “A report to government on the adequacy of digital platforms’ disinformation and news quality measures,” and Minister for Communications, Urban Infrastructure, Cities and the Arts Paul Fletcher MP announced that “[t]he Morrison Government will introduce legislation this year to combat harmful disinformation and misinformation online…[that] will provide the ACMA with new regulatory powers to hold big tech companies to account for harmful content on their platforms.”

United States (U.S.) Senators Thom Tillis (R-NC) and Patrick Leahy (D-VT) introduced the “SMART Copyright Act of 2022” (S. 3880) “bipartisan legislation that would hold tech accountable by developing effective, widely-available measures to combat copyright theft.”

Colorado Attorney General Phil Weiser invited informal comments on future rulemakings, which would include those regulations necessary to effectuate the “Colorado Privacy Act.”

The government of Australian Prime Minister Scott Morrison announced it “is investing a further $150 million into a range of measures to deliver on our commitment to ending family, domestic and sexual violence…[including] a $104 million technology-focused package to keep women and children safe and prevent devices being used to perpetrate or facilitate family, domestic and sexual violence.”

Maryland’s House of Delegates passed a biometric privacy bill 100-30, sending the legislation to the state Senate, which has its own bill.

Tweet of the Day

Further Reading


“The cyber warfare predicted in Ukraine may be yet to come” By Chris Krebs — Financial Times

“Why You Haven’t Heard About the Secret Cyberwar in Ukraine” By Thomas Rid — New York Times

“The secret police: Cops built a shadowy surveillance machine in Minnesota after George Floyd’s murder” By Tate Ryan-Mosley and Sam Richards — MIT Technology Review

“A professor found his exam questions posted online. He’s suing the students responsible for copyright infringement.” By Jaclyn Peiser — Washington Post

“China requires Microsoft's Bing to suspend auto-suggest feature” — Reuters

“Anti-vax conspiracy groups lean into pro-Kremlin propaganda in Ukraine” By Laura Kayali and Mark Scott — Politico EU

“Employment law still has roots in the Middle Ages. That’s terrible for workers.” By Matthew Scherer and Aiha Nguyen — Washington Post

“Australia launches program to curb stalkerware” By Tonya Riley — cyberscoop

“Rohingya refugees file lawsuit against Facebook over genocide” — DW

“The Supreme Court just made a US-EU Privacy Shield agreement even harder” By Patrick Toomey and Ashley Gorski — The Hill

“CISA’s Easterly calls on industry leaders to close gender gap” By Emma Vail — The Record

“Russia outlaws Facebook and Instagram after labeling Meta ‘extremist’” By Veronica Irwin — Protocol

“Google Settles With Four Engineers Over Complaint It Fired Them for Organizing” By Lauren Kaori Gurley — Vice

“Coal to crypto: The gold rush bringing bitcoin miners to Kentucky” By Avi Asher-Schapiro — Thomson Reuters

Coming Events


§  21 March

o The European Parliament’s Internal Market and Consumer Protection and Civil Liberties, Justice and Home Affairs committees will hold a joint hearing on the proposal for an Artificial Intelligence Act.

o Canada’s House of Commons Standing Committee on Access to Information, Privacy and Ethics will hold a hearing titled “Use and Impact of Facial Recognition Technology.”

§  22 March

o   The United Kingdom’s (UK) House of Lords Science and Technology Committee will hold a formal meeting (oral evidence session) as part of its inquiry into the Government’s plans to deliver a UK science and technology strategy.

o   The United Kingdom’s House of Commons General Committee will hold two formal meetings on the “Product Security and Telecommunications Infrastructure Bill,” “A Bill to make provision about the security of internet-connectable products and products capable of connecting to such products; to make provision about electronic communications infrastructure; and for connected purposes.”

o The United States (U.S.) Senate Commerce, Science, and Transportation Committee will mark up a number of bills:

§  The “Martha Wright-Reed Just and Reasonable Communications Act of 2021” (S. 1541) “To amend the Communications Act of 1934 to require the Federal Communications Commission to ensure just and reasonable charges for telephone and advanced communications services in correctional and detention facilities.”

§  The “Next Generation Telecommunications Act” (S. 3014) “To establish the Next Generation Telecommunications Council, and for other purposes.”

§  “Reese’s Law” (S. 3278) “To protect children and other consumers against hazards associated with the accidental ingestion of button cell or coin batteries by requiring the Consumer Product Safety Commission to promulgate a consumer product safety standard to require child-resistant closures on consumer products that use such batteries, and for other purposes.”

§  The “Low Power Protection Act” (S. 3405) “To require the Federal Communications Commission to issue a rule providing that certain low power television stations may be accorded primary status as Class A television licensees, and for other purposes.”

§  23 March

o   The United Kingdom’s House of Commons’ Science and Technology Committee will hold a formal meeting (oral evidence session) in its inquiry on “The right to privacy: digital data.”

o   The United States (U.S.) Senate Commerce, Science, and Transportation Committee will hold a hearing that “will examine the correlation between American competitiveness and semiconductors; the impact of vulnerabilities in our semiconductor supply chains; and the importance of CHIPS legislation within the U.S. Innovation and Competition Act (USICA) of 2021 and the America COMPETES Act of 2022.”

§  24 March

o The United Kingdom’s (UK) House of Lords Fraud Act 2006 and Digital Fraud Committee will hold a formal meeting (oral evidence session) regarding “what measures should be taken to tackle the increase in cases of fraud.”

o   The United Kingdom’s House of Commons General Committee will hold two formal meetings on the “Product Security and Telecommunications Infrastructure Bill,” “A Bill to make provision about the security of internet-connectable products and products capable of connecting to such products; to make provision about electronic communications infrastructure; and for connected purposes.”

§  29-30 March

o The California Privacy Protection Agency Board will be holding “public informational sessions.”

§  31 March

o The United Kingdom’s (UK) House of Lords Fraud Act 2006 and Digital Fraud Committee will hold a formal meeting (oral evidence session) regarding “what measures should be taken to tackle the increase in cases of fraud.”

§  6 April

o   The European Data Protection Board will hold a plenary meeting.

§  15-16 May

o   The United States-European Union Trade and Technology Council will reportedly meet in France.

§  16-17 June

o   The European Data Protection Supervisor will hold a conference titled “The future of data protection: effective enforcement in the digital world.”



[1] Personal data shall be:

(a) processed lawfully, fairly and in a transparent manner in relation to the data subject (‘lawfulness, fairness and transparency’);

A newsletter written by a former lobbyist and lawyer on tech policy, law, and politics.