Abusive by Design: Part 3. Dark Patterns v. the GDPR
Ever pressed 'Agree' without even trying to read the description, in a hurry to access a website? Congratulations, you've been caught by so-called 'dark patterns'. Don't worry - everyone has. Every so often, websites are flavored with nudges pushing the user to agree and spend more, and it seems there isn't much we can do about it. Or is there?
In the two previous articles (Part 1 & Part 2), I contemplated the rationale for design regulation and the many ways user consent can be spoiled. In this article, I would like to wrap up with an equally interesting question: how do website owners increase conversion rates with various cognitive tricks, known as dark patterns, and are these tricks lawful under EU privacy law?
What Are Dark Patterns?
Dark patterns do not yet have a legal definition, but the idea is straightforward. According to the website darkpatterns.org, dark patterns are tricks used on websites and apps that make you do things you didn't mean to, like buying or signing up for something. Ranging from pre-ticked checkboxes to guilt-tripping the user, these are design features that nudge the user toward a desired decision. For example, a customer is more likely to press a big, green, underlined 'Agree' button than a small grey 'I don't want to receive interesting content' button.
As I stated earlier, the European GDPR requirements do regulate how a service's design communicates with its users. Here lies the main controversy between the GDPR and dark patterns:
While dark patterns and UX in general usually nudge the user to share more data without thinking, the privacy-friendly design should empower the user to make informed decisions.
According to Daniel Kahneman's famous book 'Thinking, Fast and Slow', it is easier to agree to something, or spend on it, when you think little or have limited information about the issue. The rationality of a decision largely depends on the time we spend thinking and on the quality of the information we base our judgment on. Dark patterns try to trick us by providing only a limited amount of information or by nudging us toward immediate decisions. Both the 'informed' and the 'decision' parts become blurred, all for the sake of user convenience.
As a result, privacy is at odds with many widespread design practices. In its report on design and privacy, the CNIL, the French Data Protection Authority, highlights four major groups of dark patterns that affect user privacy:
1. Pushing to share more data;
2. Influencing consent;
3. Making data actions difficult; and
4. Diverting the attention of the user.
All of them try to trick the user, but that doesn't necessarily mean they are unlawful. By using dark patterns, a business pursues its own interests - and these interests can be legitimate.
Lawfulness will depend on whether the feature violates the 'privacy by design' principle, which the GDPR details through the following requirements:
1. Lawfulness, Fairness, and Transparency - does the design clearly communicate the service's data collection practices? Does the user have a choice whenever he/she is supposed to have one?
2. Data Minimization - is the user asked to submit only the minimum of information necessary for providing the online service?
3. Privacy by Default - does the user's information remain unused for sharing with partners, marketing emails, or personalized ads until the user decides otherwise?
4. Balance of the company's legitimate interests and user rights - does the user have reasonable expectations about the company's data practices? Is it fair to him/her (an awfully vague criterion)?
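The 'privacy by default' requirement above can be sketched in code. This is a minimal, hypothetical model (the class and setting names are my own illustration, not taken from the GDPR or any regulator): every data-sharing option starts switched off, and only an explicit user action can turn it on.

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Hypothetical account settings illustrating 'privacy by default':
    every sharing option starts disabled; only the user can enable it."""
    share_with_partners: bool = False   # no partner sharing by default
    marketing_emails: bool = False      # no marketing until opted in
    personalized_ads: bool = False      # no ad personalization by default

    def opt_in(self, option: str) -> None:
        # Only an active user choice flips a setting on; the service never does.
        if not hasattr(self, option):
            raise ValueError(f"unknown setting: {option}")
        setattr(self, option, True)

settings = PrivacySettings()
assert not settings.personalized_ads    # privacy-friendly out of the box
settings.opt_in("marketing_emails")     # an explicit, active user choice
```

A 'default sharing' dark pattern would be the opposite: the same flags preset to True, forcing the user to hunt for the opt-out.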
Let's see which of the dark patterns reviewed by the CNIL could be considered illegal, and which are legitimate business practices. I have regrouped all the proposed techniques into three categories:
- Dangerous practices – the ones that most probably violate the GDPR requirements and thus must be avoided;
- Mixed practices – mostly depend on additional details, such as whether the data is being collected or who would get access to it;
- Non-privacy-violating practices – those that, in my opinion, will not violate data protection requirements, namely the GDPR, though they can still be unethical or violate other laws (e.g., consumer protection).
All of the assessments are of the author’s personal opinion and none of them constitute legal advice nor official positions of regulators.
- Confidence abuse (pushing to share more data) – "Data is stored securely", "It is just between you and us". Mind your language, sir. Is the company 100% sure it doesn't share the data with its channel partners or clients? Not even with a cloud provider or an email notification provider? In the SaaS-based economy, it is extremely difficult to keep data within one entity. Statements like 'between you and us' can deceive the user and violate the transparency principle by making one think the data is not shared with anyone;
- Default sharing (pushing to share more data) – many services, such as Google or Facebook, have pre-set user settings that allow sharing or selling user information. The user has to take an active step to opt out of these privacy intrusions. Such a feature violates the 'privacy by default' principle.
Default sharing. Source: wired.com
- Trick Questions (influencing consent). Example: Do you not want your information not to be shared to provide you with exceptional personalization?
  - Yes, I don't want;
  - No, I do want.
What would you choose? The wording matters: the GDPR requires communicating with users in a clear, plain, and intelligible form. Some diligent regulators, such as the French one, may consider trick questions a breach of the transparency requirements;
- Making the consequences of different choices vague and unclear (influencing consent). The GDPR directly states that the company must inform the user about all the purposes and consequences of the user's choice. Playing with wordings such as 'we might share' or 'we reserve the right to use' can lead to a breach of this obligation;
- Silently changing the terms of sharing personal data (influencing consent). This is a clear violation of the GDPR requirements, since the user must have information about, and a choice over, the new Terms before they start to apply to one's account. That is why proper notification of Privacy Notice updates is important;
- Cookie wall (making data actions more difficult). Before you can proceed with browsing the website, you must agree to all cookies. This leaves the user with no choice, which violates the notion of 'freely given' consent. The illegality of cookie walls has been confirmed twice, by the British and French regulators in their guidelines;
- Complicating the adjustment of privacy settings (making data actions more difficult). As confirmed by the Google case mentioned above, if the regulator proves that the company intentionally hides privacy settings from the user, the practice will constitute a transparency violation.
How many steps does it take to consent to cookies?
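The 'silently changing the terms' scenario described above can also be sketched. In this hypothetical model (all names are my own illustration), consent is recorded against a specific Privacy Notice version, so swapping in new terms can never silently reuse old consent: a version bump invalidates it until the user is notified and asked again.

```python
class ConsentLedger:
    """Illustrative sketch: consent is only valid for the exact
    Privacy Notice version the user actually agreed to."""

    def __init__(self, notice_version: str):
        self.notice_version = notice_version
        self._consents = {}  # user id -> notice version consented to

    def record_consent(self, user_id: str) -> None:
        self._consents[user_id] = self.notice_version

    def publish_new_notice(self, new_version: str) -> None:
        # New terms apply only after users are notified and re-asked.
        self.notice_version = new_version

    def has_valid_consent(self, user_id: str) -> bool:
        return self._consents.get(user_id) == self.notice_version

ledger = ConsentLedger("v1.0")
ledger.record_consent("alice")
assert ledger.has_valid_consent("alice")
ledger.publish_new_notice("v2.0")             # terms changed
assert not ledger.has_valid_consent("alice")  # old consent no longer counts
```

The design choice is simply that consent is a fact about a document version, not a permanent flag on the account.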
- Continue reading the article by providing your email address. Is it really necessary to obtain a user's email address to give them access to an article? Unless you intend to send a copy to the user's email, the 'email' requirement can be at odds with the data minimization, fairness, and lawfulness principles;
- Attention diversion (a green 'continue' button v. a grey 'no' button). At first sight, this violates neither the transparency nor the fairness principle. Not all countries agree, however: in its cookie guidance, the British regulator (ICO) flagged such practices as undue influence on the user's choice, thus violating the lawfulness principle and, more specifically, the consent requirements;
- Bait and change – the user's choice produces a different effect than expected. For example, giving an 'acceptance' effect to a button with a cross (close/exit), which in users' minds is synonymous with 'close and move on'. This method was notably used by Microsoft to 'encourage' users of the previous version of its Windows OS to switch to Windows 10. The same trick can be used for obtaining cookie/spam consent, which would lead to transparency and consent violations;
- Chameleon strategy – masking third-party services behind the same design (e.g., via an API). To avoid GDPR violations, the company should clearly mark such services as 'third-party' and, ideally, provide a link to the third party's Privacy Notice;
- Safety blackmail – update your personal details for security reasons. This one probably won't violate the legal framework; moreover, such a request can be in line with the GDPR's accuracy principle, which requires keeping information accurate and up to date. Even if the end purpose is updating the marketing database, the practice remains within the company's legitimate interests as expected by the user;
- Asking for more data to 'improve the experience' – while this can be a trick to push users into sharing data, it does not constitute a GDPR violation, as long as all purposes are clearly articulated to the user;
- Last-minute consent – when the company asks for consent as the last step before granting a reward (a free article, file conversion, lottery, etc.). There will be no violation of the GDPR – just make sure consent is obtained before the data collection;
- Graphic signal – using standard icons (e.g., a padlock) for nudging. Such icons may be used to emphasize important information or make a certain choice more attractive to the user. As long as the icons don't mislead the user, they will not violate, but rather complement, the company's privacy efforts;
- Blaming the individual for not providing the data. While it is increasingly popular to claim that personal data is the lifeblood of a site, such statements will not violate data protection requirements;
- Repetitive requests for sharing the data. Nobody prohibits asking the user to reconsider their choice – a company may ask for consent as many times as it wants. Though there is no violation, bombarding the user with repeated requests will hurt the user experience, which could decrease conversion rates;
- Camouflaged advertising – when a company places advertising, e.g., on a blog, without marking it as an ad. It is not a violation of the GDPR unless personal data is collected through such an ad.
Do Dark Patterns Work?
Dark patterns affect our comprehension both in the short and the long term. However, it is unknown exactly how they affect user trust, since no studies of the long-run effects have been conducted.
Some designers and end users report that the breach of trust in services that use dark patterns can lead to a bad reputation and to users switching to competitors. In the long term, a site can end up with fewer weekly active users and less total revenue due to the corrosive effects of dark patterns.
Needless to say, design does affect our choices, and this power carries a responsibility not to cross the line. A deeper analysis of the legality of dark patterns, however, leads to a mixed conclusion: although most dark patterns are unethical toward the end user, not all of them are illegal.
Moreover, nudging can and should be used for the good of individuals, empowering them to take control over their data. To be privacy compliant, a website should meaningfully inform the user and guide them through the choices, rather than puzzle them and leave them with no choice at all.
I hope this series of articles has shed some light on the topic of design regulation and its intersection with the right to privacy.
Stay tuned for more in 2020.
P.S. Isn’t it funny that even the button’s colour can bear legal consequences?
Disclaimer: the information in this article is provided for informational purposes only. You should not construe any such information as legal, tax, investment, trading, financial, or other advice.
Vlad Nekrutenko, CIPP/E
Privacy Lawyer at Legal Nodes