How Privacy Laws are Combatting the Growing Threat of Dark Patterns




Introduction to Dark Patterns

        Dark patterns have come under increasing scrutiny from privacy regulators over the past year. Dark patterns are design tactics used to trick or manipulate users into making purchases or providing personal information. Common examples used by companies today include misleading or disguised advertisements, difficult-to-cancel subscriptions or charges, key terms and junk fees buried in dense text, and interfaces that trick users into sharing their data. Their use is widespread: a 2019 study found that over half of 5,000 privacy notifications sent by companies in Europe used dark patterns, and only 4.2% gave users a genuine choice to consent to the collection of their personal data. Moreover, it has been revealed that those impacted by dark patterns increasingly include minors, such as through children's apps and games, which, along with the expanding usage and scope of dark patterns, has prompted a harsher crackdown by regulatory agencies such as the FTC.

Why do dark patterns matter? 

        The main issue with dark patterns is that they manipulate consumers into making choices that serve the company's interest rather than their own. In particular, the lack of informed consent (where consumers are made fully aware of the reasoning behind and consequences of their choice) is what causes dark patterns to violate most privacy laws and legislation against unfair practices. Not every use of dark patterns is necessarily illegal, depending on the specific tactic used and the laws or regulations that apply; however, given the increasing scope and development of privacy laws, such practices are likely to violate the principles of informed consent or prohibitions on unfair practices and thus be considered a breach of those laws.

What are some current examples? 

        The latest example, Epic Games' $520M settlement for dark patterns and children's privacy violations, illustrates how these exploitative practices harm consumers both financially and in terms of their privacy rights. $245M of the $520M settlement is intended to refund customers for deceptive design tricks that duped players into making unintentional purchases. These deceptive designs by Epic Games also made it difficult or confusing to undo purchases, request refunds, or request that personal information not be collected.

        The remaining $275M was for violating the US Children's Online Privacy Protection Act (COPPA), which applies to sites that target or collect personal information from children under 13. After ruling that COPPA applied to Epic Games' Fortnite (due to factors such as children constituting a significant portion of the player base, and 'Fortnite-branded merchandise' licenses owned by Epic Games), the FTC alleged that COPPA was violated because default settings lacked parental controls and put children at risk (of bullying, threats, and sexual harassment) through voice and text chat. These involved "privacy-invasive default settings and deceptive interfaces that tricked Fortnite users, including teenagers and children," according to FTC Chair Lina M. Khan. In the settlement, Epic Games agreed to delete all personal information of minors unless parental consent is given, and to change its default settings for voice and text communications from opt-out to opt-in.

How do existing laws deal with dark patterns? 

        In the US, legislation already exists that prohibits the use of dark patterns, although the methods and severity vary. For instance, the California Privacy Rights Act (CPRA) and the Colorado Privacy Act (CPA) both state that consent obtained via dark patterns does not constitute proper, valid consent, while the Virginia Consumer Data Protection Act (VCDPA) does not mention dark patterns by name and merely defines consent, which could be argued to be violated indirectly if obtained through dark patterns. Additionally, in a September 2022 report, the FTC reiterated its commitment to promoting awareness of dark patterns and cracking down on companies that abuse them. It is dark patterns' nature as unfair and deceptive practices (as stated in the FTC Act) that brings them under the FTC's regulatory purview, along with the Restore Online Shoppers' Confidence Act (ROSCA). In the EU, the European Data Protection Board (EDPB) has adopted Guidelines on dark patterns that detail how dark patterns may breach GDPR requirements and thus subject companies to regulation and possible fines for violating the GDPR.

        In comparison, Canadian law does not specifically mention dark patterns, as existing privacy legislation is rather outdated and a new privacy law (Bill C-27) is currently under review. Instead, Canada relies on requirements for meaningful consent for the collection, use, and disclosure of personal information to protect consumers against dark patterns. Bill C-27 may provide more direct protection: section 16 of the bill states that an organization cannot obtain consent through "deceptive or misleading practices," which could likely be applied to dark patterns. Similarly, Canada's Anti-Spam Legislation (CASL) prohibits false and misleading statements that promote a business interest or product, which could indirectly be applied to some types of dark patterns. However, despite the potential of meaningful-consent principles to combat dark patterns, Canada currently lacks a specific prohibition of dark patterns, and existing legislation cannot be easily or directly applied against them.

        Korea appears to be moving in a similar direction. Korea's Personal Information Protection Commission (PIPC) recently published the 'Personal Information Protection Investigation Promotion Direction in 2023' on January 11th, which states that its goal for the upcoming year is to create a trustworthy "digital ecosystem" through preemptive and preventive inspections that protect and promote privacy rights. Dark patterns are specifically mentioned by the PIPC among the areas of focus for inspection and investigation, alongside a focus on protecting children's personal information. Although no specific details are available yet, it appears that Korea will follow the approach of the EDPB and the US FTC in employing privacy laws to combat dark pattern usage.

Conclusion

        Overall, a common theme is an emphasis on the principle of 'informed, meaningful consent', with some legislation specifically addressing and prohibiting dark patterns while other laws provide a more general prohibition against deceptive practices that can be applied to them. In the absence of specific legislation, the duty falls on regulatory agencies such as the EU's data protection authorities, the FTC, and the PIPC to investigate and, if necessary, pursue legal action against companies that use dark patterns to exploit consumers, so that significant financial penalties (as in the Epic Games case) may deter other companies. Thus it appears that, in order to further protect consumer and privacy rights, more direct legislation and stronger enforcement powers for regulatory agencies are needed, especially in countries like Canada where regulators lack the enforcement power wielded by the FTC or by EU authorities under the GDPR.

