
The Issue of Dark Patterns in Digital Platforms: The Challenge for Indonesia’s Consumer Protection Law

Published online by Cambridge University Press:  07 January 2025

Adis Nur Hayati*
Affiliation:
The National Research and Innovation Agency (BRIN), Jakarta, Indonesia

Abstract

Dark patterns have become a pervasive issue around the world. However, Indonesia has not taken any action to deal with the situation. This article aims to analyse the issue of dark patterns in digital platforms from Indonesia’s consumer protection law perspective. It contends that Indonesia’s consumer protection law is not adequate in overcoming the issue of dark patterns in digital platforms. Several laws regulate basic principles and provisions that indirectly prohibit dark patterns, but these regulations can only be applied to certain forms of dark patterns and are difficult to implement. The article suggests that Indonesia must specifically regulate the prohibition of the practices of dark patterns in its current law. The regulations do not have to define the term “dark patterns” directly; however, they should expressly regulate prohibitions against the practice of using design interfaces in a way that subverts or impairs consumers’ autonomy and/or exploits consumers’ behavioural biases.

Type
Research Article
Copyright
© The Author(s), 2025. Published by Cambridge University Press on behalf of Asian Journal of Law and Society

1. Introduction

Dark patterns have become an important issue faced by many countries around the world. In general, dark patterns can be understood as user interface design choices that benefit an online service by coercing, manipulating, and/or deceiving consumers into making unintended decisions, which are often not in their best interests and which they probably would not make if fully informed and/or capable of selecting alternatives (OECD, 2022, p. 8; Mathur et al., Reference Mathur, Acar, Friedman, Lucherini, Mayer, Chetty and Narayanan2019, p. 2). There are many types of dark patterns, such as fake countdown timers, disguised ads, hidden costs, and forced registration. Dark patterns are an issue because they undermine individual autonomy in making decisions (King and Stephan, Reference King and Stephan2021, p. 259) and because they can potentially cause detriment to consumers, such as manipulating purchase decisions, misleading and/or tricking consumers into giving out their personal data, and stimulating compulsive and addictive behaviours that increase time spent on a service (Bongard-Blanchy et al., Reference Bongard-Blanchy, Rossi, Rivas, Doublet, Koening and Lenzini2021, p. 763). Research has shown evidence of the prevalence of dark patterns on digital platforms. Mathur et al., for example, found that around 11.1% of 11,000 popular e-commerce websites featured dark patterns (Mathur et al., Reference Mathur, Acar, Friedman, Lucherini, Mayer, Chetty and Narayanan2019, p. 2). Carol Moser et al. found that the top 200 e-commerce websites in the United States (US) use multiple features that stimulate “impulse buying,” which can also be perceived as dark patterns (Moser, Schoenebeck and Resnick, Reference Moser, Schoenebeck and Resnick2019, p. 1). Further, the European Commission found that 97% of the most popular websites and apps used by European Union (EU) consumers deployed at least one dark pattern (European Commission, 2022a, p. 6).

Many countries are taking action to overcome or regulate the issue of dark patterns. Indonesia, however, has not yet fully engaged with the issue or taken any action to deal with the situation. In the US, for example, California in 2020 enacted the California Privacy Rights Act (CPRA), the first legislation explicitly defining dark patterns in relation to privacy consent (it took effect on 1 January 2023) (King and Stephan, Reference King and Stephan2021, p. 252). In 2022, the EU adopted the Digital Services Act (DSA), which also contains regulations and definitions related to dark patterns (European Commission, 2022b). In Indonesia, on the other hand, no regulations explicitly address dark patterns. The issue seems not to have gained the recognition or attention of the government, even though it causes harm to consumers. Against this background, this article aims to analyse the issue of dark patterns in digital platforms from the perspective of Indonesia’s consumer protection law.

This article contends that Indonesia’s consumer protection law is not adequate to overcome the issue of dark patterns in digital platforms. Several laws regulate basic principles and provisions that indirectly prohibit dark pattern practices, but these regulations can only be applied to certain forms of dark patterns and are difficult to implement. There are two main reasons for this difficulty: (1) the construction of the regulations in the acts is very general in nature, so their application depends heavily on the interpretation of judges or law enforcement agencies, and (2) it is difficult to prove the detriments experienced by consumers due to dark patterns created by business actors. The article therefore suggests that Indonesia must specifically regulate the prohibition of dark pattern practices in its current law. Learning from other countries, the regulations do not have to define the term “dark patterns” directly; however, they should expressly prohibit the practice of using design interfaces in a way that subverts or impairs consumers’ autonomy and/or exploits consumers’ behavioural biases. Furthermore, the formulation of regulations prohibiting dark pattern practices must focus more on the actions that are prohibited rather than emphasising the element of the detriments incurred. To develop this argument, the article proceeds in four parts. Part II explains the nature of dark patterns and how they can potentially harm consumers. Part III analyses how Indonesia’s consumer protection law approaches the issue of dark patterns in digital platforms. Part IV discusses the practices of other countries, such as the US, the EU, and India, in regulating dark patterns and how they can inform Indonesia’s approach, and Part V concludes the article.

2. Understanding dark patterns and how they may cause harm to consumers

Dark patterns are pervasive and frequently appear on digital platforms, such as e-commerce websites, smartphone apps, and online games (Mathur et al., Reference Mathur, Acar, Friedman, Lucherini, Mayer, Chetty and Narayanan2019, p. 2). Consumers probably encounter these practices often, but they may not realise that such business practices are called dark patterns. Suppose we want to unsubscribe from a service, but the procedure is deliberately difficult to carry out; or we see a discount with a countdown timer, or information about recent sales activity, that subconsciously pushes us to purchase in a hurry for fear of missing out on the opportunity; or, when purchasing a flight ticket online, we have to remember to actively untick the pre-ticked checkbox for travel insurance if we do not want it (Laguri and Strahilevitz, Reference Laguri and Strahilevitz2021, pp. 48–51; Brenncke, Reference Brenncke2023, p. 12). These practices are all examples of the use of dark patterns on digital platforms.
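The pre-ticked insurance checkbox described above can be reduced to a toy sketch. All names and the fee below are illustrative assumptions for exposition, not taken from any real platform or from this article:

```python
# Toy sketch of a pre-ticked "opt-out" travel-insurance checkbox.
# The dark-pattern default: insurance is ADDED unless the consumer
# actively unticks the box.

def checkout_total(base_fare: float, unticked_insurance: bool = False,
                   insurance_fee: float = 15.0) -> float:
    """Return the amount charged at checkout (hypothetical values)."""
    if unticked_insurance:
        return base_fare            # attentive consumer avoids the charge
    return base_fare + insurance_fee  # inattentive consumer pays extra

# A consumer who overlooks the pre-ticked box pays for insurance:
print(checkout_total(100.0))                           # 115.0
# Only a consumer who notices and unticks the box pays the base fare:
print(checkout_total(100.0, unticked_insurance=True))  # 100.0
```

The point of the sketch is that the *default* carries the manipulation: the outcome differs not because preferences differ, but because one consumer failed to act against a pre-set choice.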

Up to this moment, there is still no single universal definition of the term dark patterns, because there is a variety of practices and views regarding what counts as a dark pattern (OECD, 2022, p. 8). The term “dark patterns” was first introduced by Harry Brignull, a user experience (UX) designer, in 2010. He defined dark patterns as “tricks used in websites and apps that make you do things that you didn’t mean to, like buying or signing up for something” (Brignull, no date). This initial definition was developed further by other researchers. Gray et al., for example, defined dark patterns as situations or instances “where designers use their knowledge of human behavior (e.g. psychology) and the desires of end users to implement deceptive functionality that is not in the user’s best interest” (Gray et al., Reference Gray, Kou, Battles, Hoggatt and Toombs2018, p. 1). Mathur et al. explained dark patterns as “user interface design choices that benefit an online service by coercing, steering, or deceiving users into making decisions that, if fully informed and capable of selecting alternatives, they might not make” (Mathur et al., Reference Mathur, Acar, Friedman, Lucherini, Mayer, Chetty and Narayanan2019, p. 2). Meanwhile, Jamie Luguri and Lior Jacob Strahilevitz described dark patterns as “user interfaces whose designers knowingly confuse users, make it difficult for users to express their actual preferences, or manipulate users into taking certain actions” (Laguri and Strahilevitz, Reference Laguri and Strahilevitz2021, p. 44). From these definitions, it can be concluded that the term dark patterns refers to business practices that use user interfaces to trick, manipulate, coerce, steer, or deceive consumers into making decisions that are not in their best interests and that they might not make if fully informed.

Dark patterns “nudge” consumers by relying on cognitive mechanisms and subliminally exploiting specific biases or heuristics (OECD, 2022, p. 9). According to Thaler and Sunstein, a “nudge” is “any aspect of the choice architecture that alters people’s behavior in a predictable way without forbidding any options or significantly changing their economic incentives” (Thaler and Sunstein, Reference Thaler and Sunstein2008, p. 6). “Nudge” is basically a behavioural economics concept that explains how subtle changes in the environment can affect the outcomes of someone’s decision-making process (Özdemir, 2020, p. 4). For example, it has been found that when people enter a canteen, they usually choose the first menu offered or the menu that is easiest to reach; organisations use default options such as automatically renewing subscriptions because consumers usually do not notice them and thus remain subscribed for a long time (Thaler and Sunstein, Reference Thaler and Sunstein2008, pp. 1, 35); and supermarkets deliberately place items with the highest markup where they are easiest to see (e.g. at eye level) to “nudge” shoppers into making unplanned purchases (Weinmann, Schneider and Brocke, Reference Weinmann, Schneider and Brocke2019, p. 2). These cases show how a simple change or setting in the environment, such as the placement of a menu in a canteen, the default subscription option in a service, or the placement of products in a supermarket, can have immense effects on consumers’ decisions.

Nudging works because people do not always behave rationally (Weinmann, Schneider and Brocke, Reference Weinmann, Schneider and Brocke2019, p. 2). Research in psychology has shown that humans act in a boundedly rational manner due to cognitive limits, and that various heuristics and biases affect their decision-making. People’s choices and decisions can be influenced by even the smallest changes in their decision-making environment. For this reason, presenting choices in a certain way can be used to modify people’s behaviour in a predictable way (Narayanan et al., Reference Narayanan, Mathur, Chetty and Kshirsagar2020, p. 5; Weinmann, Schneider and Brocke, Reference Weinmann, Schneider and Brocke2019, p. 2). As a note, the concept of “nudge” offered by Thaler and Sunstein focuses on its beneficial uses to help people make better choices that can improve individual and social welfare. When nudging is instead used for goals that make wise decision-making and prosocial activity more difficult, they call it “sludge” (Thaler, Reference Thaler2018, p. 431). Researchers have also used the terms “evil” or “dark” nudges to describe nudging that makes it easier to make choices against one’s best interests (OECD, 2022, p. 9).

In the context of dark pattern practices, the “choice architect” (the person who creates the decision environment) designs the user interface elements of digital platforms to nudge consumers into making choices or inputs in the business actor’s desired direction. For example, some websites use a “high-demand message” interface design, such as “only 1 left” or “last booked 1 hour ago,” to indicate that a product or service is in high demand and likely to sell out soon (Bongard-Blanchy et al., Reference Bongard-Blanchy, Rossi, Rivas, Doublet, Koening and Lenzini2021, p. 768). This unconsciously urges consumers to make a purchase in a hurry for fear of missing out on the opportunity. Figure 1 shows an example from a travel website in Indonesia that uses a “high-demand message.”

Figure 1. “High-demand message”
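As a hedged illustration of how little machinery such a “high-demand message” requires, the sketch below caps the displayed stock regardless of the real inventory. The class and function names are invented for this example and do not describe any actual platform:

```python
from dataclasses import dataclass


@dataclass
class Listing:
    name: str
    actual_stock: int  # the real inventory level held by the seller


def high_demand_banner(item: Listing) -> str:
    """A 'high-demand message' dark pattern: whatever the real stock,
    the interface claims scarcity to rush the purchase decision."""
    displayed = min(item.actual_stock, 2)  # never admit ample supply
    return f"Only {displayed} left! Last booked 1 hour ago"


room = Listing("Deluxe Room", actual_stock=47)
print(high_demand_banner(room))  # Only 2 left! Last booked 1 hour ago
```

The banner is not technically false about demand, yet it systematically presents abundance as scarcity, which is exactly the grey zone between persuasion and deception discussed later in this article.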

Not only are there dark patterns that urge consumers to purchase products or services, but there are also dark patterns designed to nudge consumers into contractual agreements that they probably would not otherwise have desired. These strategies appear to be used by a range of e-commerce companies, from well-funded platforms to start-up apps (Laguri and Strahilevitz, Reference Laguri and Strahilevitz2021, p. 58). Because there is a variety of practices and views regarding what counts as a dark pattern, efforts to define or categorise them are difficult, as their forms and practices are constantly developing. However, referring to Arunesh Mathur et al.’s research, dark patterns can at least be categorised into the following types (Mathur et al., Reference Mathur, Acar, Friedman, Lucherini, Mayer, Chetty and Narayanan2019, p. 12) (Table 1):

Table 1. Types of dark patterns (Mathur et al., Reference Mathur, Acar, Friedman, Lucherini, Mayer, Chetty and Narayanan2019, p. 12)

Dark patterns become a concern because they undermine consumers’ autonomy in making decisions. According to Arunesh Mathur et al., autonomy is “the normative value that users have the right to act on their own reasons when making decisions” (Mathur, Mayer and Kshirsagar, Reference Mathur, Mayer and Kshirsagar2021, p. 12). The idea of autonomy is based on the view that individuals can (mostly) reason through the various possibilities they confront, that they can (mostly and roughly) know what they believe and desire, and that they can act for the reasons that seem most appropriate to them (Susser, Roessler and Nissenbaum, Reference Susser, Roessler and Nissenbaum2019, pp. 35–36). Dark patterns weaken consumer autonomy because they lead consumers to unconsciously make decisions or choices that they would not have chosen if the decision-making environment had not been modified. As Arunesh Mathur et al. explained, “a dark pattern might deny a user choice, obscure available choices, or burden the exercise of choice” (Mathur, Mayer and Kshirsagar, Reference Mathur, Mayer and Kshirsagar2021, pp. 12–13). In this case, dark patterns mostly only provide consumers with the illusion of control rather than actual control (OECD, 2022, p. 23). Without realising that their choice architectures were constructed or modified in order to influence them, the consumers make decisions in the business actors’ desired direction (Susser, Roessler and Nissenbaum, Reference Susser, Roessler and Nissenbaum2019, p. 38).

Besides undermining consumers’ autonomy, dark patterns become an issue because they bring harm to consumers. The personal consumer harm of dark patterns can be categorised into three categories, which are (1) financial loss, (2) privacy harm, and (3) psychological detriment and time loss (OECD, 2022, p. 24; Mathur, Mayer and Kshirsagar, Reference Mathur, Mayer and Kshirsagar2021, pp. 9–11).

1) Financial loss. Financial loss is a form of detriment that many consumers experience due to dark pattern practices. Digital platforms often design interfaces on their websites to manipulate or direct consumers into purchasing products or services they do not need, or more than they originally planned. One example is the dark pattern of “sneak into basket,” where the platform adds products to the consumer’s shopping cart without the consumer’s consent (OECD, 2022, p. 24). Another example is the “hidden subscription,” where the platform misleads consumers into believing they are only taking part in a free trial programme when in fact they are entering a recurring subscription to the service (Mathur, Mayer and Kshirsagar, Reference Mathur, Mayer and Kshirsagar2021, p. 9). Dark patterns increase a business actor’s ability to extract revenue from its consumers (Narayanan et al., Reference Narayanan, Mathur, Chetty and Kshirsagar2020, p. 11). In the US, several actions have been taken to combat these practices. In 2020, for example, the US FTC handled a case concerning the automatic renewal of consumers’ subscriptions without consent, which resulted in refunds of up to USD 9.7 million for affected consumers (OECD, 2022, p. 24).

2) Privacy harm. Another, less obvious detriment that consumers may experience due to dark pattern practices is privacy harm. Some digital platforms set privacy-intrusive settings as the default, making it difficult for consumers to make privacy-related choices or access privacy-related information (OECD, 2022, p. 25). Other research explains that digital platforms use privacy-invasive defaults that expose users’ data and use emotion-laden language to discourage users from choosing privacy-respecting options (Mathur, Mayer and Kshirsagar, Reference Mathur, Mayer and Kshirsagar2021, p. 9). According to Christoph Bösch et al., instead of building IT systems with privacy-friendly solutions, these business actors “aim for systems that purposefully and intentionally exploit their users’ privacy for instance motivated by criminal reasons or financially exploitable business strategies” (Bösch et al., Reference Bösch, Erb, Kargl, Kopp and Pfattheicher2016, p. 241). However, even though many dark patterns are designed to exploit consumers’ privacy, many consumers are probably unaware that their privacy has been violated. Consumers may find it difficult to assess the harm that results from a transaction involving their personal data because “the trade-off between the tangible and immediate short-term benefit of using the service and the costs of potential long-term privacy loss is difficult to evaluate” (OECD, 2022, p. 25).

3) Psychological detriment and time loss. Research has shown that dark patterns can create psychological detriment for consumers, such as feelings of frustration, shame, or being tricked. They also lead consumers to expend considerable energy and/or attention, which results in time loss (OECD, 2022, p. 25). According to Arvind Narayanan et al., making services addictive is one of the goals of dark patterns, because consumers who stay on websites or apps longer yield more personal information, purchase more products or services, and see more ads (Narayanan et al., Reference Narayanan, Mathur, Chetty and Kshirsagar2020, p. 12). On social media platforms, for example, user interface designs such as infinite scroll and autoplay have been found to trigger addictive usage patterns. The same is true of video games, where interface designs such as loot boxes are also considered dark patterns that cause addiction, especially among children (OECD, 2022, p. 26).
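The “hidden subscription” described in point 1) above can likewise be sketched in a few lines. The dates, fee, and function name are hypothetical, chosen only to show the mechanism of a free trial that silently converts into a recurring charge:

```python
from datetime import date, timedelta


def monthly_charge(signup: date, billing_day: date,
                   trial_days: int = 30, fee: float = 9.99) -> float:
    """Hidden-subscription sketch: the 'free trial' silently converts
    into a recurring paid plan once the trial window lapses, with no
    fresh consent requested from the consumer."""
    if billing_day < signup + timedelta(days=trial_days):
        return 0.0   # looks free during the trial...
    return fee       # ...then charges recur indefinitely


signup = date(2024, 1, 1)
print(monthly_charge(signup, date(2024, 1, 15)))  # 0.0  (still "free")
print(monthly_charge(signup, date(2024, 3, 1)))   # 9.99 (silent conversion)
```

Nothing in the code is deceptive in isolation; the harm lies in the interface around it, which foregrounds “free trial” while burying the conversion date and the cancellation path.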

Moreover, dark patterns not only bring detriments to consumers as individuals but can also cause structural consumer detriment collectively, such as undermining competition and reducing consumer trust. Dominant firms may use dark patterns to further strengthen their market position and diminish competition by making it seem that consumers freely choose their product when the choice actually results from a series of dark patterns. Dark patterns can also reduce consumers’ trust in online businesses, because consumers who are aware of them may become sceptical of, and resistant to, interfaces that merely “look like” dark patterns (OECD, 2022, pp. 26–28; Mathur, Mayer and Kshirsagar, Reference Mathur, Mayer and Kshirsagar2021, pp. 10–11).

3. Indonesia’s consumer protection law against the issue of dark commercial patterns in digital platforms

Up to this moment, Indonesia still does not have an act that explicitly and specifically regulates the practice of dark patterns. Several provisions in Indonesia’s consumer protection law indirectly prohibit dark pattern practices, but these regulations can only be applied to certain forms of dark patterns and are difficult to implement. There are two main reasons for this difficulty: (1) the construction of the regulations in the acts is very general in nature, so their application depends heavily on the interpretation of judges or law enforcement agencies, and (2) it is difficult to prove the detriments experienced by consumers due to dark patterns (interface designs) created by business actors.

Before analysing how Indonesia’s consumer protection law addresses the issue, we must first clarify what is meant by consumer protection law. According to the Act of the Republic of Indonesia Number 8 of 1999 concerning Consumer Protection (Act Number 8 of 1999), consumer protection is defined as all efforts to ensure legal certainty in providing protection to consumers. Furthermore, researchers interpret consumer protection law as:

All the principles and rules that regulate and protect consumers in the relationships and issues of providing and using consumer products between providers and their use in social livelihoods. Strictly speaking, consumer protection law constitutes all statutory regulations, both laws and other legislation as well as judge’s decisions, the substance of which regulates the interests of consumers. (Rosmawati, Reference Rosmawati2018, pp. 7–8; Zulham, 2016, pp. 23–24)

This interpretation is in accordance with Articles 6 and 64 of Act Number 8 of 1999, which state that consumers also have rights under other laws and regulations and that all laws and regulations aimed at protecting consumers remain valid as long as the matter is not specifically regulated by, and does not conflict with, the Act. Therefore, it can be concluded that ‘consumer protection law,’ as used in this article, is not limited to Act Number 8 of 1999 concerning Consumer Protection but also includes other laws and regulations insofar as they relate to consumer protection.

As mentioned before, no acts in Indonesia explicitly regulate the issue of dark patterns. However, some of Indonesia’s acts actually prohibit these practices indirectly. Indonesia’s consumer protection laws that can be applied to this issue include (1) the Act of the Republic of Indonesia Number 8 of 1999 concerning Consumer Protection, (2) the Act of the Republic of Indonesia Number 27 of 2022 concerning Personal Data Protection, and (3) the Act of the Republic of Indonesia Number 11 of 2008 concerning Electronic Information and Transactions (as amended by Act Number 19 of 2016, Act Number 1 of 2023 concerning the Criminal Code, and Act Number 1 of 2024).

Act Number 8 of 1999 does not explicitly mention the prohibition of dark pattern practices in Indonesia. Nevertheless, it contains regulations on basic consumer principles and rights that indirectly prohibit such practices. These provisions are set out in Article 4 of Act Number 8 of 1999. Under Article 4 letters (a), (c), and (g), consumers have the right to comfort, security, and safety in consuming goods and/or services; the right to correct, clear, and honest information regarding the condition and guarantee of goods and/or services; and the right to be treated or served correctly, honestly, and without discrimination. These rights are correlated with the obligations of business actors to fulfil them under Article 7 letters (a), (b), and (c). Dark pattern practices clearly contradict these consumer rights. The practice of confirmshaming, for example, may contradict consumers’ right to comfort, while hidden costs and hidden subscriptions have the potential to harm consumers’ right to correct, clear, and honest information. Apart from being contrary to the consumer rights in Article 4, the practice of dark patterns is also prohibited under Article 15 of Act Number 8 of 1999, which stipulates that “Business actors in offering goods and/or services are prohibited from using coercion or other means that can cause physical or psychological interference to consumers.” Dark pattern practices such as forced enrolment and hard-to-cancel designs can be categorised as forms of offering services through indirect coercion. Proving physical or psychological interference due to these practices is certainly difficult; however, as explained previously, several studies have shown that the use of dark patterns can indeed cause psychological detriment to consumers (OECD, 2022, pp. 25–26; Narayanan et al., Reference Narayanan, Mathur, Chetty and Kshirsagar2020, pp. 4–5).

Another law that indirectly prohibits the practice of dark patterns in Indonesia is the Act of the Republic of Indonesia Number 27 of 2022 concerning Personal Data Protection (“Act Number 27 of 2022”). The Act requires that consent to personal data processing be given in a legal, explicit, transparent, limited, and specific manner (Act Number 27 of 2022, art. 23 jo. art. 27). Article 22 paragraph (4) further explains that if a request for consent to process personal data contains other purposes, the request must (a) be clearly distinguishable from other matters, (b) be made in a format that is understandable and easy to access, and (c) use simple and clear language. Dark pattern practices such as using privacy-invasive defaults to expose users’ data or using emotion-laden language to discourage users from choosing privacy-respecting options (Mathur, Mayer and Kshirsagar, Reference Mathur, Mayer and Kshirsagar2021, p. 9) are thus prohibited because they contradict these principles of consent, which must be clear, transparent, and easy to understand.

Like Act Number 8 of 1999 and Act Number 27 of 2022, Act Number 11 of 2008 concerning Electronic Information and Transactions, as amended by Act Number 19 of 2016 and Act Number 1 of 2024 (“ITE Act”), also does not explicitly regulate dark patterns. However, one article of the Act indicates that the practice of dark patterns is prohibited in the Indonesian digital platform ecosystem. Article 28 paragraph (1) of the ITE Act provides that every person is prohibited from “intentionally distributing and/or transmitting Electronic Information and/or Electronic Documents containing false or misleading information that results in material losses for consumers in Electronic Transactions.” As explained in the previous discussion, the practice of dark patterns is clearly misleading and has the potential to cause harm to consumers in electronic transactions. Therefore, broadly interpreted, this article could be read as prohibiting such practices.

Based on the above explanations, several acts in Indonesia indirectly prohibit the practice of dark patterns. However, these regulations tend to be general in nature and do not explicitly and directly prohibit the practice of dark patterns or practices that exploit consumers’ behavioural biases. As a result, the enforcement of these regulations in dark pattern cases depends heavily on the interpretations of judges and law enforcers, especially on how they understand the harm of dark pattern practices and their implications for consumers’ losses. Since Indonesia is a civil law country (the judge’s main role is to establish the facts of the case and apply the provisions of the applicable code) (Berkeley Law School, 2010, p. 1) and the issue of dark patterns is not yet well recognised in Indonesia, this is certainly an issue that needs attention.

Besides the regulations being too general, another difficulty in applying existing laws and regulations to dark pattern practices concerns proving the detriments experienced by consumers. The regulations in Act Number 8 of 1999 and the ITE Act emphasise the prohibition of practices that cause harm to consumers. Indonesia’s current consumer protection law does not yet adhere to a strict liability system; thus the consumer, as the plaintiff, must prove (1) that there is an actual loss that the consumer experienced, (2) that the loss occurred as a result of consuming goods and/or services produced or traded by the business actor, and (3) that the consumer did not contribute, either directly or indirectly, to the losses experienced (Syam, Reference Syam2018, p. 103). Article 28 of Act Number 8 of 1999 does place the burden of proving the absence of fault or negligence on the business actor; however, consumers are still burdened with proving the existence of losses suffered as a result of the business actor’s goods or services (Sumaryanto, Reference Sumaryanto2020, p. 62; Syam, Reference Syam2018, p. 103). In a dark pattern case, the consumer must therefore prove and convince the judge that they suffered detriment and that the detriment was caused by the dark pattern (interface design) created by the business actor. As discussed above, proving losses due to dark pattern practices, whether financial losses, privacy harms, psychological harm, or especially harm to consumer autonomy, is very difficult (OECD, 2022, pp. 24–26). These conditions show that the current legal arrangements in Indonesia are not yet able to overcome the issue of dark patterns.

4. Learning from other countries’ practices: considerations to regulate dark patterns for Indonesia

Although regulating dark patterns can be difficult, some jurisdictions, such as the EU, the US, and India, have established laws and/or policies that specifically address the problem. Indonesia should therefore also start regulating dark pattern practices in a specific way. Drawing from other jurisdictions’ practices, one consideration is that the regulations do not always have to define the term “dark patterns” explicitly; however, they should expressly prohibit the practice of using design interfaces in a way that subverts or impairs consumers’ autonomy and/or exploits consumers’ behavioural biases.

In general, several factors make dark pattern practices hard to regulate. First, as mentioned before, there is a variety of practices and views regarding what counts as a dark pattern, so regulation is difficult because new practices or techniques that can be considered dark patterns are always emerging. Second, dark patterns often operate in the grey zone between legitimate persuasion techniques and clearly illegitimate methods of influencing consumer behaviour, such as coercion and deception. Third, dark patterns work because they exploit consumers’ behavioural biases, which may be unfamiliar territory for traditional consumer legislation in some countries. Fourth, the notion of protecting consumer autonomy as a normative rationale for assessing and regulating dark patterns is under-researched (OECD, 2022, pp. 12–15; Brenncke, Reference Brenncke2023, pp. 35–36). Nevertheless, even though regulation is not easy, several jurisdictions, such as the EU, the US, and India, have in fact adopted and implemented regulatory measures to address dark patterns.

In the EU, multiple acts expressly protect consumers from dark pattern practices on digital platforms. In 2022, the EU enacted the DSA and the Digital Markets Act (DMA), which impose obligations and prohibitions relating to dark patterns on online platforms in the EU (OECD, 2022, p. 33). The DSA is a legislative instrument that governs online intermediaries in the EU Single Market. The prohibition of dark pattern practices is set out in Article 25 of the DSA. The article does not expressly use the term dark patterns but states that:

Providers of online platforms shall not design, organise or operate their online interfaces in a way that deceives or manipulates the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions (European Union, 2022).

The DSA mandates the European Commission to issue guidelines on how the prohibition in Article 25 applies to specific practices. The Act took effect in February 2024, and both consumers and corporate users who receive services from online platforms are protected by it (Brenncke, 2023, p. 9). According to Mark Leiser and Christina Santos, the DSA aims to protect users' autonomous decisions and choices by prohibiting various practices that manipulate, deceive, nudge, and exploit users' choices, decision-making, and autonomy (Leiser and Santos, 2023, p. 21). The DMA also introduces provisions regarding dark patterns. The DMA is a legislative instrument that aims to address the role and unfair practices of certain online platforms that qualify as "gatekeepers." It likewise does not use the term dark patterns directly, but it emphasises that gatekeepers must refrain from actions that compromise the enforceability of legal requirements and prohibitions. This includes presenting options to the end user through a non-neutral design, or subverting user autonomy, decision-making, or choice through a user interface's structure, functionality, or operation, in whole or in part. Additionally, it contains clauses designed to prevent gatekeepers from using dark patterns (Leiser and Santos, 2023, p. 23). In 2023, the EU also adopted the Data Act, which contains prohibitions relating to dark patterns. The Data Act is a legislative instrument that harmonises rules on data sharing and data portability. In this act, dark patterns are described as "design techniques that push or deceive consumers into decisions that have negative consequences for them." Furthermore, Article 4(4) of the Data Act states that:

Data holders shall not make the exercise of choices or rights under this Article by the user unduly difficult, including by offering choices to the user in a non-neutral manner or by subverting or impairing the autonomy, decision-making or choices of the user via the structure, design, function or manner of operation of a user digital interface or a part thereof (European Union, 2023).

The provisions related to the prohibition of dark pattern practices in the EU are not only found in the DSA, the DMA, and the Data Act but also in the Consumer Rights Directive, the Consumer Credit Directive II, the Regulation on Key Information Documents for Packaged Retail and Insurance-based Investment Products, etc. (Brenncke, 2023, p. 36).

In the US, the first regulation that expressly mentioned dark patterns was the California Privacy Rights Act (CPRA), passed in 2020. The CPRA defines a dark pattern as "a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice, as further defined by regulation." In the specific context of obtaining consent for personal information, it provides that "agreement obtained through use of dark patterns does not constitute consent" (King and Stephan, 2021, p. 271). Besides California, state-level consumer privacy laws in Colorado, Connecticut, and Texas also expressly regulate the use of dark patterns. Furthermore, at the federal level, the "Deceptive Experiences to Online Users Reduction Act" or "DETOUR Act" was introduced in 2021. This act does not expressly use the term dark pattern; however, it is aimed at prohibiting business actors from using dark patterns in their user interfaces and at promoting consumer welfare in providers' use of behavioural research. The act prohibits large online operators from using any user interface "with the purpose or substantial effect of obscuring, subverting, or impairing user autonomy, or choice to obtain consent or user data" (Brenncke, 2023, p. 5; The United States, 2021, S.3330—117th Congress).

In India, the Central Consumer Protection Authority recently enacted a guideline to prevent dark patterns, the "Guidelines for Prevention and Regulation of Dark Patterns, 2023." The guideline defines dark patterns as:

Any practices or deceptive design patterns using UI/UX (user interface/user experience) interactions on any platform; designed to mislead or trick users to do something they originally did not intend or want to do; by subverting or impairing the consumer autonomy, decision making or choice; amounting to misleading advertisement or unfair trade practice or violation of consumer rights.

Furthermore, the guideline then sets a list of specified dark pattern practices that are prohibited by the law, including the practice of false urgency, basket sneaking, confirm shaming, forced action, subscription trap, interface interference, bait and switch, drip pricing, disguised advertisement, and nagging (Government of India, 2023).

Based on these prevailing regulations, several researchers have proposed recommendations or key points for regulating dark patterns. Jennifer King and Adriana Stephan, for example, recommend, drawing from the CPRA, that the definition of dark patterns be framed around outcomes rather than business actors' intent. The reason is that intent is hard to prove, whereas outcomes can be assessed by applying performance-based standards (King and Stephan, 2021, pp. 273, 276). Martin Brenncke, on the other hand, explains that the normative theory adopted by EU law to assess dark patterns is users' autonomy. Drawing from this normative theory and the regulations above, he proposes a taxonomy of six categories of autonomy violation in business-to-consumer relationships. He concludes that:

Dark patterns in online choice architectures violate consumers' ability to make a model self-determined decision if they: a) Influence consumers' decision-making in such a way that consumers ignore or misunderstand mandated information; b) Cause consumers to hold false beliefs that form the foundation of consumers' decision-making; c) Cause consumers to enter into a specific contractual agreement with a business without reflection about its substance and content; d) Create unreasonable time, decision effort, or emotional costs for pursuing or adhering to a particular decision; e) Present choice options in a non-neutral manner when asking consumers to select between different choice options; or f) Manipulate consumers (Brenncke, 2023, p. 36).

Indonesia should also start regulating dark pattern practices in a specific way. Based on the above explanations, several considerations can be taken into account. Drawing from other jurisdictions' practices, the regulations do not always have to define the term "dark patterns" directly; however, they should expressly prohibit the practice of using design interfaces in a way that subverts or impairs consumers' autonomy and/or exploits consumers' behavioural biases. In line with Jennifer King and Adriana Stephan's view, regulation of dark patterns should focus on outcomes rather than on business actors' intent. The regulations themselves can combine negative obligations that prohibit dark patterns with positive obligations that ensure consumer-friendly choice architecture. Furthermore, since proving detriment due to dark pattern practices is very difficult, the formulation of articles prohibiting dark patterns must focus on the actions that are prohibited. The element "which results in consumer loss," as often found in Act Number 8 of 1999, should not be included in formulating dark pattern prohibitions because it would increase the consumer's burden of proof. In addition, government agencies and institutions must play an active role and initiate action against dark pattern practices on digital platforms in Indonesia.

5. Conclusion

Business actors often use dark patterns to increase profits and gain a competitive advantage by exploiting consumers' behavioural biases. This problem is faced by many countries, but to date the issue has not received much attention in Indonesia. This article therefore explains what dark patterns are and the harm they cause, examines Indonesia's current consumer protection law as it applies to dark patterns, and discusses how other jurisdictions, such as the US, the EU, and India, regulate them. Based on the analysis, this article finds that many consumers may not know of or be aware of dark pattern practices, yet these practices have detrimental impacts on them, such as undermining their autonomy, causing financial losses, privacy harms, psychological harms, and time losses, and even broader structural harms such as undermining competition and reducing consumer trust. At the same time, Indonesia currently has no law that expressly regulates dark patterns or practices that exploit consumers' behavioural biases. Indonesian consumer protection law contains basic principles and provisions that indirectly prohibit dark pattern practices, but these can only be applied to certain forms of dark patterns and are difficult to implement because the provisions are very general and the detriment caused by dark patterns is difficult to prove. This article therefore suggests that Indonesia must explicitly regulate the prohibition of dark pattern practices in its current law. Drawing from other jurisdictions, the regulations do not have to define the term "dark patterns" explicitly, but they should expressly forbid the use of design interfaces that subvert or impair consumer autonomy and/or exploit consumers' behavioural biases. Moreover, the formulation of rules prohibiting dark pattern practices ought to prioritise the forbidden behaviours over the element of harm experienced.

Acknowledgements

I would like to thank my home institution, the National Research and Innovation Agency (BRIN), especially the Research Centre for Law. I am grateful to the Asian Journal of Law and Society for the opportunity it has given me. I also want to express my gratitude to Harison Citrawan for providing information related to this publication. Furthermore, I acknowledge the use of AI tools in assisting with grammar and language checking for this work. These tools were used to enhance clarity and ensure accuracy in the writing process. All content, ideas, and conclusions are my own.

References

Berkeley Law School (2010). The common law and civil law traditions. Available at: https://www.law.berkeley.edu/wp-content/uploads/2017/11/CommonLawCivilLawTraditions.pdf (Accessed: 18 August 2024).
Bongard-Blanchy, K., Rossi, A., Rivas, S., Doublet, S., Koening, V. and Lenzini, G. (2021). ‘I am definitely manipulated, even when I am aware of it. It’s ridiculous! - Dark patterns from the end-user perspective’, Designing Interactive Systems Conference 2021, 28 June-2 July 2021, Virtual Event, USA. Available at: https://doi.org/10.1145/3461778.3462086 (Accessed: 18 August 2024).
Bösch, C., Erb, B., Kargl, F., Kopp, H. and Pfattheicher, S. (2016). ‘Tales from the dark side: Privacy dark strategies and privacy dark patterns’, Proceedings on Privacy Enhancing Technologies, 4, pp. 237–254.
Brenncke, M. (2023). ‘Regulating dark patterns’, Notre Dame Journal of International & Comparative Law, 14(1), pp. 1–36.
Brignull, H. (no date). What are deceptive patterns? Available at: https://darkpatterns.org/ (Accessed: 18 August 2024).
European Union (2022). Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and Amending Directive 2000/31/EC (Digital Services Act) (Text with EEA relevance). Available at: https://eur-lex.europa.eu/eli/reg/2022/2065/oj (Accessed: 18 August 2024).
European Commission (2022a). Behavioural study on unfair commercial practices in the digital environment: Dark patterns and manipulative personalisation. Luxembourg: Publications Office of the European Union.
European Commission (2022b). Digital Services Act: EU’s landmark rules for online platforms enter into force. Available at: https://ec.europa.eu/commission/presscorner/detail/en/ip_22_6906 (Accessed: 18 August 2024).
European Union (2023). Regulation of The European Parliament and of The Council on Harmonised Rules on Fair Access to and Use of Data and Amending Regulation (EU) 2017/2394 and Directive (EU) 2020/1828 (Data Act). Available at: https://eur-lex.europa.eu/eli/reg/2023/2854 (Accessed: 18 August 2024).
Government of India (2023). Guidelines for prevention and regulation of dark patterns. Available at: https://consumeraffairs.nic.in/sites/default/files/file-uploads/latestnews/Draft%20Guidelines%20for%20Prevention%20and%20Regulation%20of%20Dark%20Patterns%202023.pdf (Accessed: 18 August 2024).
Gray, C. M., Kou, Y., Battles, B., Hoggatt, J. and Toombs, A. L. (2018). ‘The dark (patterns) side of UX design’, Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, 21-26 April 2018, Montréal, QC, Canada: ACM. Available at: https://doi.org/10.1145/3173574.3174108 (Accessed: 18 August 2024).
King, J. and Stephan, A. (2021). ‘Regulating privacy dark patterns in practice - drawing inspiration from California Privacy Rights Act’, Georgetown Law Technology Review, 5, pp. 251–276.
Luguri, J. and Strahilevitz, L. J. (2021). ‘Shining a light on dark patterns’, Journal of Legal Analysis, 13(1), pp. 44–109.
Leiser, M. and Santos, C. (2023). ‘Dark patterns, enforcement, and the emerging digital design acquis - Manipulation beneath the interface’, European Journal of Law and Technology, 15(1), pp. 1–31.
Mathur, A., Acar, G., Friedman, M. J., Lucherini, E., Mayer, J., Chetty, M. and Narayanan, A. (2019). ‘Dark patterns at scale: Findings from a crawl of 11k shopping websites’, Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), pp. 1–32.
Mathur, A., Mayer, J. and Kshirsagar, M. (2021). ‘What makes a dark pattern dark? Design attributes, normative considerations, and measurement methods’, Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 8-13 May 2021, Yokohama, Japan: ACM. Available at: https://doi.org/10.1145/3411764.3445610 (Accessed: 18 August 2024).
Moser, C., Schoenebeck, S. Y. and Resnick, P. (2019). ‘Impulse buying: Design practices and consumer needs’, CHI Conference on Human Factors in Computing Systems Proceedings, 4-9 May 2019, Glasgow, Scotland, UK: ACM. Available at: https://doi.org/10.1145/3290605.3300472 (Accessed: 18 August 2024).
Narayanan, A., Mathur, A., Chetty, M. and Kshirsagar, M. (2020). ‘Dark patterns: Past, present, and future: The evolution of tricky user interfaces’, ACM Queue, 18(2), pp. 67–92.
OECD (2022). Dark commercial patterns - OECD Digital Economy Papers No. 336. Available at: https://www.oecd-ilibrary.org/docserver/44f5e846-en.pdf?expires=1723947753&id=id&accname=guest&checksum=8A3E77628704F9DE02C7C5B52A0B3C18 (Accessed: 18 August 2024).
Özdemir, S. (2020). ‘Digital nudges and dark patterns: The angels and the archfiends of digital communication’, Digital Scholarship in the Humanities, 35(2), pp. 417–428.
Rosmawati, S. H. (2018). Pokok-pokok hukum perlindungan konsumen [Essentials of consumer protection law]. Jakarta: Prenadamedia Group.
Sumaryanto, A. D. (2020). Bunga rampai pembalikan beban pembuktian [Anthology of reversal of the burden of proof]. Surabaya: Jakad Media Publishing.
Susser, D., Roessler, B. and Nissenbaum, H. (2019). ‘Online manipulation: Hidden influences in a digital world’, Georgetown Law Technology Review, 4(1), pp. 1–45.
Syam, M. (2018). ‘Penerapan asas pembalikan beban pembuktian dalam penyelesaian sengketa konsumen [Application of the principle of reversal of the burden of proof in consumer dispute resolution]’, Jurnal Hukum Acara Perdata - ADHAPER, 4(1), pp. 91–108.
Thaler, R. H. (2018). ‘Nudge, not sludge’, Science, 361(6401), p. 431.
Thaler, R. H. and Sunstein, C. R. (2008). Nudge - Improving decisions about health, wealth, and happiness. New Haven and London: Yale University Press.
The United States (2021). S.3330 - 117th Congress (2021-2022): DETOUR Act. Available at: https://www.congress.gov/bill/117th-congress/senate-bill/3330 (Accessed: 18 August 2024).
Weinmann, M., Schneider, C. and Brocke, J. v. (2019). ‘Digital nudging’, Business & Information Systems Engineering, 58, pp. 433–436.
Zulham. (2016). Hukum perlindungan konsumen [Consumer protection law]. Jakarta: Kencana Prenada Media Group.
Figure 1. “High-demand message”

Table 1. Types of dark patterns (Mathur et al., 2019, p. 12)