The intersection of digital commerce, consumer protection, and the nudging power of data-driven digital interactions remains under-explored, and it is a potentially important area for management, perhaps even through regulation. In that connection, the OECD has published, through its Digital Economy Papers series, a quite interesting study: "Dark Commercial Patterns." The abstract nicely summarizes the problem and approach.
There is mounting concern that dark commercial patterns may cause substantial consumer detriment. These practices are commonly found in online user interfaces and steer, deceive, coerce, or manipulate consumers into making choices that often are not in their best interests. This report proposes a working definition of dark commercial patterns, sets out evidence of their prevalence, effectiveness and harms, and identifies possible policy and enforcement responses to assist consumer policy makers and authorities in addressing them. It also documents possible approaches that consumers and businesses may take to mitigate dark commercial patterns.
The Table of Contents and Executive Summary follow. The paper may be accessed HERE.
Executive summary
The term “dark (commercial) patterns” refers to a wide variety of practices commonly found in online
user interfaces that lead consumers to make choices that may not be in their best interests, including by
exploiting consumer biases. They typically seek to get consumers to give up more money, personal data
or attention time than desired. In this way, they are inextricably linked to an underlying business model,
even if user interface designers may often bear no malicious intent.
The OECD Committee on Consumer Policy proposes a working definition of dark patterns to facilitate
near-term discussion among regulators and policy makers across jurisdictions: “Dark commercial patterns
are business practices employing elements of digital choice architecture, in particular in online user
interfaces, that subvert or impair consumer autonomy, decision-making or choice. They often deceive,
coerce or manipulate consumers and are likely to cause direct or indirect consumer detriment in various
ways, though it may be difficult or impossible to measure such detriment in many instances.” The full
definition appropriate in a particular setting may depend on its intended use and the broader policy, legal
or technological context.
Many e-commerce websites and apps, including those of major online platforms, feature more than one
dark pattern. Online games, browsers, and many cookie consent notices also commonly feature them – the
latter potentially entailing high rates of data protection law violation. The more frequent dark patterns on
websites and apps involve framing (preselecting choices by default or giving them visual precedence,
hiding information or disguising advertisements); creating a sense of urgency (through potentially
misleading scarcity indications); generating social proof (through potentially misleading popularity
indications); forcing registration or information disclosure; nagging to make a choice; or making it difficult to cancel or opt out.
Given online businesses’ ability to repeatedly run experiments to hone user interfaces, consumers’
heightened susceptibility to deception online and the scale of consumers reachable, dark patterns are likely
a greater concern than analogous practices offline. Indeed they can influence consumer decision-making
substantially. They appear more effective on mobile devices and when combined. Some, such as hidden
information, appear substantially more effective than others, such as scarcity cues. Seemingly mild dark
patterns, such as preselecting choices and making it hard to decline, may be as or more effective than
aggressive ones such as nagging and toying with emotions. A dark pattern’s effectiveness may in part be
driven by the difficulty in its detection, which may relate to whether consumers have prior experience with
it, its intrinsic subtlety or general pervasiveness.
In addition to impairing autonomy, some dark patterns, such as drip pricing and subscription traps, can
cause substantial financial loss. Others may cause significant privacy harms or psychological detriment.
They may also harm consumers collectively, by weakening competition and sowing distrust, and can
disproportionately harm certain consumers such as less educated consumers or children. While there is not
yet evidence suggesting that dark patterns triggering personal vulnerabilities are common, this may change with businesses’ increasing data collection combined with machine learning techniques.
Market forces are unlikely to address dark patterns alone, and may at times incentivise their use. Consumer and data protection authorities have accordingly been taking action on the basis of laws outlawing practices associated with many dark patterns, while also issuing guidance to support business compliance. But enforcement cases to date predominantly relate to a few dark patterns commonly recognised by regulators, which could point to gaps in the law, in available evidence, or in enforcement capacity. In particular, some dark patterns are not clearly deceptive and may evade prohibitions on deceptive practices.

Various regulatory measures to respond to dark patterns have been proposed or implemented across OECD jurisdictions. These include measures to address them specifically on online platforms; prohibit specific dark patterns; foster consumer-friendly choice architecture (e.g. by making it as easy to cancel as to sign up); empower regulators; and address consumer vulnerability. However, much evidence indicates that disclosure and transparency measures are insufficient in isolation. Other key considerations relate to combining principle- and rule-based consumer laws; employing specific tools to gather evidence (e.g. web scraping); enhancing co-operation among policy areas (e.g. privacy, artificial intelligence and competition policy); and adapting the interpretation of legal standards.
Initial evidence of dark patterns’ prevalence, effectiveness and harms provides directions for possible
further action, such as focusing on dark patterns on mobile devices and popular websites. Given dark
patterns specifically involving hiding information, making it hard to cancel or opt out, preselecting choices
or giving them visual precedence are effective, highly prevalent (on websites and apps, including of major
online platforms, and cookie notices) and hard for consumers to detect, they could be a priority focus.
Overall, however, more evidence is needed regarding many dark patterns to further guide policy and enforcement efforts. Technical tools, such as browser extensions, may help consumers mitigate dark patterns, and other measures, such as information campaigns, can raise awareness of them. Various business initiatives may also assist, including self- or co-regulatory initiatives, ethical design standards, and digital choice architecture self-audits. While such tools and initiatives can play an important supporting role, they should be seen as complementary to robust regulatory and enforcement measures.