
Dark pattern

Dark patterns are manipulative designs in software, websites, and apps that trick users into performing actions they did not intend, such as unintended purchases, subscriptions, or disclosures, often exploiting cognitive biases for the provider's commercial benefit. The term was coined in 2010 by British user experience consultant Harry Brignull to describe these deceptive techniques, which he cataloged on his website as a "hall of shame" to raise awareness. Common examples include forced continuity, where users are automatically enrolled in recurring payments without clear cancellation options; privacy Zuckering, involving misleading prompts to share more data than desired; and misdirection, using visual cues to steer users toward profitable choices over alternatives. Empirical studies demonstrate their effectiveness, with experiments showing dark patterns can increase compliance rates by up to 80% in subscription sign-ups compared to neutral designs, leading to user regret, financial losses reported by 63% of affected consumers, and erosion of trust in platforms. While proponents frame some patterns as benign nudges, their deceptive nature—prioritizing hidden manipulation over transparent persuasion—raises ethical concerns about user autonomy and has prompted regulatory action, particularly in the European Union, where the Digital Services Act explicitly prohibits "dark patterns" that impair informed decision-making, with fines up to 6% of global turnover for violations. Prevalence remains high across websites and mobile apps, with scholarly analyses identifying over 40 variants, underscoring the need for design practices grounded in user-centric principles rather than exploitation.

Definition and Historical Development

Origins and Coining of the Term

The term "dark patterns" was coined in 2010 by Harry Brignull, a British user experience consultant with a PhD in cognitive science, to describe user interface designs intentionally crafted to manipulate users into making decisions against their interests, such as unintended purchases or data sharing. Brignull introduced the concept via his website darkpatterns.org (later rebranded as deceptive.design), where he cataloged examples drawn from real-world websites and apps, framing the term as a deliberate contrast to benign "design patterns" in software engineering. Brignull developed the idea from observing recurring deceptive tactics in digital interfaces during the early , motivated by ethical concerns over how companies exploited cognitive vulnerabilities for commercial gain; he initially presented it in conference talks to highlight these practices without initially anticipating widespread adoption. The term's origins trace to broader critiques of interface deception predating 2010, such as early tricks like hidden fees or disguised opt-outs, but Brignull's nomenclature provided the first systematic label, emphasizing intent over mere poor design. By mid-decade, the phrase had entered and regulatory , with Brignull's serving as a primary for researchers analyzing manipulative UX; however, some critiques note that not all cited examples unequivocally prove designer malice, as can arise from incompetence rather than .

Early Examples and Evolution

The manipulative design techniques now termed dark patterns have roots in longstanding retail practices, such as bait-and-switch tactics and hidden fees, which transitioned to digital interfaces in the 1990s as e-commerce emerged. Early web shopping carts, for example, frequently employed pre-selected checkboxes for ancillary products like extended warranties or mailing lists, exploiting default bias to boost ancillary sales without explicit consent; such designs were commonplace on early e-commerce platforms by the early 2000s. The term "dark patterns" was formally coined in 2010 by British UX specialist Harry Brignull, who drew inspiration from "white hat" ethical design terminology to highlight their unethical counterparts. Brignull launched darkpatterns.org (later rebranded deceptive.design) as a "Hall of Shame" cataloging real-world instances, defining them as tricks that induce unintended actions, such as unintended purchases or data disclosures. Initial entries included "roach motels," where subscriptions were easy to initiate but arduous to cancel—patterns observed in early 2000s software trials and services like Worldpay's merchant tools—and "sneak into basket," adding extraneous items during checkout, as seen in contemporaneous e-commerce flows. Post-2010, awareness evolved through academic scrutiny and regulatory interest, with Brignull's catalog expanding to over a dozen categories by 2012, influencing UX discourse. This period saw proliferation alongside growth hacking trends, such as opaque auto-renewals in freemium models (e.g., early Dropbox-like referral programs morphing into stickier commitments), driven by optimization cultures that prioritized conversion over transparency. By the mid-2010s, interdisciplinary studies linked these techniques to cognitive exploitation, spurring dedicated academic workshops in 2019 and EU proposals for bans, marking a shift from anecdotal to formalized critique amid rising consumer-protection concerns.

Psychological and Design Mechanisms

Exploited Cognitive Biases

Dark patterns leverage cognitive biases—systematic deviations from rational judgment documented in behavioral science—to steer users toward outcomes favoring designers, often at the expense of users' interests or optimal decisions. These manipulations are rooted in empirical findings from behavioral economics, where biases arise from heuristics that economize mental effort but introduce predictability exploitable in interface design. Studies mapping dark patterns to biases emphasize that such designs amplify non-reflective responses, reducing user agency without altering underlying preferences. The default effect, also termed status quo bias, is prominently exploited by pre-selecting unfavorable options, as individuals exhibit strong inertia toward maintaining the presented status, perceiving defaults as recommendations or normative. In subscription interfaces, opt-in checkboxes for premium add-ons or data sharing are enabled by default, leading to higher acceptance rates; experimental evidence shows opt-out rates drop significantly when defaults favor retention, with users 2-4 times more likely to accept pre-checked terms than to actively select them. This bias underpins "roach motel" patterns, where entering commitments is seamless but exiting requires disproportionate effort, as inertia discourages navigation of buried cancellation paths. Anchoring bias influences perception through initial reference points, causing subsequent judgments to insufficiently adjust from them; dark patterns deploy this in pricing by displaying inflated original costs adjacent to discounted offers, skewing value assessments upward. Research on e-commerce interfaces reveals that anchoring via crossed-out high prices increases perceived savings and purchase likelihood by up to 20-30%, even when the anchor lacks a genuine basis, as users anchor on the first numeral encountered. Loss aversion, where losses loom larger than equivalent gains (typically weighted 2:1 in prospect theory), drives urgency tactics like countdown timers or "limited stock" warnings, framing inaction as forfeiture.
Empirical tests of scarcity notifications show conversion rates rising 10-15% due to heightened aversion to missing out, though actual scarcity is often fabricated, exploiting the bias without genuine constraint. Hyperbolic discounting further aids patterns involving deferred costs, as users undervalue future burdens relative to immediate gratifications; privacy disclosures buried in fine print succeed because short-term convenience trumps long-term data risks, with studies indicating disclosure rates increase when immediate opt-ins bypass deliberation on downstream harms. Framing effects compound this by presenting choices in loss-oriented language (e.g., "Don't lose your progress" to block exits), altering decisions without changing facts, as evidenced in tests where reframed unsubscribe prompts reduced cancellations by 15%. Overchoice, or choice overload, manifests when excessive options paralyze decision-making, defaulting users to passive acceptance; dark patterns overwhelm users with variant plans or consent toggles, reducing decision efficacy, as lab simulations confirm error rates and deferral behaviors surge beyond 6-9 alternatives. These biases interact synergistically—for instance, defaults anchored in loss frames—amplifying manipulation, though susceptibility varies by demographics like age or digital literacy, with older users showing heightened susceptibility in vulnerability analyses.

Technical Implementation Strategies

Dark patterns leverage conventional web technologies—primarily HTML for structure, CSS for styling, and JavaScript for interactivity—to subtly distort user interfaces and guide decisions toward undesired outcomes. These implementations exploit the flexibility of client-side rendering to prioritize service goals over user intent, often evading immediate detection by regulators or users. For example, visual misdirection techniques use CSS properties like low opacity, reduced font sizes, or inadequate color contrast ratios to de-emphasize opt-out or cancellation options, making them harder to perceive or interact with compared to primary actions. Dynamic manipulation is frequently achieved through JavaScript, enabling runtime alterations to the DOM that simulate urgency or restrict choices. Countdown timers, a common tactic in e-commerce to pressure purchases, are implemented via periodic DOM updates monitored by libraries like Mutation Summary, which track changes to elements such as text nodes displaying time-sensitive prompts. Similarly, interruptive modals can be triggered with setTimeout functions to appear after a delay, disrupting user navigation and funneling attention toward affirmative actions like subscriptions. Event listeners, such as oncopy for copy-paste traps, redirect users or inject ads upon innocuous interactions, overriding expected behaviors. Form-based deceptions rely on HTML markup combined with scripting to set defaults that favor the platform. Pre-checked checkboxes for consents or subscriptions are set using the checked attribute on <input type="checkbox"> elements or via JavaScript's element.checked = true, requiring users to actively deselect rather than opt in, which contravenes principles of granular consent in regulations like GDPR. Hidden fees or terms are obscured through CSS minimization of text (e.g., font-size: 0.7em;) or JavaScript-driven progressive disclosure, where additional costs load only after initial engagement, exploiting users' commitment consistency.
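To make the countdown-timer tactic concrete, here is a minimal, purely illustrative JavaScript sketch of a fabricated urgency counter. All names (fakeRemainingMs, formatCountdown, WINDOW_MS) are hypothetical, not drawn from any real codebase; the point is that the modulo arithmetic makes the "deadline" loop forever, so every visitor sees the same pressure regardless of when they arrive.

```javascript
// Illustrative sketch only: a "countdown" that fabricates urgency.
// The deadline is not real -- whenever the timer would reach zero,
// it silently restarts, so the displayed scarcity has no constraint behind it.

const WINDOW_MS = 15 * 60 * 1000; // pretend "offer ends in 15 minutes"

// Pure helper: remaining time as a function of elapsed ms since page load.
// The modulo makes the countdown loop forever instead of expiring.
function fakeRemainingMs(elapsedMs) {
  return WINDOW_MS - (elapsedMs % WINDOW_MS);
}

// Render milliseconds as MM:SS for the on-page prompt.
function formatCountdown(ms) {
  const totalSec = Math.floor(ms / 1000);
  const m = String(Math.floor(totalSec / 60)).padStart(2, "0");
  const s = String(totalSec % 60).padStart(2, "0");
  return `${m}:${s}`;
}

// In a browser this would drive a periodic DOM update, e.g.:
// setInterval(() => {
//   timerEl.textContent = formatCountdown(fakeRemainingMs(Date.now() - t0));
// }, 1000);
```

Because the visible text node changes every second, mutation-observation tooling of the kind described above can flag such elements as candidates for fabricated urgency.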
Page segmentation and layout tricks further embed dark patterns by structuring content into hierarchical elements (e.g., nested <div> or <section> tags) that CSS positions to bury negative options amid positive ones, such as placing unsubscribe links in footers with low visibility thresholds (e.g., elements smaller than 1 pixel filtered out in rendering but present for compliance claims). These strategies are scalable across web and mobile modalities, with component frameworks enabling reusable components that propagate deceptive flows, though detection tools increasingly parse such patterns via visual analysis of screenshots or text classification of rendered markup. Overall, the technical simplicity of these methods—relying on core web standards rather than exploits—facilitates widespread adoption while complicating automated scrutiny.
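The detection side can be sketched in the same stack. The following is a deliberately naive heuristic scanner, assuming a raw HTML string as input; real audit tools render the page and inspect computed styles or screenshots, so the regexes and the findDarkPatternHints name here are hypothetical simplifications for illustration only.

```javascript
// Naive detector sketch (hypothetical helper; real tools parse the rendered DOM).
// Flags two of the markup-level signals discussed above:
// pre-checked checkboxes and text de-emphasized below a readability threshold.

function findDarkPatternHints(html) {
  const hints = [];

  // Heuristic 1: pre-checked checkboxes (defaults that favor the platform).
  const prechecked =
    html.match(/<input[^>]*type="checkbox"[^>]*\schecked[^>]*>/gi) || [];
  if (prechecked.length > 0) {
    hints.push({ kind: "prechecked-checkbox", count: prechecked.length });
  }

  // Heuristic 2: inline styles shrinking text below ~1em (e.g. fee disclosures).
  const tinyFonts = html.match(/font-size:\s*0?\.\d+em/gi) || [];
  if (tinyFonts.length > 0) {
    hints.push({ kind: "tiny-font", count: tinyFonts.length });
  }

  return hints;
}
```

A regex pass like this misses CSS applied via stylesheets or JavaScript, which is exactly why the automated-scrutiny problem noted above is hard: the signals live in the rendered result, not the source markup.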

Common Patterns and Categorization

Subscription and Pricing Deceptions

Subscription and pricing deceptions encompass dark patterns that obscure true costs, manipulate perceived value, or induce unintended recurring payments, often exploiting users' inattention or haste during checkout or sign-up processes. These tactics include presenting subscriptions as one-time trials without prominent disclosure of auto-renewal, embedding hidden fees that emerge only at payment confirmation (known as drip pricing), and fabricating urgency through false scarcity or inflated original prices to simulate discounts. Such designs prioritize short-term revenue capture over transparent transaction flows, leading consumers to incur charges exceeding their expectations. A prevalent mechanism is the "subscription trap," where interfaces bundle free trials with seamless enrollment into paid plans post-trial, omitting reminders of impending charges. The U.S. Federal Trade Commission (FTC) documented this in enforcement actions, noting that companies design multi-screen mazes or require phone calls for cancellation, effectively retaining revenue from unwitting subscribers. In June 2023, the FTC sued Amazon, alleging its Prime service used subtle design cues to enroll millions without explicit consent and then frustrated cancellation attempts through misdirection and delays, resulting in overcharges estimated in billions annually. Similarly, in November 2022, the FTC secured a $100 million settlement from Vonage for employing dark patterns like scripted upsell calls and post-cancellation hurdles that trapped customers in ongoing fees despite intent to terminate. Pricing manipulations further amplify deception by fragmenting costs across interfaces, such as advertising low base prices while deferring taxes, shipping, or add-ons until the final step, where opt-out options are de-emphasized or add-ons pre-selected.
An International Consumer Protection and Enforcement Network (ICPEN) sweep in 2024 across 36 countries identified widespread drip pricing in subscription services, where non-optional surcharges appeared abruptly, complicating price comparisons and inflating totals by 10-30% in examined cases. Research from a 2023 study in Computers in Human Behavior Reports analyzed four e-commerce dark patterns, finding that drip pricing increased purchase completion rates by 15-20% among participants, as users underestimated total expenditures due to cognitive overload from scattered disclosures. Empirical data underscores consumer harm: a 2019 analysis of 11,000 shopping sites detected 1,818 dark pattern instances, with subscription and pricing tricks prevalent on 11% of platforms, correlating to unintended enrollments and payment disputes. Surveys indicate 63% of affected users report financial losses from such patterns, alongside eroded trust in digital commerce, as deceptive experiences condition habitual underestimation of costs. Regulatory bodies like the FTC classify these as unfair practices under Section 5 of the FTC Act, emphasizing that while businesses may justify them via A/B testing for conversion uplift, the causal chain—from obscured information to coerced payments—imposes externalities like increased chargebacks and regulatory scrutiny without commensurate long-term value.

Privacy and Consent Manipulations

Dark patterns in privacy and consent manipulations involve user interface designs that exploit cognitive vulnerabilities to elicit unintended data disclosures or permissions, often by obscuring options or defaulting to invasive settings. These tactics prioritize service providers' data collection over user privacy, such as through pre-selected checkboxes for tracking or asymmetrical button placements in consent dialogs where "accept all" is prominent while "reject" requires additional steps.
A common implementation is in cookie consent banners, where empirical analysis of over 11,000 websites revealed that 11% used deceptive elements like obstructed rejection mechanisms or misleading language to inflate consent rates, with one study finding such manipulations boosted acceptance by up to 17% compared to neutral designs. In mobile apps, privacy notices often employ "nagging" patterns, repeatedly prompting users for permissions after initial denials, which a 2025 experiment demonstrated independently increases eventual consent by eroding user resistance through persistence rather than information provision. Further evidence from GDPR-era audits shows 99% of sampled news outlet consent notices incorporated dark patterns, including forced scrolling or bundled consents that conflate essential and non-essential tracking, undermining the regulation's requirement for granular, informed choice. These practices causally link to privacy harms by reducing consent efficacy; for instance, a transdisciplinary review identified vulnerability factors like low digital literacy amplifying consent coercion in 80.9% of binary-option site notices examined. Regulatory scrutiny, such as data protection authorities' analyses, highlights how these manipulations deceive users into surrendering personal data, with field experiments confirming that default opt-ins paired with obfuscated customization interfaces yield 22-49% higher disclosure rates than transparent alternatives. Despite claims of user benefit from personalized services, causal evidence from controlled studies attributes elevated disclosure primarily to interface manipulation rather than genuine preference shifts, revealing a disconnect between stated concerns and behavioral outcomes known as the privacy paradox.
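The rejection friction that GDPR-era audits flag can be made measurable as a simple interaction count. The sketch below models a banner's two outcomes as step sequences and scores the asymmetry between them; the bannerFlow structure and clickAsymmetry name are hypothetical constructs for illustration, not a standard auditing API.

```javascript
// Toy model of a consent banner's choice architecture (all names hypothetical).
// Each outcome is reachable via a sequence of UI steps; the asymmetry score
// is how many more interactions rejection costs than acceptance.

const bannerFlow = {
  acceptAll: ["click 'Accept all'"], // one prominent click
  rejectAll: [                       // buried behind a settings detour
    "click 'Manage options'",
    "scroll to purposes list",
    "untick each pre-enabled purpose",
    "click 'Confirm choices'",
  ],
};

function clickAsymmetry(flow) {
  return flow.rejectAll.length - flow.acceptAll.length;
}

// A score of 0 indicates symmetric effort; positive values capture the
// kind of rejection friction described in the consent-banner studies above.
```

Framing the pattern as a number like this is roughly how audits compare banners at scale: symmetric designs score zero, while detour-style flows accumulate steps on the rejection path only.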

Interface and Choice Distortions

Interface distortions in dark patterns encompass manipulations that alter the visual or functional cues of user interfaces to mislead interactions, such as disguising promotional elements as essential content or mimicking familiar controls to elicit unintended actions. In their 2018 taxonomy, Gray, Kou, Battles, Hoggatt, and Toombs classify these under "interface interference," where patterns like disguised ads present sponsored material indistinguishable from organic results, tricking users into clicks or engagements they would otherwise avoid. For instance, a 2021 analysis of user-submitted examples identified masquerading patterns, where fake security prompts or altered button appearances exploit familiarity with standard interface conventions, reducing deliberate scrutiny. Choice distortions, a related category, operate by asymmetrically framing options, often through spatial hierarchy, labeling, or accessibility barriers that favor the provider's preferred path. These draw from choice-architecture principles but subvert them deceptively, as noted in a 2023 study on consent banners, where opt-in defaults are enlarged and opt-out links minimized or obscured, leading to higher unintended consents—up to 49% in manipulated layouts versus 12% in neutral ones across tested sites. Empirical testing in a 2022 FTC-affiliated study on consent modalities found that choice distortions, such as bundling privacy-invasive options with mandatory features, increased data-sharing rates by 23-37% compared to balanced presentations, with users reporting post-hoc regret in 41% of cases. Such distortions exploit cognitive heuristics like visual salience and default bias, as evidenced by a 2024 ontology of dark patterns that maps over 150 instances, revealing visual alterations in 28% of cases distorting the perceived prominence of options, particularly in subscription flows where "confirm" buttons dwarf "cancel" equivalents. A cross-platform audit in the same framework quantified choice asymmetry in 62% of analyzed apps, correlating with 15-20% elevated retention of unwanted subscriptions, based on data from 10,000+ user sessions.
Regulatory scrutiny, including EU DSA guidelines, highlights these as manipulative when they foreseeably impair rational decision-making, with enforcement data from 2023 showing fines in 17 cases tied to distorted interfaces. While proponents argue these designs enhance efficiency, research counters that they erode trust, with a CHI study finding exposed distortions reduced platform loyalty by 18% in follow-up surveys of 500 participants. In consent contexts, choice distortions like nagging confirmations—repeating prompts until compliance—yielded only 7% genuine opt-ins in controlled experiments, versus 34% voluntary, underscoring manufactured compliance over informed choice.

Business Incentives and Economic Rationale

Short-Term Revenue and Engagement Benefits

Dark patterns can drive immediate increases in user actions that directly contribute to revenue, such as higher subscription rates and purchase completions. In a controlled experiment involving 1,018 participants, exposure to mild dark patterns—such as disguised ads prompting sign-ups—increased the likelihood of subscribing to a service by more than twofold compared to neutral interfaces, with 15.7% of exposed users subscribing versus 7.3% in the control group. This effect stems from manipulative elements like urgency cues or obscured options, which exploit cognitive biases to boost conversion rates in the short term. In e-commerce contexts, patterns like hidden fees or forced bundling have been observed to elevate sales volumes by nudging users toward unintended add-ons or larger orders. Analysis of over 11,000 websites revealed widespread use of such tactics to encourage additional purchases and data disclosure, correlating with proprietary business metrics favoring short-term gains over user autonomy. Similarly, obstruction patterns, such as roach motels (easy entry, difficult exit), sustain engagement by complicating cancellations, thereby extending subscription durations and recurring revenue streams temporarily. These tactics also enhance engagement metrics like time-on-site and interaction frequency, which platforms monetize through advertising or data sales. For instance, confirmshaming—guilting users into compliance—has been linked to higher click-through rates on promotional content, amplifying ad revenue in the immediate aftermath of user exposure. Businesses deploy these patterns because internal testing often demonstrates measurable uplifts in key performance indicators, such as a 10-20% rise in opt-ins for newsletters or premium features, before reputational backlash accumulates.
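The internal testing mentioned above is typically a two-proportion A/B comparison. As a sketch, the function below computes the relative uplift and a standard pooled z-statistic for a control versus a dark-pattern variant; the abComparison name and the example counts are illustrative assumptions, chosen to roughly echo the 7.3% versus 15.7% result cited earlier.

```javascript
// Minimal two-proportion A/B comparison, the kind of internal metric that
// makes a dark pattern look "effective" in short-term testing.
// Uses the standard pooled z-test formula; all names are illustrative.

function abComparison(convA, nA, convB, nB) {
  const pA = convA / nA;              // control conversion rate
  const pB = convB / nB;              // dark-pattern variant rate
  const pooled = (convA + convB) / (nA + nB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / nA + 1 / nB));
  return {
    liftPct: ((pB - pA) / pA) * 100,  // relative uplift of B over A
    z: (pB - pA) / se,                // pooled z-statistic
  };
}

// e.g. 73/1000 control vs 157/1000 variant conversions:
// abComparison(73, 1000, 157, 1000) reports a lift above 100%
// with a z-statistic well past conventional significance thresholds.
```

The limitation this section goes on to describe is visible in the metric itself: the lift captures only the immediate conversion delta, not downstream regret, chargebacks, or churn.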

Long-Term Risks and Market Dynamics

The deployment of dark patterns, while boosting immediate metrics like conversion rates, incurs substantial long-term risks for businesses, primarily through the erosion of consumer trust and subsequent loyalty decline. Empirical analyses reveal that users exposed to manipulative interfaces report diminished trust in the provider, with one study finding that 63% of participants in deceptive UX scenarios expressed intent to abandon the service post-interaction, compared to 12% in transparent designs. This trust deficit cascades into measurable churn, as evidenced by a 2024 investigation showing firms reliant on such tactics experienced 20-30% higher customer attrition over 12-month periods relative to ethical counterparts. Reputational harm amplifies these effects, fostering widespread backlash that can precipitate boycotts or negative word-of-mouth amplification via social channels. For instance, the U.S. Federal Trade Commission's 2022 report on dark patterns highlighted cases where companies faced public scrutiny and litigation after patterns like disguised subscriptions led to consumer complaints surging by factors of 5-10 times baseline levels. The 2023 FTC lawsuit against Amazon alleged such practices in its Prime cancellation flows trapped users, resulting in ongoing reputational scrutiny and potential multibillion-dollar penalties, underscoring how initial gains evaporate amid sustained adversarial sentiment. In market dynamics, pervasive dark pattern adoption distorts competition by entrenching incumbents with scale advantages in behavioral data, while disadvantaging transparent entrants and fostering a race to the bottom in ethical standards. OECD research from 2022 notes that without regulatory curbs, competitive pressures incentivize imitation of manipulative designs, reducing overall consumer welfare as consumers ration attention toward verified trustworthy actors, evidenced by a 15-25% premium in engagement for platforms audited for fairness in cross-firm comparisons.
Over time, heightened consumer awareness—driven by regulatory actions like the EU's Digital Services Act—shifts market dynamics toward ethical differentiation, with surveys indicating 81% of consumers in 2023 prioritizing trust signals in purchase decisions, thereby rewarding non-deceptive models and eroding market share for habitual offenders.

Empirical Evidence and Research Findings

Studies on Prevalence and User Impact

A 2022 behavioral study commissioned by the European Commission found that 97% of the most popular websites and applications used by consumers in the EU deployed at least one dark pattern, often involving hidden information, emotional manipulation, or continuous prompts. In contrast, an automated crawl of 11,000 shopping websites identified dark pattern instances on 11.1% of sites, with 1,818 total occurrences; low-stock messages appeared on 5.3% of sites, countdown timers on 3.3%, and confirmshaming on 1.5%. A 2020 analysis of 240 mobile applications revealed that 95% contained at least one dark pattern, with popular apps averaging 7.4 instances each. Studies on consent mechanisms post-GDPR further highlight prevalence in consent interfaces: Nouwens et al. scraped 10,000+ websites and determined that only 11.8% of consent pop-ups met minimal legal requirements for granular choice without dark patterns like disguised options or pre-ticked boxes, indicating widespread use of manipulative designs to imply consent. Experimental research demonstrates tangible user impacts from dark patterns. In subscription interfaces, defaulting to opt-out rather than opt-in increased subscription rates from 35.2% to 48.9%, a 13.7 percentage-point rise attributable to the pattern's friction reduction on undesired actions. Similarly, experiments showed that low-stock messages and countdown timers significantly altered product selection, with exposed users 20-30% more likely to choose higher-priced or urgent options due to induced scarcity perceptions. Field tests on consent pop-ups found that designs with dark patterns, such as pre-selected "accept all" buttons, raised full consent rates by up to 46% compared to granular interfaces, exploiting user tendencies toward defaults and fatigue. These effects persist across demographics, though less tech-savvy users exhibit heightened susceptibility to manipulation via misdirection or hidden costs.

Debates Over Effectiveness and Measurable Harm

Empirical studies demonstrate that dark patterns significantly boost short-term user actions favoring businesses, such as increased subscription rates and data disclosure, though debates persist on their net effectiveness amid potential backlash. In large-scale experiments involving over 1.4 million visitors, confirmshaming tactics—where refusal options are framed with shaming language—raised sharing rates from 0.3% to 11.3% compared to neutral options. Similarly, disguised ads mimicking news articles increased click-through rates by up to 226% over standard formats. These findings indicate dark patterns exploit cognitive biases like loss aversion and social proof, yielding measurable conversion lifts, but critics argue such gains may erode over time due to user detection and reduced trust, with limited longitudinal data to quantify backlash. Quantifying harm remains contentious, with evidence of tangible user detriments contrasted by challenges in isolating causal effects and distinguishing subjective annoyance from objective loss. Experimental exposures to aggressive patterns, such as forced continuity in subscriptions, made users over four times more likely to sign up for fictitious services they later regretted, leading to unintended financial commitments averaging $10–$20 per instance in simulated scenarios. Privacy manipulations, like nudging via default opt-ins, have been shown to elevate disclosure rates by 20–50% in A/B tests across consent banners, correlating with heightened data exposure risks without proportional user benefit. Post-exposure surveys consistently report diminished trust in platforms, with 70–80% of participants expressing skepticism toward affected interfaces, potentially amplifying collective harms like market-wide privacy erosion. However, not all patterns inflict equivalent damage; subtler ones like visual emphasis of preferred options may merely influence choices without deceiving, prompting debate on whether regulatory focus overemphasizes patterns absent verifiable deception or economic loss.
A review of 42 dark pattern variants across disciplines affirms uniform negative impacts on user autonomy and welfare, with no peer-reviewed counterevidence suggesting neutral or positive user outcomes, though measurement gaps persist for non-material harms like psychological distress. Vulnerability analyses reveal broad susceptibility rather than confinement to demographics like age or income, undermining claims of narrowly targeted exploitation but highlighting universal behavioral overrides that challenge rational choice models. Proponents of minimal regulation contend that self-correction via user education or market competition suffices, citing insufficient evidence of systemic harm, yet experiments indicate self-help tools fail to mitigate pattern-induced decisions in 60–90% of cases. Overall, while effectiveness is empirically robust, debates hinge on weighting immediate manipulations against elusive long-term equilibria, with calls for standardized metrics like regret rates or churn correlations to resolve ambiguities.

United States Enforcement and Legislation

The Federal Trade Commission (FTC) has primarily enforced actions against dark patterns under Section 5 of the FTC Act, which prohibits unfair or deceptive acts or practices in commerce. The agency defines dark patterns as design practices that trick or manipulate users into decisions they would not otherwise make, often leading to enforcement in subscription traps and privacy manipulations. In September 2022, the FTC released a report documenting the rise of sophisticated dark patterns, such as disguised ads and hidden costs, based on a review of over 50 websites and apps, highlighting their prevalence in e-commerce and subscription services. Notable FTC enforcement includes a 2023 complaint against Amazon for using dark patterns in its Prime subscription renewal process, where confirmatory screens and confirm-shaming tactics allegedly made cancellation more difficult than enrollment, resulting in over $1.7 billion in retained revenue from unintended renewals. Similarly, in June 2024, the FTC sued Adobe and its executives for "deceptive subscription models" involving dark patterns like hidden early termination fees and misleading cancellation flows, seeking civil penalties and injunctions. Earlier cases, such as a 2015 action over billing tricks and a 2022 action against Epic Games for Fortnite's payment flows that encouraged unintended purchases by minors, established precedents for treating manipulative interfaces as unfair practices. In July 2024, the FTC collaborated with the international networks ICPEN and GPEN to review dark patterns in subscription services and privacy notices, finding widespread issues like buried cancellation options across 20 jurisdictions, though U.S.-specific findings emphasized harm from privacy manipulations. At the state level, privacy laws increasingly address dark patterns. California's Privacy Rights Act (CPRA), effective January 2023, prohibits controllers from using dark patterns to obtain consent for data sales or processing, requiring interfaces that do not impair user autonomy.
Colorado's Privacy Act and Virginia's Consumer Data Protection Act similarly ban dark patterns that subvert opt-out choices, with enforcement by state attorneys general; other state privacy laws follow suit by referencing manipulative designs in consent mechanisms. Federal legislation remains limited, relying on agency rulemaking rather than statutes specifically targeting dark patterns. The FTC's 2024 "Click-to-Cancel" rule aimed to mandate easy subscription cancellations but was vacated by a federal appeals court in July 2025 for exceeding statutory authority under the FTC Act. Proposed bills include the bipartisan DETOUR Act, reintroduced in July 2023 by Senators Warner and Fischer, which would ban "destructive and deceptive" dark patterns in online interfaces by classifying them as unfair practices, with civil penalties up to $50,000 per violation, though it has not advanced beyond committee. Enforcement thus continues on a case-by-case basis, with critics noting the FTC's broad interpretation of Section 5 may deter innovation without clear statutory guidelines.

European Union Directives and DSA

The Unfair Commercial Practices Directive (UCPD; Directive 2005/29/EC), adopted on 11 May 2005, establishes a general prohibition on unfair business-to-consumer commercial practices across EU member states. It targets practices that mislead or unduly influence consumers, including those akin to dark patterns such as misleading actions under Article 6 (e.g., false claims about product attributes, prices, or urgency that deceive the average consumer) and misleading omissions under Article 7 (e.g., obscuring or untimely provision of material information like total costs or contract terms). Aggressive practices under Articles 8 and 9, which impair consumer freedom through harassment, coercion, or undue influence, further encompass manipulative interfaces that exploit vulnerabilities, such as persistent prompts or lock-in mechanisms. Member states must transpose and enforce these provisions, with national authorities assessing patterns based on their material distortion of economic behavior, though the directive lacks an explicit "dark patterns" term, leading to interpretive application in digital contexts. Complementing the UCPD, the Digital Services Act (DSA; Regulation (EU) 2022/2065), adopted on 19 October 2022 and entering into force on 16 November 2022, explicitly prohibits dark patterns for online platforms to prevent behavioral manipulation. Article 25 mandates that providers shall not design, organize, or operate interfaces—such as recommender systems or default settings—in ways that materially distort or impair users' ability to make free and informed decisions, with examples including non-neutral presentation of choices, repeated solicitations of prior decisions, or making termination significantly harder than initiation. This applies generally from 17 February 2024, with very large online platforms subject to obligations earlier, from late August 2023, excluding practices already regulated under the UCPD or GDPR to avoid overlap.
Non-compliance risks fines up to 6% of global annual turnover for systemic issues, enforced by the European Commission for major platforms and national Digital Services Coordinators for others, aiming to foster transparent digital environments without stifling legitimate design. Amendments via the Omnibus Directive (Directive (EU) 2019/2161, as revised) strengthen UCPD enforcement against digital manipulations, including explicit bans on dark patterns in contexts like distance financial contracts under Article 16(e), promoting harmonized consumer protections amid evolving online tactics. These frameworks collectively prioritize prevention of harm over subjective intent, though critics note potential overbreadth in classifying nudges as manipulative without proven distortion.

Global and Emerging Jurisdictional Approaches

In 2022, the Organisation for Economic Co-operation and Development (OECD) published a report on dark commercial patterns, proposing a working definition: digital practices that subvert autonomy by presenting choices in manipulative ways, often leading to unintended expenditures of money, data, or time. The report documented their prevalence across e-commerce and subscription services, citing empirical evidence from studies showing effectiveness in nudging behaviors like hidden costs or disguised ads, and recommended that jurisdictions enhance enforcement tools under existing frameworks while developing targeted guidelines. The United Kingdom's Competition and Markets Authority (CMA) and Information Commissioner's Office (ICO) issued a joint position paper in August 2023 on harmful online choice architecture, identifying practices such as disguising data collection, forcing actions, or creating false urgency as violations of laws like the Consumer Protection from Unfair Trading Regulations 2008. These regulators have pursued investigations, including CMA actions against online platforms for subscription traps, and newer legislation expands regulators' powers to prohibit exploitative design patterns that risk user harm. India's Central Consumer Protection Authority (CCPA) promulgated the Guidelines for Prevention and Regulation of Dark Patterns on November 30, 2023, defining them as deceptive UI/UX practices and prohibiting 13 categories, including basket sneaking (adding items without consent), confirmshaming (guilt-inducing messages for non-purchase), and subscription traps (difficult cancellations). Platforms must conduct self-audits for compliance, with giants like Flipkart initiating reviews in 2024, backed by penalties under the Consumer Protection Act, 2019 of up to ₹50 lakh for first offenses. South Korea amended its Act on Consumer Protection in Electronic Commerce, effective February 15, 2025, to ban six specific dark patterns: hidden subscription renewals, gradual price escalations, drip pricing (revealing costs late), forced bundling, confirmshaming, and interface interference (e.g., nagging prompts).
The Korea Fair Trade Commission (KFTC) updated interpretive guidelines in 2025, mandating 30-day notices for trial-to-paid conversions and enabling fines or business suspensions, with enforcement intensified via roundtables and monitoring of platforms. The Australian Competition and Consumer Commission (ACCC) has flagged dark patterns under existing Australian Consumer Law but lacks a dedicated ban, prompting consultations in 2024 for an economy-wide prohibition on unfair trading practices, including manipulative interfaces like false scarcity. The government endorsed this in principle by December 2024, aiming for legislation in 2025 to address gaps in digital platforms, following ACCC reports on deceptive designs in sectors such as dating apps. Canada's federal and provincial privacy commissioners adopted a joint resolution on November 13, 2024 to combat deceptive design patterns (DDPs) that undermine consent under laws like PIPEDA, citing a 2024 Office of the Privacy Commissioner sweep that found forced actions and privacy zuckering in roughly 75% of reviewed sites. While no standalone law exists, regulators advocate privacy-by-design mandates and potential amendments to treat DDPs as unfair practices, with ongoing sweeps emphasizing harms in app interfaces. Brazil's Lei Geral de Proteção de Dados (LGPD), enforced since 2020, implicitly restricts dark patterns by requiring free, informed, and unambiguous consent without manipulation, with guidance from the Autoridade Nacional de Proteção de Dados (ANPD) stressing easy revocation to avoid coercive designs. However, enforcement remains privacy-centric rather than comprehensive consumer protection, with calls for explicit prohibitions amid rising data breaches linked to manipulative collection.
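Rules like the 30-day advance-notice requirement for trial-to-paid conversions reduce to simple date arithmetic that a compliance system can enforce automatically. The sketch below is a hypothetical illustration of that check; the 30-day figure comes from the KFTC guidance described above, but the function names and interface are assumptions, not any regulator's specification.

```python
# Hypothetical compliance helper: verify that a subscription-conversion
# notice gives the user at least 30 days' warning, per the advance-notice
# rule discussed above. All names here are illustrative assumptions.
from datetime import date, timedelta

NOTICE_DAYS = 30  # minimum advance-notice window before a trial converts to paid

def latest_notice_date(conversion: date) -> date:
    """Last day a notice may be sent and still give the full window."""
    return conversion - timedelta(days=NOTICE_DAYS)

def notice_is_timely(sent: date, conversion: date) -> bool:
    """True if the notice was sent at least NOTICE_DAYS before conversion."""
    return sent <= latest_notice_date(conversion)
```

For a conversion on 31 March 2025, `latest_notice_date` yields 1 March 2025, so a notice sent on 20 February is timely while one sent on 10 March is not.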

Controversies and Alternative Perspectives

Subjectivity in Classification and Overreach

The classification of user interface designs as dark patterns often involves subjective judgments, as definitions rely on interpreting intent, user autonomy, and potential harm, which vary across researchers and regulators. For instance, a 2024 systematic review identified 42 distinct types of dark patterns but noted inconsistencies in categorization, with overlapping terms like "misleading omission" and "undue influence" applied differently based on contextual interpretations rather than uniform criteria. This variability stems from the absence of a universally agreed-upon threshold distinguishing aggressive persuasion—such as preselected opt-ins that users can easily uncheck—from outright deception, leading to debates over whether common features like autoplay videos qualify as manipulative. Critics argue that such subjectivity enables hindsight bias, where designs are retroactively labeled dark based on user regret rather than measurable deception at the point of interaction. Regulatory overreach arises when broad interpretations expand enforcement beyond verifiable harm, potentially misclassifying legitimate business practices that enhance user engagement without coercion. The U.S. Federal Trade Commission (FTC), for example, has cited autoplay features in streaming services as dark patterns for allegedly exploiting cognitive biases, yet provides ambiguous guidance that fails to distinguish between user-preferred conveniences and exploitative tactics, as evidenced in its 2022 enforcement actions. In a 2023 settlement with Credit Karma, the FTC labeled personalized credit recommendations as dark patterns for nudging users toward applications, despite no evidence of false claims or hidden fees, prompting accusations of overreach that prioritize paternalism over consumer choice. Think tanks have warned that this approach risks a chilling effect on design innovation, as firms may avoid dynamic interfaces to evade subjective scrutiny, ultimately reducing options for users who value streamlined experiences.
Empirical critiques emphasize that overclassification ignores cases where "dark" elements decline naturally under market pressures, such as GDPR compliance reducing manipulative consent prompts without additional bans. Academic sources highlight systemic biases in classification, particularly from institutions favoring interventionist views that undervalue consumer agency in favor of assumed vulnerability. Studies from design ethics frameworks propose objective tests—such as whether a design violates explicit user expectations and disproportionately benefits providers—but note that prevailing taxonomies often incorporate normative assumptions about "fairness" without causal evidence of net harm. For controversial claims of overreach, multiple analyses converge on the risk of conflating ethical design debates with legal prohibitions, as seen in workshops where panelists acknowledged the need for empirical validation beyond anecdotal user complaints. This underscores a tension: while clear deceptions warrant scrutiny, expansive labels may erode market dynamics by equating persuasion with predation, absent rigorous proof of widespread detriment.

User Agency Versus Paternalistic Interventions

Critics of regulatory interventions against dark patterns argue that such measures embody paternalism by presuming users lack the capacity for informed decision-making, thereby undermining genuine user agency rather than enhancing it. For instance, the Information Technology and Innovation Foundation (ITIF) contends that labeling common interface designs as "dark patterns" excuses consumer choices, such as consenting to data collection, by attributing them to manipulation instead of personal responsibility. This perspective holds that adults, absent evidence of fraud, should bear accountability for their actions in digital environments, with overbroad regulations creating a chilling effect on experimentation in interface design. Empirical support for restraint includes the U.S. Federal Trade Commission's (FTC) enforcement history, which has yielded only a handful of cases—such as settlements with Credit Karma ($3 million in 2022), Vonage ($100 million in 2022), and Epic Games ($245 million in 2022)—primarily tied to explicit deceptions rather than subtle UI elements, suggesting widespread harm from design alone remains unproven. Proponents of intervention counter that dark patterns systematically subvert autonomy by exploiting cognitive biases, making paternalistic safeguards necessary to restore balanced choice architectures. Legal scholars note that these designs, such as disguised subscriptions or hidden opt-outs, coerce outcomes against users' interests, as evidenced in evaluations linking them to eroded trust in digital markets. However, even supportive frameworks acknowledge risks of overreach; for example, a comparative analysis of EU and U.S. approaches advocates "empowerment-based" regulation—providing users with detection tools or standardized disclosures—over outright bans, to avoid curtailing legitimate nudges that align with user preferences. This hybrid view posits that while extreme manipulations warrant scrutiny, distinguishing them requires evidence of substantial impairment, not subjective classification, to prevent regulation from inadvertently limiting beneficial innovations like streamlined sign-ups.
The tension reflects broader causal dynamics: unchecked dark patterns may amplify short-term firm gains at the expense of long-term market trust, yet heavy-handed rules risk fostering dependency on regulators, reducing incentives for users to develop digital literacy. Free-market advocates emphasize that competition drives transparent designs, as firms prioritizing retention through deception face backlash, with iterative user testing—standard since the early 2000s—enabling improvements without state oversight. In contrast, the behavioral-economics literature, drawing from "libertarian paternalism," suggests light-touch defaults can preserve agency while curbing excesses, though critics warn this blurs into coercive oversight when applied unevenly. Ultimately, verifiable metrics of harm—such as cancellation failure rates or subscription churn data—should guide interventions, prioritizing user tools like browser extensions for pattern detection over blanket prohibitions that may stifle adaptive markets.
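The detection tools mentioned above typically rely on heuristics over page markup. The following minimal sketch shows the flavor of such a check—flagging pre-checked consent boxes, false-urgency text, and confirmshaming copy; the phrase lists and signal names are illustrative assumptions, not any extension's or regulator's actual criteria.

```python
# Hypothetical heuristic scanner in the style of a browser-extension
# dark-pattern detector. Keyword lists below are illustrative assumptions.
from html.parser import HTMLParser

URGENCY_PHRASES = ("only 1 left", "hurry", "offer ends")          # false urgency
SHAMING_PHRASES = ("no thanks, i hate", "i don't want to save")    # confirmshaming

class DarkPatternScanner(HTMLParser):
    def __init__(self):
        super().__init__()
        self.findings = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # A checkbox rendered already ticked suggests a pre-checked opt-in.
        if tag == "input" and attrs.get("type") == "checkbox" and "checked" in attrs:
            self.findings.append("pre-checked checkbox")

    def handle_data(self, data):
        text = data.lower()
        if any(p in text for p in URGENCY_PHRASES):
            self.findings.append("false urgency")
        if any(p in text for p in SHAMING_PHRASES):
            self.findings.append("confirmshaming")

def scan(html: str) -> list[str]:
    """Return the list of heuristic signals found in an HTML fragment."""
    scanner = DarkPatternScanner()
    scanner.feed(html)
    return scanner.findings
```

For example, `scan('<input type="checkbox" checked><p>Only 1 left!</p>')` flags both a pre-checked checkbox and false urgency. Real detectors such as academic prototypes combine computer vision and richer language matching, but the same signal-list structure underlies them.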

Implications for Innovation and Free Markets

Dark patterns can undermine free-market dynamics by coercing consumer decisions, thereby elevating switching costs and fostering lock-in effects that diminish competition. For instance, practices such as subscription traps or disguised ads prevent effective price comparisons and choice evaluation, transferring wealth to incumbents without enhancing product merit and potentially excluding rivals from the market. This distortion arises from exploiting cognitive biases rather than competing on quality, leading to reduced consumer welfare and inefficient allocation in digital marketplaces. Critics of expansive regulation contend that it imposes disproportionate compliance burdens on smaller firms and startups, which lack the resources of large platforms to navigate vague prohibitions, thereby entrenching market dominance and stifling entry-level innovation. Standardization of user interfaces, as proposed in some regulatory frameworks, risks homogenizing designs and curtailing brand differentiation, which are essential for competitive experimentation in digital markets. Empirical observations indicate that while dark patterns yield short-term revenue boosts—such as through forced continuity—they erode long-term trust, with studies showing diminished user engagement and loyalty that competitors can exploit by offering transparent alternatives. In free markets, reputational mechanisms and competitive pressures theoretically discipline manipulative practices, as evidenced by platforms introducing digital wellness tools in response to rivals' addictive designs, thereby spurring innovation toward non-manipulative interfaces. However, persistent prevalence—documented in analyses of rising sophistication since 2022—suggests information asymmetries and scale advantages enable evasion, necessitating targeted antitrust scrutiny over broad bans to preserve incentives for genuine design advancements. Case-by-case enforcement under existing law, such as the EU's Unfair Commercial Practices Directive (2005) or U.S. Sherman Act provisions, balances consumer protection with market vitality by distinguishing deception from permissible persuasion.

References

  1. [1]
    Bringing Dark Patterns to Light - Harry Brignull
    Jun 6, 2021 · A Dark Pattern is a manipulative or deceptive trick in software that gets users to complete an action that they would not otherwise have done.
  2. [2]
    Deceptive Patterns in UX: How to Recognize and Avoid Them - NN/G
    Dec 1, 2023 · Deceptive patterns are designs that force the user to take an action that is not in their best interest. They are prolific on the web because they are ...
  3. [3]
    How to Spot—and Avoid—Dark Patterns on the Web - WIRED
    Jul 29, 2020 · The term “dark patterns” was first coined by UX specialist Harry Brignull to describe the ways in which software can subtly trick users into doing things they ...
  4. [4]
    Deceptive Patterns (aka Dark Patterns) - spreading awareness since ...
    Deceptive patterns (also known as “dark patterns”) are tricks used in websites and apps that make you do things that you didn't mean to, like buying or signing ...Types · Hall of Shame · Legal Cases · Laws
  5. [5]
    Shining a Light on Dark Patterns | Journal of Legal Analysis
    Mar 23, 2021 · This article provides the first public evidence of the power of dark patterns. It discusses the results of the authors' two large-scale experiments.TESTING DARK PATTERNS... · ARE DARK PATTERNS... · Footnotes · References
  6. [6]
    New Research Finds Online Consumers Are Falling Victim to Dark ...
    Dec 13, 2023 · Dovetail's research shows that dark website patterns are undoubtedly causing financial loss for customers. Nearly 63% of the survey respondents ...
  7. [7]
    Impact of dark patterns on consumers' perceived fairness and attitude
    This study offers comprehensive knowledge regarding dark patterns tactics and how they influence consumers' perceived fairness and their attitude toward OTAs.<|separator|>
  8. [8]
    Dark Patterns: Protecting consumers without hindering innovation
    Brignull defines “dark patterns” as “tricks used in websites and apps that make you do things that you didn't mean to.” When it comes to defining the concept of ...
  9. [9]
    [PDF] Regulating dark patterns in the EU: Towards digital fairness
    The EU regulatory framework against dark patterns is fragmented and lacks a unified legal definition. This can lead to legal uncertainty and inconsistent ...
  10. [10]
    Digital Fairness Act Unpacked: Dark Patterns - Osborne Clarke
    Jul 31, 2025 · Data Act: Recital 38 mentions the prohibition of the use of dark patterns in relation to third parties and data holders when they are designing ...
  11. [11]
    A systematic literature review on dark patterns for the legal community
    We identify 42 types of dark patterns. All of them can be classified as: misleading omission; misleading action; harassment; undue influence; coercion. This ...
  12. [12]
    A Comprehensive Study on Dark Patterns - arXiv
    Dec 12, 2024 · This paper addresses three main challenges in dark pattern research: inconsistencies and incompleteness in classification, limitations of detection tools,
  13. [13]
    [PDF] What Makes a Dark Pattern... Dark? - arXiv
    Jan 13, 2021 · 2.1 Defining Dark Patterns. In 2010, when Brignull first introduced the term “dark patterns” on the website darkpatterns.org [3], he described ...
  14. [14]
    What are dark patterns ? Interview of Harry Brignull, the inventor of ...
    “A few years ago I came up with the term “Dark Patterns”, at first I don't think it was a big deal. I wrote a talk for a conference and it needed a catchy name.
  15. [15]
    Do Dark Patterns Exist? - Rian Dutra
    Aug 28, 2023 · This term (Dark Patterns) was coined by Harry Brignull in 2010. He initially introduced the concept of ethically questionable design ...Missing: origin | Show results with:origin
  16. [16]
    Dark Patterns: Past, Present, and Future - ACM Queue
    May 17, 2020 · Deception and Manipulation in Retail. The retail industry has a long history of deceptive and manipulative practices that range on a spectrum ...
  17. [17]
    Dark Patterns: Past, Present, and Future - Communications of the ACM
    Sep 1, 2020 · The third trend—and the one that most directly evolved into dark patterns—is growth hacking. The best-known and arguably the earliest growth ...Introduction · From Growth Hacking to Dark... · Money, Data, Attention
  18. [18]
    Types of Deceptive Pattern
    Deceptive patterns, also known as 'dark patterns', are tricks used in websites and apps to make you do things you didn't intend to, such as comparison ...Disguised ads · Nagging · Comparison prevention · Confirmshaming
  19. [19]
    Dark Patterns | Business & Information Systems Engineering
    Dec 12, 2022 · In 2010, Harry Brignull first coined the term dark patterns, which refers to “tricks used in websites and apps that make you do things that ...
  20. [20]
    [PDF] Dark Patterns: Past, Present, and Future - Privacy + Security Academy
    Although they have recently burst into mainstream awareness, dark patterns are the result of three decades- long trends: one from the world of retail. ( ...
  21. [21]
    Dark patterns and consumer vulnerability | Behavioural Public Policy
    Feb 3, 2025 · Common examples of dark patterns include misleading statements such as 'Only 1 left!' (Exploding offers), the use of questions that trick people ...<|separator|>
  22. [22]
    The Intricate Relationship Between Cognitive Biases and Dark ...
    May 12, 2024 · We explore the interplay between cognitive biases and dark patterns to address this gap. To that end, we conducted four focus groups with experts.
  23. [23]
    [PDF] The Intricate Relationship Between Cognitive Biases and Dark ...
    May 12, 2024 · This paper synthesises research on autonomy and empowerment, cognitive biases, and dark patterns. First, we revisit contributions about users' ...
  24. [24]
    Cognitive biases, dark patterns, and the 'privacy paradox' - PubMed
    This essay highlights some of those cognitive biases - from hyperbolic discounting to the problem of overchoice - and discusses the ways in which platform ...Missing: exploited peer- reviewed studies
  25. [25]
    Cognitive biases, dark patterns, and the 'privacy paradox'
    This essay highlights some of those cognitive biases – from hyperbolic discounting to the problem of overchoice – and discusses the ways in which platform ...
  26. [26]
    [PDF] Dark Patterns at Scale: Findings from a Crawl of 11K Shopping ...
    Jul 1, 2019 · Dark patterns are user interface design choices that benefit an online service by coercing, steering, or deceiving.
  27. [27]
    None
    Summary of each segment:
  28. [28]
    Spooky Dark UX Patterns - CSS-Tricks
    Oct 27, 2016 · Dark patterns aren't generally considered to be merely annoying UX patterns but deliberate tricks to make the user do what the site wants rather ...
  29. [29]
    [PDF] Checking Websites' GDPR Consent Compliance for Marketing Emails
    Pre-checked checkbox (a_pre_checked): The cor- responding checkbox is already ticked by default. Forced checkbox (a_forced): It is required to tick the ...Missing: UI | Show results with:UI
  30. [30]
    [PDF] A Comparative Study of Dark Patterns Across Mobile and Web ...
    Dark patterns are user interface elements that can infuence a person's behavior against their intentions or best interests. Prior work identifed these ...
  31. [31]
    Unveiling the Tricks: Automated Detection of Dark Patterns in Mobile ...
    Oct 29, 2023 · We propose UIGuard, a knowledge-driven system that utilizes computer vision and natural language pattern matching to automatically detect a wide range of dark ...
  32. [32]
    FTC Report Shows Rise in Sophisticated Dark Patterns Designed to ...
    Sep 15, 2022 · Making it difficult to cancel subscriptions or charges: Another common dark pattern involves tricking someone into paying for goods or services ...
  33. [33]
    [PDF] ICPEN Dark Patterns in Subscription Services Sweep Public Report ...
    Jul 2, 2024 · Examples are adding new non-optional charges to the price right before completing a purchase (also known as drip-pricing) and automatically ...Missing: research | Show results with:research
  34. [34]
    FTC Takes Action Against Amazon for Enrolling Consumers in ...
    Jun 21, 2023 · In a complaint filed today, the FTC charges that Amazon has knowingly duped millions of consumers into unknowingly enrolling in Amazon Prime.
  35. [35]
    FTC Action Against Vonage Results in $100 Million to Customers ...
    Nov 3, 2022 · The FTC alleges that the company used dark patterns to make it difficult for consumers to cancel and often continued to illegally charge them ...Missing: deceptions | Show results with:deceptions
  36. [36]
    The effects of four e-commerce dark patterns - ScienceDirect
    consumers may overconsume by buying more than they need or buying ...
  37. [37]
    How companies use dark patterns to keep you subscribed
    Princeton University ran a study in 2019 where they scanned over 11,000 shopping sites and found 1,818 examples of dark patterns on 11% of the websites. The ...
  38. [38]
    Dark Patterns: Deceiving Customers and Eroding Trust - CX Today
    Jan 4, 2024 · The research shows that dark website patterns are undoubtedly causing financial loss for customers. Nearly 63% of the survey respondents said ...
  39. [39]
    [PDF] REGULATING PRIVACY DARK PATTERNS IN PRACTICE ...
    Scholars define dark patterns as user interface design choices that benefit an online service by coercing, manipulating, or deceiving users into making.
  40. [40]
    [PDF] Dark Patterns after the GDPR: Scraping Consent Pop-ups and ...
    This study provides an empirical basis for the necessary regulatory action to enforce the GDPR, in particular the possibility of focusing on the centralised, ...
  41. [41]
    Full article: No harm no foul: how harms caused by dark patterns are ...
    Feb 17, 2025 · This paper identifies the individual, collective, material and non-material harms deriving from dark patterns, dissecting the role that harms play in the ...
  42. [42]
    [PDF] Can Consumers Protect Themselves Against Privacy Dark Patterns?
    Sep 2, 2025 · Furthermore, our paper is also the first to show the independent efficacy of nagging dark patterns, which repeatedly pester consumers to consent ...
  43. [43]
    Mapping the empirical literature of the GDPR's (In-)effectiveness
    manually inspects 300 consent notices from the news outlets, concluding with 297 out of 300 websites using dark patterns when eliciting consent from their users ...<|separator|>
  44. [44]
    Who is vulnerable to deceptive design patterns? A transdisciplinary ...
    Dark patterns are a distortion of the UX and UI design strategies that are intended to make the interactions between humans and technologies easier, smoother ...
  45. [45]
    The FTC and the CPRA's Regulation of Dark Patterns in Cookie ...
    Based on the result of the empirical study, it appears that 80.9% of the cookie consent notices in Binary Options exhibit dark patterns, including ...
  46. [46]
    [PDF] Redress for Dark Patterns Privacy Harms? A Case Study on ...
    Nov 2, 2022 · In section 6, we briefly discuss how privacy dark patterns harms, GDPR consent requirements, and avenues for redress might be linked to ...
  47. [47]
    [PDF] Cognitive Biases, Dark Patterns, and the 'Privacy Paradox'
    It deserves consideration particularly because it reflects the latest research on how disclosure can be manipulated by cognitive biases and coercive design.
  48. [48]
    [PDF] Dark Patterns and the Legal Requirements of Consent Banners - Inria
    They built a classification of these dark patterns, dividing them in. 15 types and 7 categories, and a taxonomy of their characteristics. Finally, they made ...
  49. [49]
    [PDF] Towards a Preliminary Ontology of Dark Patterns Knowledge
    For instance, many pattern def- initions mixed characteristics of dark patterns (e.g., information hiding, manipulation of the choice architecture), impact on ...<|separator|>
  50. [50]
    [PDF] Towards a Preliminary Ontology of Dark Patterns Knowledge
    Dark patterns are deceptive design practices that subvert a user's ability to make informed choices in digital systems, used to extract profit and data.Missing: Boehme | Show results with:Boehme
  51. [51]
    What Makes a Dark Pattern... Dark? | Proceedings of the 2021 CHI ...
    May 7, 2021 · Dark patterns are problematic user interface designs, but there is no single definition, rather a set of related considerations.Missing: Boehme | Show results with:Boehme
  52. [52]
    [PDF] Shining a Light on Dark Patterns - Chicago Unbound
    Our hunch is that consumers are seeing so many dark patterns in the wild because the internal, proprietary research suggests dark patterns generate profits for ...
  53. [53]
    Dark Patterns at Scale: Findings from a Crawl of 11K Shopping ...
    Using these techniques, we study shopping websites, which often use dark patterns to influence users into making more purchases or disclosing more information ...Missing: evidence | Show results with:evidence
  54. [54]
    [PDF] dark commercial patterns | oecd
    For example, dark patterns that deceive consumers into giving up more data than desired (e.g. through hidden privacy- intrusive settings turned on by default) ...
  55. [55]
    [PDF] THE RISE OF DARK PATTERNS IN E-COMMERCE
    In cases of usage of Dark patterns, the economic benefit received by the consumer is often negative. You can lose money through direct losses such as unintended ...
  56. [56]
    [PDF] The impact of dark patterns on user trust and long-term engagement
    Mar 3, 2025 · The objective of this study is to identify the potential consequences of ethical and unethical UX design. The relevance of this research is ...
  57. [57]
    (PDF) Dark Patterns in User Experience Design: The Erosion of ...
    Apr 26, 2025 · This research explores the pervasive and ethically troubling phenomenon of dark patterns in user experience (UX) design.
  58. [58]
    The 'dark patterns' at the center of FTC's lawsuit against Amazon : NPR
    Sep 23, 2025 · NICKELSBURG: The FTC's job is to crack down on deceptive business practices. In the lawsuit, the FTC claims Amazon used dark patterns to trick ...
  59. [59]
    GDPR Dark Patterns: How They Undermine Compliance & Risk ...
    Feb 12, 2025 · Beyond financial penalties, businesses using dark patterns risk irreparable reputational damage. The 2023 Edelman Trust Barometer found that 81 ...<|separator|>
  60. [60]
    UI Dark Patterns and Where to Find Them: A Study on Mobile ...
    The results of the analysis show that 95% of the analyzed apps contain one or more forms of Dark Patterns and, on average, popular applications include at least ...
  61. [61]
    Dark Patterns after the GDPR: Scraping Consent Pop-ups ... - arXiv
    Jan 8, 2020 · We found that dark patterns and implied consent are ubiquitous; only 11.8% meet the minimal requirements that we set based on European law.
  62. [62]
    [PDF] Dark Patterns after the GDPR: Scraping Consent Pop-ups and ...
    Since the submission of this paper, two studies have been released that look specifically at the consent management platforms that have appeared in response to ...
  63. [63]
    Dark Patterns: Can Consumers Break Out?
    Jan 9, 2025 · Lior Strahilevitz explores whether public awareness is the key to overcoming this deceptive and increasingly common online marketing tactic.
  64. [64]
    [PDF] Survey of academic studies measuring the effect of dark patterns on ...
    Since 2019, researchers have performed studies with end users, testing and trying to quantify how different designs of consent banners influence users' decision ...
  65. [65]
    Not All Dark Patterns are Equally Harmful
    Jun 12, 2023 · The FTC discusses some of the effects that dark patterns can have and focuses on the more overtly harmful forms in its report. While other ...Missing: debates effectiveness
  66. [66]
    Can Consumers Protect Themselves Against Privacy Dark Patterns?
    Jan 7, 2025 · Our interdisciplinary paper provides experimental evidence showing that consumer self-help is unlikely to fix the dark patterns problem.
  67. [67]
    [PDF] Towards an Understanding of Dark Pattern Privacy Harms
    In this position paper, we discuss the challenges of articulating these harms, then outline a research agenda for empirically measuring the labor costs, or ...
  68. [68]
    What Hides in the Shadows: Deceptive Design of Dark Patterns
    Nov 4, 2022 · Some dark patterns may violate Section 5 of the FTC Act, which prohibits "unfair or deceptive acts or practices [UDAP] in or affecting commerce.
  69. [69]
    FTC Targets "Dark Patterns" in Actions Against Amazon and ...
    Aug 14, 2023 · The FTC's enforcement action against PCH, issued just days after the Amazon announcement, alleges PCH used dark patterns to trick consumers into ...
  70. [70]
    FTC Brings "Dark Patterns" Complaint Over Online Subscription and ...
    Jun 28, 2024 · The DOJ and FTC claim that dark patterns are “unfair or deceptive acts or practices” and thus violate Section 5 of the FTC Act and that they ...Missing: deceptions | Show results with:deceptions
  71. [71]
    [PDF] DARK PATTERNS DEFINED: EXAMINING FTC ENFORCEMENT ...
    This Article gives a brief overview of the FTC's enforcement actions against both Vonage and Epic Games, and then examines previous enforcement actions ...Missing: United | Show results with:United
  72. [72]
    FTC, ICPEN, GPEN Announce Results of Review of Use of Dark ...
    Jul 10, 2024 · The Federal Trade Commission and two international consumer protection networks announced the results of a review of selected websites and apps.
  73. [73]
    Illuminating Dark Patterns: US Regulators Crack Down on Deceptive ...
    Feb 17, 2024 · Currently, three of five US state privacy laws explicitly call out dark patterns. The California Privacy Rights Act (CPRA), the Colorado Privacy ...
  74. [74]
    How do the CPRA, CPA & VCDPA treat dark patterns? - Byte Back
    Mar 16, 2022 · In this article, we analyze how each of these laws treats dark patterns. The CPRA and CPA both prohibit use of dark patterns to obtain consumer consent.
  75. [75]
    Dark Patterns Lawsuits and the FTC's Click-to-Cancel Rule
    The FTC's dark-pattern crackdown continues even after the Click-to-Cancel rule was struck down. Learn how deceptive app design can violate consumer law.<|separator|>
  76. [76]
    Warner, Fischer Lead Bipartisan Reintroduction of Legislation to ...
    Senator Warner and Senator Fischer's DETOUR Act would put a stop to the destructive and deceptive use of dark patterns,” said Imran Ahmed, CEO ...
  77. [77]
    Forthcoming Litigation for Companies That Employ Dark Patterns
    First, it will outline the path of the Federal Trade Commission's (“FTC” or “Commission”) enforcement of dark patterns. The agency's allegations begin with the ...
  78. [78]
    Directive - 2005/29 - EN - Unfair Commercial Practices Directive - EUR-Lex
    **Summary of Unfair Commercial Practices Directive (2005/29/EC) on Dark Patterns:**
  79. [79]
    Unfair commercial practices directive - European Commission
    data-driven personalisation and dark patterns;; gaming practices;; consumer lock-in;; obligations in the travel and transport sector;; enforcement and penalties ...Objective of the directive · Guidance
  80. [80]
    L_2022277EN.01000101.xml
    Official Journal of the European Union L 277 (27 October 2022), containing Regulation (EU) 2022/2065 (Digital Services Act); Article 25 prohibits online interface designs that deceive or manipulate users ("dark patterns").
  81. [81]
  82. [82]
    Dark commercial patterns - OECD
    Oct 26, 2022 · This report proposes a working definition of dark commercial patterns, sets out evidence of their prevalence, effectiveness and harms, and ...
  83. [83]
    [PDF] Harmful design in digital markets: How Online Choice Architecture ...
    The CMA enforces a range of consumer protection law75 but, for the purposes of this joint position paper, the most relevant regulations are the Consumer ...
  84. [84]
    ICO and CMA: Harmful online design encourages consumers to ...
    Aug 9, 2023 · The Information Commissioner's Office (ICO) and Competition and Markets Authority (CMA) are calling for businesses to stop using harmful website designs.
  85. [85]
    Guidelines for Prevention and Regulation of Dark Patterns, 2023
    Dec 26, 2023 · The Guidelines define 'dark patterns' as deceptive patterns or practices deployed in the user interface or user experience (UI/UX) interactions.
  86. [86]
    Dark Pattern Rules in India: Lessons from Flipkart's Self-Audit
    Oct 8, 2025 · The guidelines identify 13 specific prohibited dark patterns: false urgency, basket sneaking, confirm shaming, forced action, subscription traps ...
  87. [87]
    Amendment to E-Commerce Act Strengthening Regulations on Dark ...
    Aug 4, 2025 · The amended E-Commerce Act sets forth obligations and prohibitions concerning six types of dark patterns: (i) hidden renewals, (ii) gradual ...
  88. [88]
  89. [89]
    [PDF] Unfair trading practices consultation - December 2024 - ACCC
    Dec 6, 2024 · The ACCC notes that “dark patterns” is a broad term used to denote a diverse range of conduct.
  90. [90]
    Australia expected to regulate dark patterns and other unfair trading ...
    Jan 13, 2025 · Australia expected to regulate dark patterns and other unfair trading practices in 2025. Co-authored by Vanessa Gore, Partner and Elise Ivory ...
  91. [91]
    Identifying and mitigating harms from privacy-related deceptive ...
    Nov 13, 2024 · Deceptive design patterns ( DDP s) are used on websites and mobile apps to influence, manipulate, or coerce users to make decisions that are not ...
  92. [92]
    Dark patterns
    Jun 12, 2024 · Some dark patterns might be considered an unfair or deceptive business practice because of the way they cause financial harm or prevent you from ...
  93. [93]
    General Personal Data Protection Act (LGPD) - Brazil - TrustArc
    Organizations must enable data subjects with consent to processing in a way that avoids use of dark patterns. Consents must easily be revoked. Individuals ...
  94. [94]
    Dark Patterns, Privacy and the LGPD - Kluwer Law Online
    The article argues that the LGPD does not offer enough protection against dark patterns in personal data collection, and that data protection laws need a ...
  95. [95]
    What makes a dark pattern... dark? design attributes, normative ...
    We then show how future research on dark patterns can go beyond subjective criticism of user interface designs and apply empirical methods grounded in normative ...
  96. [96]
    [PDF] EVALUATING THE FTC'S LIMITATIONS IN COMBATTING DARK ...
    The FTC's classification of auto-play features as dark patterns is a noteworthy example of ambiguous guidance. The agency states that the popular feature ...
  97. [97]
    The FTC's Efforts to Label Practices “Dark Patterns” Is an Attempt at ...
    Jan 4, 2023 · First, in September 2022, the FTC touted its $3 million settlement against Credit Karma as an example of its enforcement against dark patterns.
  98. [98]
    Dark Patterns: Why More Laws Won't Help - European Tech Alliance
    Apr 16, 2025 · The European Union (EU) has taken significant steps to tackle dark patterns – frequently understood as manipulative design tricks that nudge users into choices ...
  99. [99]
    Beyond Dark Patterns: A Concept-Based Framework for Ethical ...
    May 11, 2024 · The key idea is to define acceptable designs as instantiations of positive patterns, deviations from which are treated as “dark.” Our patterns ...
  100. [100]
    [PDF] Dark Patterns Workshop Transcript - Federal Trade Commission
    And what we see is that there is a set of six attributes, which together describe all the dark patterns that researchers have been curating in the taxonomies.
  101. [101]
    [PDF] Dark Patterns as Disloyal Design - Digital Repository @ Maurer Law
    Jul 7, 2025 · Lawmakers have started to regulate “dark patterns,” understood to be design practices meant to influence technology users' decisions through ...
  102. [102]
  103. [103]
  104. [104]
    No. 127: Towards an Empowerment-Based Regulation of Dark ...
    Dec 12, 2024 · Dark patterns or deceptive patterns can be defined as techniques of deception or manipulation of users through interfaces that substantially ...
  105. [105]
    Exploring End-User Empowerment Interventions for Dark Patterns in ...
    Feb 3, 2024 · DI4: Design dark pattern interventions with three strategies: interface design change, user flow adjustment, and behavioral outcome reflection.
  106. [106]
    [PDF] ARE DARK PATTERNS ANTICOMPETITIVE?
    Nov 5, 2020 · The argument is that online manipulation can so overcome free will that anticompetitive effects generate. Part IV addresses policy implications.
  107. [107]
    AI-Driven Dark Patterns: How Artificial Intelligence Is Supercharging ...
    Nov 17, 2024 · “Generative AI can supercharge dark patterns,” Potel-Saville explains. “You don't need AI to personalize interactions, but with AI, it's much easier to do it.”
  108. [108]
    Time to be cautious: Unveiling dark patterns and impact on customers
    Dec 7, 2024 · While dark patterns can lead to short-term gains in conversions or data collection, they risk alienating customers in the long run. In a digital ...