
Persuasive technology

Persuasive technology, also termed captology—an acronym for computers as persuasive technologies—encompasses the intentional design of interactive computing systems, devices, or applications to alter individuals' attitudes or behaviors in predetermined, nontrivial ways without employing coercion or deception. The field, pioneered by Stanford researcher B.J. Fogg in the late 1990s, draws on principles from social psychology and human-computer interaction to leverage technology's roles as tools for enhancing capability, media for simulating experiences, or social actors mimicking interpersonal influence. Fogg formalized these concepts in his 2003 book Persuasive Technology: Using Computers to Change What We Think and Do, establishing foundational frameworks such as the functional triad of persuasion and an eight-step design process for creating effective systems.

Key applications span health interventions, where apps prompt habit formation through nudges aligned with users' motivation, ability, and timely triggers as per the Fogg Behavior Model; environmental sustainability efforts, such as gamified interfaces encouraging reduced energy use; and commercial platforms optimizing user engagement via personalized feedback loops. Empirical reviews indicate moderate effectiveness in targeted domains, with studies demonstrating behavior changes such as increased physical activity or adherence to routines when designs incorporate evidence-based prompts and simplify actions, though outcomes vary by context and user receptivity, with success rates often below 50% in uncontrolled settings. Notable achievements include Fogg's evolution of the field into behavior design methodologies, influencing products like fitness trackers and habit-formation apps that empirically boost short-term compliance through iterative testing.

Despite these successes, persuasive technology has sparked controversies over ethical boundaries, particularly the risk of subtle manipulation eroding user autonomy or prioritizing commercial agendas over user interests, as power imbalances favor developers in opaque algorithmic persuasion. Critics highlight the potential for unintended harms, such as addictive engagement patterns in social media, underscoring the need for transparency and accountability to mitigate coercion-like effects, even as proponents advocate self-imposed ethical guidelines such as avoiding deception and respecting user agency. These debates reflect tensions between optimism about behavior shaping and realistic assessments of human volition, informed by peer-reviewed analyses rather than anecdotal advocacy.

Definition and Historical Development

Core Concepts and Terminology

Persuasive technology refers to any system, device, or application intentionally designed to influence a person's attitudes or behaviors through persuasion rather than coercion or deception. This intentionality distinguishes it from incidental effects, such as software that might indirectly motivate users; persuasive technologies explicitly incorporate mechanisms to shape user actions or beliefs. The field is studied under the term captology, an acronym coined by B.J. Fogg in 1997 for "computers as persuasive technologies," encompassing the research, design, and application of such systems. Captology examines how computational elements leverage human psychological susceptibilities, including credibility cues and simulated social presence, to effect change without eliminating user choice—a key demarcation from coercive methods.

A foundational classification within captology is the functional triad, proposed by Fogg, which categorizes persuasive technologies by their primary roles: as tools that enable or constrain behavior (e.g., software that simplifies task completion to encourage habit formation), as media that provide simulated experiences influencing perceptions (e.g., scenarios that let users preview the consequences of their choices), or as social actors that mimic interpersonal influence (e.g., chatbots employing praise or reciprocity to build rapport). These roles often overlap, allowing designers to combine them for amplified effect, as in health applications where a self-monitoring tool incorporates social-actor elements for encouragement. The triad underscores that persuasive power arises from the technology's functional positioning relative to the user, rather than from inherent properties of the device.

Additional terminology includes attitude change (shifting beliefs or evaluations, often via repeated exposure or credibility enhancement) and behavior change (prompting specific actions, such as through triggers or feedback loops), with persuasive technologies targeting one or both depending on design goals. Unlike traditional rhetoric, which relies on human orators, captology emphasizes scalable, automated influence via interfaces that exploit cognitive tendencies like reciprocity or social proof.

Origins with BJ Fogg and Captology

B.J. Fogg, a researcher at Stanford University, coined the term "captology" in 1997 as an acronym for computers as persuasive technologies, establishing the foundational framework for persuasive technology as a distinct field of study focused on designing interactive computing products to change human attitudes or behaviors without coercion or deception. The concept was first presented publicly at the ACM CHI 1997 conference in an extended abstract titled "Captology: The Study of Computers as Persuasive Technologies," where Fogg outlined initial research directions, including definitions of persuasion in computational contexts and examples like software simulating credible sources to promote health behaviors. Fogg expanded on these ideas in a subsequent CHI 1998 paper, "Persuasive Computers: Perspectives and Research Directions," which formalized captology's scope by distinguishing it from traditional human-computer interaction and proposing a research agenda centered on empirical testing of persuasive interfaces, such as virtual agents that leverage social cues to encourage behavior change.

In 1998, he founded the Stanford Persuasive Technology Lab (later renamed the Behavior Design Lab) to conduct systematic experiments on how computers could function as tools, media, or social actors to influence behavior, drawing on psychological principles like reciprocity and credibility while emphasizing ethical boundaries to avoid manipulation. The lab's early projects, starting in the late 1990s, included prototypes such as Tamagotchi-inspired virtual pets to promote habit formation and web-based systems for health promotion, laying the groundwork for captology's functional triad—computers as tools for increasing capability, media for simulated experience, and social actors for relational influence—which Fogg detailed in peer-reviewed publications to guide developers toward verifiable, non-deceptive designs.

Fogg's 2003 book, Persuasive Technology: Using Computers to Change What We Think and Do, synthesized these origins into a comprehensive text, presenting 28 persuasive principles derived from lab experiments and case studies, such as tunneling (step-by-step guidance) to simplify user tasks and reduce friction in behavior adoption. The work positioned captology as an interdisciplinary bridge between psychology, human-computer interaction, and design, prioritizing empirical validation over speculative theorizing, though Fogg himself advocated for self-regulation by designers to mitigate risks like unintended habit loops.

Evolution from 1990s to Present

The field of persuasive technology emerged in the 1990s, when B.J. Fogg, then a graduate student in experimental psychology at Stanford University, began studying computers as persuasive technologies and coined the term "captology" for the discipline. This foundational work focused on how interactive computing systems could influence attitudes and behaviors through mechanisms like simulation and virtual experiences, building on early digital interfaces such as email and basic websites. By the late 1990s, Fogg had established the Stanford Persuasive Technology Lab, where initial experiments explored ethical applications, including behavior change via software prompts and feedback loops. A seminal 1998 paper formalized captology as a discipline, emphasizing computers' roles as tools, media, and social actors in persuasion.

In the 2000s, the field gained theoretical depth with Fogg's 2003 book Persuasive Technology: Using Computers to Change What We Think and Do, which outlined core principles like reduction, tailoring, and tunneling, applied to emerging web technologies. This period saw practical implementations in e-commerce and early social platforms, where features such as personalized recommendations and progress trackers leveraged persuasive strategies to boost user engagement and purchases; for instance, Amazon's "customers who bought this also bought" system exemplified tunneling and reciprocity tactics. Ethical considerations intensified, with Fogg advocating guidelines to prevent manipulation, amid growing concerns over opacity in data-driven persuasion. Concurrently, Web 2.0's rise enabled dynamic feedback, as seen in platforms like MySpace (launched 2003) and Facebook (2004), which used notifications and social validation to encourage habitual use.

The 2010s marked expansion into mobile ecosystems, with Fogg identifying smartphones as a key frontier for scalable persuasion due to their ubiquity and sensors for real-time tailoring. Gamification proliferated in apps like Duolingo (2011), employing streaks and rewards to sustain learning behaviors, while fitness trackers such as Fitbit (founded 2007, mainstreamed in the 2010s) used goal-setting and social sharing for health motivation. Social media algorithms refined persuasive intent, prioritizing content to maximize time-on-platform via variable rewards, contributing to documented increases in user session lengths—Facebook, for example, reported average daily use rising from roughly 30 minutes in 2010 to over 40 minutes by 2016. Research highlighted dual-edged impacts, with studies showing efficacy in health apps but risks of dependency in social feeds.

By the 2020s, persuasive technology has integrated artificial intelligence for hyper-personalized influence, enabling predictive nudges based on user data patterns, as in AI-driven health apps that adapt interventions in real time. Emerging modalities like generative AI and ambient computing—such as voice assistants with embedded persuasive prompts—amplify scale, with applications in digital health yielding meta-analytic evidence of improved adherence rates (e.g., 20-30% boosts in physical activity via PT-enhanced wearables). However, regulatory scrutiny has grown, with reports citing manipulative "dark patterns" in apps leading to addictive loops, prompting calls for transparency in algorithmic design. Advances in machine learning and ubiquitous sensing further evolve the field, shifting from static designs to dynamic, context-aware systems that simulate social actors more convincingly.

Theoretical Frameworks

Fogg Behavior Model

The Fogg Behavior Model, formulated by researcher B.J. Fogg in the early 2000s as part of his work on captology, asserts that any given behavior occurs only when three factors—motivation, ability, and a prompt—converge at the same moment. Expressed mathematically as B = M × A × P, the model treats these elements as multiplicative: the absence or weakness of any one factor results in no behavior, emphasizing that high motivation alone, for instance, fails without sufficient ability or a well-timed prompt. Fogg derived the model from empirical observations in persuasive-technology experiments, where interventions succeeded or failed based on alignments among these components rather than isolated efforts to boost desire or simplify tasks.

Motivation refers to the perceived value or emotional drive for the behavior, scaled from low levels (e.g., avoiding physical discomfort) to high (e.g., seeking hope or social approval), and can be enhanced through techniques like gamification or social comparison in digital interfaces. Ability encompasses six practical barriers—time, money, physical effort, brain cycles (mental effort), social deviance, and non-routine actions—with persuasive technologies aiming to streamline these by reducing friction, such as through one-click actions or pre-filled forms. The prompt, or trigger, serves as the cue signaling the opportunity to act; it functions as a facilitator when motivation is high but ability is low, a spark to ignite low-motivation scenarios, or a mere signal when both are sufficient, with digital examples including notifications timed to exploit momentary readiness.

In persuasive technology applications, the model guides design by prioritizing prompts aligned with peak motivation and ability windows, as seen in habit-forming apps that deploy micro-prompts for "tiny habits" to anchor new behaviors to existing routines, thereby scaling from simple actions to sustained change. Empirical studies applying the model, such as evaluations of mobile health interfaces, have demonstrated improved user adherence when prompts are calibrated to ability thresholds, though outcomes vary by context and require iterative testing. A 2022 scoping review of its use in public health interventions found the framework practitioner-friendly for dissecting behavior gaps in low-resource settings, supporting its role in targeted nudges like vaccination reminders. Critics, including behavioral scientists comparing it to models like COM-B, contend that its focus on discrete, immediate behaviors underemphasizes habitual or systemic factors for long-term shifts, potentially limiting applicability in complex domains without supplementary theories. Despite such limitations, the model's simplicity has facilitated its adoption in over 100 Stanford-led experiments and commercial tools, where causal testing isolates prompt efficacy from motivation fluctuations.
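The model's multiplicative logic can be sketched in code. The following Python is illustrative only: the numeric scales, the 0.5 cutoffs, and the action-line threshold are hypothetical design choices, not part of Fogg's formulation, but the structure mirrors B = M × A × P and the facilitator/spark/signal prompt roles.

```python
from dataclasses import dataclass

@dataclass
class Moment:
    motivation: float  # 0.0 (none) to 1.0 (peak drive)
    ability: float     # 0.0 (impossible) to 1.0 (effortless)
    prompt: bool       # was a cue delivered at this moment?

def behavior_occurs(m: Moment, action_line: float = 0.25) -> bool:
    """B = M x A x P: multiplicative, so any absent factor blocks the behavior."""
    if not m.prompt:
        return False  # no prompt, no behavior, regardless of M and A
    return m.motivation * m.ability >= action_line

def prompt_type(m: Moment) -> str:
    """Fogg's three prompt roles, keyed to which factor is lacking."""
    if m.motivation < 0.5 and m.ability >= 0.5:
        return "spark"        # boost motivation
    if m.ability < 0.5 and m.motivation >= 0.5:
        return "facilitator"  # simplify the action
    return "signal"           # both sufficient: just cue the timing
```

For example, `behavior_occurs(Moment(0.9, 0.0, True))` is `False`: maximal motivation cannot compensate for zero ability, which is the model's central claim.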

Functional Triad and Classification Systems

The Functional Triad, introduced by B.J. Fogg in his foundational work on captology, categorizes the persuasive roles of computing technologies into three primary functions: as tools, as media, and as social actors. The framework, detailed in Fogg's 2003 book Persuasive Technology, posits that technologies influence behavior by enabling direct action (tools), simulating experiences (media), or mimicking interpersonal dynamics (social actors), often in hybrid forms. The triad serves both analytical and generative purposes, allowing designers to identify persuasive opportunities and researchers to dissect how digital systems alter attitudes or actions without coercion.

In the tool role, technologies persuade by simplifying processes, reducing effort, or constraining choices to guide user behavior toward desired outcomes. For instance, software interfaces that automate repetitive tasks or provide step-by-step tunneling—predefined sequences of actions—leverage this function to foster habit formation, as seen in fitness apps that prompt sequential exercises with minimal user discretion. Empirical studies confirm that such tools enhance adherence by aligning with users' existing motivations, such as intrinsic health goals, rather than relying on external rewards alone.

As media, computers deliver virtual simulations or narratives that indirectly shape perceptions through experiential learning, distinguished from traditional media by interactivity. Examples include environmental simulations that visualize climate impacts to promote conservation behaviors or educational games that immerse users in scenarios reinforcing ethical decision-making; these operate via credibility cues like realism and repetition, influencing attitudes over time. Research indicates this role is effective for attitude change because interactive media engages users more deeply than passive content, though outcomes depend on the simulation's fidelity to real-world causal mechanisms.

The social actor function attributes human-like qualities to technology, enabling persuasion through simulated reciprocity, authority, or emotional bonding. Virtual agents that offer praise for task completion or express empathy in responses, such as chatbots in mental health apps, exploit social norms to encourage persistence; for example, a 2003 study by Fogg demonstrated how computer-delivered compliments increased user liking and compliance akin to human interactions. This role draws on psychological principles like consistency and liking from Cialdini's influence research, but its efficacy varies with cultural contexts and users' anthropomorphism tendencies.

Beyond the Functional Triad, other classification systems expand persuasive technology analysis. The Persuasive Systems Design (PSD) model, developed by Oinas-Kukkonen and Harjumaa in 2008, organizes 28 principles into four categories—primary task support (e.g., self-monitoring), dialogue support (e.g., reminders), system credibility (e.g., expertise portrayal), and social support (e.g., social learning)—to guide ethical design and evaluation. Validated in health interventions, PSD emphasizes measurable behavior change via user-system interactions, differing from the triad by focusing on principles rather than roles. More recent taxonomies, such as ComTech, proposed in 2024, unify techniques across domains—such as tunneling and personalization—into hierarchical categories, prioritizing empirical validation for commercial applications. These systems complement the triad by addressing implementation specifics, though critiques note a potential overemphasis on designer intent over user agency in biased institutional contexts.

Alternative Taxonomies and Models

The Persuasive Systems Design (PSD) model, proposed by Harri Oinas-Kukkonen and Marja Harjumaa in 2009, offers an alternative to Fogg's frameworks by emphasizing a structured process for developing and evaluating behavior change support systems through computational means. Unlike Fogg's focus on behavioral triggers via motivation, ability, and prompts, PSD organizes persuasive principles into four primary categories—primary task support, dialogue support, system credibility support, and social support—encompassing 28 specific features derived from empirical reviews of persuasive technology applications up to that period. The model prioritizes system-level design over isolated triggers, aiming to enhance long-term adherence by integrating context analysis, feature selection, and evaluation steps.

Primary task support in PSD includes techniques like reduction (simplifying user actions), tunneling (guiding sequential steps), tailoring (personalizing content), self-monitoring (tracking progress), simulation (modeling outcomes), and rehearsal (practicing behaviors), which directly facilitate the core activity without relying on external motivation spikes. Dialogue support features, such as praise, reminders, suggestion, similarity, liking, and social role adoption, foster ongoing interaction akin to human coaching but automated via interfaces. System credibility support addresses trust-building through expertise (demonstrating authority), trustworthiness (ensuring reliability), and third-party endorsements, countering skepticism in digital environments where users may question algorithmic intent. Social support incorporates cooperation (group facilitation), competition (rivalrous elements), recognition (public acknowledgment), and social comparison, leveraging interpersonal dynamics within technological mediation.

Empirical validations of PSD, such as systematic reviews of web-based interventions, indicate that incorporating multiple categories correlates with higher user engagement and persistence than single-principle applications, though effectiveness varies by domain and user demographics. For instance, a 2012 review found systems combining dialogue and social supports achieved 15-20% greater adherence in preventive health apps than those limited to task simplification alone. Critics note that PSD's static categorization may overlook dynamic user adaptation, prompting extensions like integration with self-determination theory for intrinsic motivation alignment.

Other models include the taxonomy of persuasive strategies by Oinas-Kukkonen et al. (2007), which classifies approaches across sensory channels (e.g., visual cues, auditory prompts) for multimodal interfaces, emphasizing meta-reasoning to adapt strategies in real time based on feedback loops. The Interactive Behavior Change Model (IBCM 8.0), updated in recent neuroscience-informed works, extends beyond persuasion to ontology-driven integrations of psychological triggers with neurochemical factors like dopamine, positioning technology as a modulator of habit-formation circuits rather than a mere prompter. These alternatives highlight a shift toward holistic, context-adaptive designs, contrasting with Fogg's parsimonious triad by incorporating broader ecological and credibility variables empirically linked to sustained outcomes in controlled studies.
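The PSD categories above lend themselves to a simple lookup structure for auditing a proposed design. This Python sketch is hypothetical—the dictionary keys and the `categorize` helper are not part of the PSD papers—but it illustrates how a feature list can be checked against the four categories, in line with the review finding that multi-category designs correlate with engagement.

```python
# Hypothetical encoding of the PSD model's four principle categories.
PSD_CATEGORIES = {
    "primary_task_support": ["reduction", "tunneling", "tailoring",
                             "self-monitoring", "simulation", "rehearsal"],
    "dialogue_support": ["praise", "reminders", "suggestion",
                         "similarity", "liking", "social_role"],
    "system_credibility": ["trustworthiness", "expertise",
                           "authority", "third_party_endorsements"],
    "social_support": ["social_learning", "social_comparison",
                       "cooperation", "competition", "recognition"],
}

def categorize(features):
    """Group a proposed feature list by PSD category, flagging unknowns."""
    grouped, unknown = {}, []
    for f in features:
        for cat, principles in PSD_CATEGORIES.items():
            if f in principles:
                grouped.setdefault(cat, []).append(f)
                break
        else:
            unknown.append(f)
    return grouped, unknown
```

A design listing `["reminders", "self-monitoring", "competition"]`, for instance, spans three of the four categories, which under the cited reviews would predict stronger engagement than a task-support-only design.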

Persuasive Design Strategies

Core Techniques like Tunneling and Tailoring

Tunneling, one of the core persuasion strategies identified by Fogg in his framework for persuasive technology, involves guiding users through a predetermined sequence of steps or screens to facilitate behavior adoption by minimizing distractions and decision points. This technique leverages computing systems to lead individuals sequentially, often starting with simple actions and progressing to more complex ones, thereby exploiting the human tendency toward consistency in commitment. For instance, in health applications, tunneling might direct users through an initial setup of daily reminders followed by habit-tracking prompts, reducing drop-off and increasing completion rates. Empirical reviews indicate tunneling's use in 14% of persuasive interventions for sustainable behaviors, where it supports short-term adherence by structuring experiences akin to scripted tutorials.

Tailoring complements tunneling by customizing persuasive content to individual characteristics, such as preferences, past behaviors, or contextual data, thereby enhancing perceived relevance and motivational impact. Fogg's framework posits that tailored information—delivered via algorithms analyzing inputs like age, location, or prior interactions—yields higher persuasive effect than generic messaging, as it aligns with personal needs and reduces resistance. In practice, e-commerce platforms employ tailoring by recommending products based on browsing history, while adaptive learning software adjusts lesson difficulty to learner performance metrics. Studies across domains, including a systematic review of persuasive systems, report tailoring in 57% of behavior change interventions, correlating with improved engagement through personalized feedback loops that foster sustained use.

These techniques often integrate with Fogg's broader set of seven persuasion tools, including suggestion and self-monitoring, to form layered strategies that address barriers to voluntary change. While effective for initiating behaviors—evidenced by meta-analyses showing tailored tunneling sequences boosting compliance in mobile health apps by up to 20-30% in controlled trials—long-term efficacy depends on user autonomy, with over-reliance risking reactance if designs are perceived as manipulative. Fogg emphasizes ethical implementation, prioritizing transparency in data use for tailoring to mitigate privacy concerns inherent in personalization algorithms.

Reinforcement and Feedback Mechanisms

Reinforcement mechanisms in persuasive technology draw on operant conditioning principles, employing rewards to increase the likelihood of target behaviors and punishments to decrease undesired ones. Fogg outlines conditioning as one of seven tool categories, distinguishing primary conditioning—direct operant rewards like verbal praise or sanctions delivered by the system—from secondary conditioning, which uses proxies such as points, badges, or levels to mimic tangible incentives without real-world costs. These approaches apply intermittent reinforcement schedules, akin to variable-ratio setups in behavioral psychology, to sustain engagement by creating anticipation of unpredictable rewards, as seen in apps where notifications deliver sporadic affirmations. Empirical evaluations indicate that secondary reinforcements effectively boost short-term engagement; for example, gamified systems awarding points for task completion have been shown to elevate user persistence in educational platforms by 20-30% in controlled trials, though effects diminish without alignment to intrinsic goals. Positive reinforcement predominates over negative due to user retention concerns, with sanctions like temporary feature locks applied sparingly to avoid disengagement, as overuse risks reactance or abandonment.

Feedback mechanisms provide users with data on their performance, enabling self-monitoring and iterative behavior refinement, often integrated with reinforcement to amplify persuasive impact. These include visual cues like progress bars or summaries that quantify achievements, fostering awareness and adjustment through causal loops where observed outcomes influence future actions. In health applications, such as activity trackers, immediate auditory or visual feedback on steps taken correlates with heightened motivation, with studies reporting 15-25% greater adherence when feedback emphasizes gains over deficits.
A 2024 systematic review of 28 self-monitoring interventions confirmed that feedback-enhanced designs yield moderate effect sizes (Cohen's d ≈ 0.4) on physical activity outcomes, outperforming unassisted tracking by reinforcing habit formation via repeated exposure to progress data. However, feedback efficacy varies by delivery: personalized, timely responses outperform generic ones, but overload from excessive notifications can induce fatigue, underscoring the need for calibrated intensity to maintain causal influence without saturation. Combined reinforcement-feedback systems, as in habit apps using streak counters alongside daily recaps, demonstrate synergistic effects in sustaining behaviors over weeks, provided they avoid manipulative over-reliance on extrinsic motivators.
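The intermittent schedules described above can be illustrated with a short simulation contrasting fixed-ratio and variable-ratio reward delivery. The 1-in-5 mean ratio and the function names are arbitrary choices for the sketch, not parameters from any cited system.

```python
import random

def fixed_ratio(action_count: int, every: int = 5) -> bool:
    """Fixed-ratio schedule: reward after every `every`-th action."""
    return action_count % every == 0

def variable_ratio(rng: random.Random, mean_ratio: int = 5) -> bool:
    """Variable-ratio schedule: reward unpredictably, about 1 in
    mean_ratio actions on average -- the pattern linked above to
    anticipation of unpredictable rewards."""
    return rng.random() < 1.0 / mean_ratio

rng = random.Random(42)  # seeded for a reproducible simulation
rewards = sum(variable_ratio(rng) for _ in range(10_000))
# The long-run reward rate approaches 1/5 even though no single
# reward is predictable, which is what sustains checking behavior.
```

Under a fixed schedule, users can learn exactly when the next reward arrives and pause until then; the variable schedule removes that certainty, which is why it is the pattern most associated with habitual checking.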

Social Influence and Reciprocity Tactics

Social influence tactics in persuasive technology harness psychological principles to motivate users through perceived social dynamics, often via the social support category in the Persuasive Systems Design (PSD) model developed by Oinas-Kukkonen and Harjumaa in 2009. These tactics include social learning, where systems model desired behaviors for users to emulate, such as virtual coaches demonstrating exercise routines in fitness apps; social comparison, employing leaderboards or progress sharing to foster competition or conformity among peers; and normative influence, which prompts alignment with group standards by displaying aggregate user behaviors, as in "90% of users complete daily goals." Recognition tactics award badges or public acknowledgments for achievements, reinforcing participation through social validation, while social facilitation simulates peer presence to boost task performance, as seen in collaborative online platforms.

Empirical studies validate these approaches' efficacy in digital contexts; for instance, a 2021 analysis of commercial mobile fitness applications found that implementing social influence constructs, such as normative cues and comparison features, correlated with higher user susceptibility to sustained engagement and behavior adherence. Similarly, design patterns for social influence, including peer endorsements and shared goal-setting, have been shown to enhance compliance in health and productivity tools by instantiating features like virtual communities. These tactics derive from established principles like Robert Cialdini's social proof, where users infer correctness from others' actions, adapted to digital environments through interfaces that amplify perceived consensus or authority.

Reciprocity tactics, rooted in Cialdini's reciprocity norm—where individuals feel compelled to repay received favors—manifest in persuasive technology by offering unprompted value to elicit user concessions, such as data provision or continued use. In e-commerce and app design, this includes freemium models providing initial free access or content, like trial subscriptions or complimentary resources, which increase user investment and conversion rates by triggering obligation; a 2014 study reported that such concessions boosted trust and willingness to share information compared to direct requests. Digital implementations often combine reciprocity with social elements, such as personalized gift recommendations framed as "exclusive offers from peers," enhancing persuasion in platforms where reciprocity norms yield 20-30% higher response rates in controlled experiments. However, overreliance on these tactics risks user fatigue or backlash if perceived as manipulative, as evidenced by reduced long-term retention in apps overusing unearned favors without genuine value exchange.

Applications Across Domains

Health Behavior Modification

Persuasive technologies apply design principles such as self-monitoring, tailored reminders, and gamification to digital tools like mobile applications and wearable devices, aiming to alter health behaviors including physical activity, dietary intake, and substance cessation. These systems leverage feedback and motivation to overcome barriers like inertia and forgetfulness, with platforms such as fitness trackers prompting users toward sustained engagement. Empirical reviews highlight their potential, though outcomes vary by intervention fidelity and user adherence.

In physical activity promotion, persuasive systems frequently incorporate tracking (used in 90% of interventions) and reminders (42%), yielding success in 80% of 170 reviewed studies from 2010 to 2019. Fitness apps and wearables have demonstrated increases in moderate activity, such as an additional hour of weekly exercise, and reductions in sedentary time by 30-40% in controlled settings. For instance, gamified exergames like "Fish'n'Steps" combine virtual rewards with step counting to encourage walking, while just-in-time prompts in apps like "Project Energise" interrupt prolonged sitting.

Dietary behavior modification benefits from similar techniques, with apps using tunneling—guiding users through simplified meal logging—and social comparison to foster healthier choices. A 2016 systematic review of app-based interventions found modest efficacy in improving diet quality and reducing calorie intake, particularly when integrated with behavior change theories. Nutrition-focused apps have shown significant health gains in chronic disease populations, such as better glycemic control via personalized tracking and feedback. However, evidence remains limited by short study durations and inconsistent theoretical grounding.

For smoking cessation, apps employ gamification through progress badges and social features, with a 2023 meta-analysis of nine randomized trials (n=12,967) reporting an overall odds ratio of 1.56 for abstinence at six months versus controls. Efficacy rises to OR 1.79 when apps are paired with pharmacotherapy like nicotine replacement, emphasizing adherence-enhancing elements such as daily check-ins. Standalone apps show weaker effects (OR 1.03), underscoring the need for multimodal approaches. Across domains, a 2016 review of 85 studies (2000-2015) confirmed 75% positive results, with eating behaviors responding most robustly (91% success) via feedback and social mechanisms, though only 4% of studies targeted smoking directly. Long-term retention challenges persist, as initial gains often wane without ongoing reinforcement.
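Odds ratios like the 1.56 figure above are often misread as relative risks. Converting an OR into an implied intervention-group rate is straightforward arithmetic; the 10% control abstinence rate below is a hypothetical assumption chosen purely for illustration.

```python
def odds(p: float) -> float:
    """Convert a probability into odds, p / (1 - p)."""
    return p / (1 - p)

def apply_odds_ratio(control_rate: float, or_value: float) -> float:
    """Given a control-group event rate and an odds ratio, return the
    implied intervention-group event rate (odds back to probability)."""
    o = odds(control_rate) * or_value
    return o / (1 + o)

# Assuming (hypothetically) 10% abstinence in controls, OR = 1.56 implies
# roughly a 14.8% abstinence rate in the app group -- a meaningful but
# smaller lift than the "56% more likely" reading the OR might suggest.
rate = apply_odds_ratio(0.10, 1.56)
```

The gap between odds ratios and risk ratios shrinks when base rates are low, which is why the two are often conflated in reporting on cessation trials.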

Sustainability and Lifestyle Promotion

Persuasive technologies have been deployed to encourage sustainable behaviors such as energy conservation, waste reduction, and eco-friendly transportation by leveraging feedback mechanisms, gamification, and normative comparisons to influence user habits. Real-time eco-feedback systems, including smart meters and home energy management systems (HEMS), provide users with immediate data on consumption patterns, often comparing individual usage to peers or averages, which has led to measurable reductions in resource use. For instance, comparative feedback via smart meters achieved up to 67% savings in controlled trials, while HEMS interventions sustained 7.8% reductions over 18 months by integrating persistent nudges and monitoring. Similarly, smart shower meters delivering auditory feedback reduced water consumption by 22% through heightened awareness of usage duration. These applications draw on principles like tunneling—guiding users through simplified steps—and conditioning, rewarding lower consumption with virtual badges or cost-savings visualizations.

In waste management, mobile apps employ persuasive strategies such as self-monitoring, reminders, and social sharing to promote recycling and minimize food waste. BinCam, for example, uses camera integration and social network connectivity to track bin contents and foster accountability through shared progress, correlating with higher user engagement in sorting behaviors. Gamified waste-sorting apps incorporate points redeemable for incentives, with systematic evaluations showing a positive relationship between the number of persuasive features (individual features such as reduction prompts appear in up to 97% of reviewed apps) and user ratings, though direct causation for long-term diversion remains understudied. Empirical reviews indicate that 61% of persuasive technology studies target energy use but extend to other domains via similar tactics, yielding short-term compliance increases without consistent evidence of enduring habit formation absent ongoing reinforcement.

For sustainable mobility and broader lifestyle promotion, apps like Bellidea use point-based gamification with extrinsic rewards—such as tangible prizes for logging non-car trips—to shift preferences toward walking, cycling, or public transit, demonstrating raised interest among car-dependent users in pilot settings. Other eco-apps track carbon footprints or promote sustainable consumption through tailored challenges and reciprocity tactics, like community leaderboards encouraging collective reductions. Apps such as JouleBug have boosted pro-environmental knowledge and actions in educational contexts via habit-building quests, while serious games like Energy Cat delivered 3.46% electricity and 7.48% gas savings in social housing by simulating resource impacts. Effectiveness varies, with initial gains often relapsing after intervention cessation, as observed in one-year follow-ups where behaviors reverted without continued prompts; sustained impacts hinge on integrating intrinsic motivators, like reduced psychological distance to environmental consequences, over purely extrinsic ones.

Despite these applications, research highlights limitations in generalizability and durability, with many studies relying on self-reports prone to social desirability bias and focusing on short-term metrics rather than causal chains linking tech interventions to verified environmental outcomes. Reviews emphasize that while feedback drives immediate savings—averaging 5-10% reductions—broader lifestyle shifts, such as adopting plant-based diets or zero-waste routines via app nudges, require addressing the attitude-behavior gap through multi-faceted designs combining extrinsic incentives with intrinsic motivation. Peer-reviewed mappings underscore the potential of digital tools to maintain behaviors over time via adaptive algorithms, but caution against overreliance on external prompts, which may foster dependency rather than internalized norms.

Commercial Marketing and Social Media

Persuasive technologies in commercial marketing leverage computational interfaces to influence consumer decisions, often through techniques such as personalization, scarcity cues, and social proof integrated into websites and apps. E-commerce platforms, for example, use recommendation engines that tailor product displays to individual browsing histories and preferences, applying Fogg's principles of reduction and tunneling to simplify purchase paths and minimize cognitive load. A 2023 study on developing persuasive systems for marketing identified a taxonomy of 28 techniques, including appeals and prompts, with empirical testing showing their role in boosting message compliance and sales intent in digital campaigns. Meta-analytic evidence underscores the efficacy of these approaches: a review of 48 experiments found personalized advertising yields stronger effects on overall persuasion (Hedges' g = 0.35), attitudes toward the ad (g = 0.28), and purchase intentions (g = 0.22) compared to generic ads, attributing gains to perceived relevance and reduced reactance. In practice, retail platforms deploy dynamic pricing and urgency timers—"only 3 left in stock"—which exploit reciprocity and scarcity to accelerate conversions, with A/B testing revealing uplifts in cart completion rates of 10-20% in controlled trials. However, the long-term impact of such tactics varies, as habituation can diminish returns without ongoing data refinement. Social media platforms amplify commercial persuasion by embedding design elements that prolong user sessions, thereby heightening ad exposure and revenue. Features like algorithmic feeds prioritize content evoking emotional responses or social validation—likes, shares, and notifications acting as variable rewards akin to slot machines—driving average daily usage to 2.5 hours per U.S. adult as of 2023.
Instagram, for instance, integrates persuasive strategies such as primary task support (seamless scrolling) and credibility cues (influencer endorsements), with framework analyses confirming these elevate engagement metrics by fostering habitual checking and content sharing that indirectly promotes sponsored products. Empirical data links these mechanisms to behavioral outcomes: a 2023 study across platforms found persuasive designs, including streaks and reciprocity prompts, correlate with increased session lengths (β = 0.42) and problematic-use indicators like FOMO-driven logins, facilitating targeted ads that convert at rates 15-30% higher than non-social channels. Platforms monetize this via micro-targeted campaigns in which user data fuels A/B-optimized content; Facebook's 2012 emotional contagion experiment, for example, manipulated the news feeds of about 689,000 users to amplify positive posts, boosting engagement by 0.1% while demonstrating mood contagion's role in sustained interaction. Yet, while short-term sales lift from such virality is evident, causal attribution remains challenged by confounding factors like network effects.
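Conversion-uplift claims like the 10-20% cart-completion figures above are typically established by comparing a control and a variant in an A/B experiment with a two-proportion z-test. A minimal, self-contained sketch of that calculation—illustrative of the standard method, not any platform's actual tooling:

```python
from math import sqrt, erf

def ab_uplift(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Relative uplift of variant B over control A, with a two-sided
    two-proportion z-test p-value (normal approximation)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    uplift = (p_b - p_a) / p_a                     # relative lift of variant B
    p_pool = (conv_a + conv_b) / (n_a + n_b)       # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided tail
    return uplift, z, p_value
```

For example, `ab_uplift(500, 5000, 575, 5000)` reports a 15% relative lift from a 10% to an 11.5% conversion rate; whether that lift is trusted then depends on the p-value and on guarding against the habituation effects noted above, which erode lifts measured in short trials.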

Education, Productivity, and Workplace Uses

In education, persuasive technology applies techniques such as tunneling through guided sequences, tailoring via personalized content, and conditioning via rewards to promote sustained learning behaviors and reduce dropout rates. A 2019 empirical study with 30 first-year students tested a persuasive e-learning platform featuring instructional self-monitoring, motivational commitments, and social rewards; post-intervention analysis showed a statistically significant rise in understanding levels from a mean of 2.3 to 3.13 on a 5-point scale (Z = 3.405, p = 0.001), with 17 participants exhibiting improved behaviors. Language-learning apps like Duolingo incorporate daily streaks and badges to exploit commitment and consistency, prompting users to maintain practice habits and achieving over 500 million users by leveraging these mechanisms for retention. For individual productivity, tools draw on BJ Fogg's behavior model—requiring motivation, simplified ability, and timely prompts—to foster habit formation amid distractions. Habitica gamifies task completion, granting experience points and leveling up avatars for productive actions while imposing penalties for neglect, which empirical user data correlates with higher adherence rates compared to non-gamified trackers. The Forest app employs virtual tree-planting tied to uninterrupted focus periods, invoking loss aversion as trees "die" from app interruptions, thereby reducing smartphone usage by an average of 20-30% in self-reported trials among productivity seekers. These designs prioritize small, anchorable behaviors that scale into routines, as validated in Fogg's framework, where prompts trigger actions once ability barriers are minimized. Workplace applications of persuasive technology emphasize feedback, positive incentives, and gamification to align employee actions with efficiency goals, often through digital platforms that provide real-time performance data.
A 2022 systematic review of experimental studies found that rapid, meaningful prompts and personalized rewards robustly enhance engagement and self-directed behavior change, with such systems successfully shifting behaviors toward employer priorities like compliance and task completion. Gamification elements, such as leaderboards and narrative-driven challenges, have been shown in workplace contexts to promote autonomous behaviors under "Goldilocks" conditions of optimal difficulty, increasing output without burnout. These interventions yield short-term gains in metrics like task velocity, though long-term efficacy depends on avoiding over-reliance on extrinsic motivators.
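The streak mechanic described above—commitment and consistency enforced by an unbroken daily chain—can be sketched in a few lines. A hypothetical tracker (illustrative of the general pattern, not Duolingo's or any app's actual implementation):

```python
from datetime import date, timedelta

class StreakTracker:
    """Minimal daily-streak mechanic of the kind language-learning
    and habit apps use to exploit commitment and consistency."""

    def __init__(self):
        self.streak = 0
        self.last_day = None

    def log_activity(self, day: date) -> int:
        if self.last_day is None or day - self.last_day > timedelta(days=1):
            self.streak = 1          # first use, or a missed day: streak resets
        elif day - self.last_day == timedelta(days=1):
            self.streak += 1         # consecutive day: streak grows
        # same-day repeats fall through and leave the streak unchanged
        self.last_day = day
        return self.streak
```

The persuasive force comes from loss aversion: once the counter is large, a single missed day wipes it out, which is exactly the asymmetry the reset branch encodes.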

Empirical Evidence of Effectiveness

Key Studies and Meta-Analyses

A 2016 meta-analysis by Miralles et al. examined the relationship between adherence to persuasive technology principles—drawn from Oinas-Kukkonen's Persuasive Systems Design (PSD) model—and the effectiveness of 48 web-based health interventions, involving 40 within-group comparisons (Hedges' g = 0.94, indicating large effects) and 19 between-group comparisons (g = 0.78, moderate to large). The analysis found a significant positive association between the total number of principles and effect sizes in within-group designs, particularly for primary task support (e.g., self-monitoring, tailoring) and dialogue support (e.g., reminders, praise); however, excessive principles in categories like tunneling and tailoring could diminish outcomes, suggesting that optimal combinations—such as pairing reminders with similarity cues—enhance persuasion without overload. Adherence levels showed no correlation with effectiveness, highlighting that design quality may drive results more than user compliance. In contrast, a larger 2025 meta-analysis by Valentine et al. aggregated 92 randomized controlled trials (RCTs) of smartphone-based mental health apps, encompassing 16,728 participants and outcomes like depression, anxiety, and stress reduction (overall g = 0.43, moderate effect). Apps incorporated 1–12 persuasive principles per design (mode = 5), yet no significant associations emerged between principle count or type and either efficacy or user engagement metrics (reported in 76% of studies across 25 varied indicators). Efficacy was robust for common conditions but limited for less common ones, with conclusions emphasizing the need for standardized reporting amid inconsistent persuasive design impacts. These meta-analyses underscore persuasive technology's domain-specific promise in behaviorally oriented applications, where effect sizes range from moderate to large, though evidence on design optimization remains equivocal—earlier work links principle dosage to gains, while more recent, larger-scale work finds at most weak ties to proximal metrics like engagement.
Broader empirical surveys, such as those reviewing persuasive interventions for healthy lifestyles, report behavioral changes in approximately 48% of studies, often in short-term or controlled contexts, but highlight methodological heterogeneity limiting generalizability. Such findings align with Fogg's captology but caution against assuming uniform effectiveness without rigorous, context-tailored testing.
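The Hedges' g values cited throughout these meta-analyses are bias-corrected standardized mean differences. A minimal sketch of the computation—the standard textbook formula, not either study's actual analysis code:

```python
from math import sqrt

def hedges_g(mean1: float, sd1: float, n1: int,
             mean2: float, sd2: float, n2: int) -> float:
    """Bias-corrected standardized mean difference between two groups."""
    # pooled standard deviation across both groups
    sp = sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / sp                     # Cohen's d
    correction = 1 - 3 / (4 * (n1 + n2) - 9)     # small-sample bias correction
    return d * correction
```

With two groups of 50 whose means differ by half a pooled standard deviation, the function returns roughly 0.50 shrunk slightly toward zero by the correction factor—the same moderate-effect scale on which the g = 0.43 and g = 0.78 figures above are read.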

Long-Term vs. Short-Term Outcomes

Short-term outcomes of persuasive technologies are typically robust, with randomized controlled trials and meta-analyses demonstrating immediate behavioral shifts, such as increased physical activity or improved medication adherence, often within the first 1-3 months of use. For example, a 2025 meta-analysis of persuasive design in digital mental health apps found significant efficacy in reducing symptoms and boosting engagement through elements like tunneling and reminders, with pooled effect sizes indicating moderate short-term improvements. Similarly, web-based interventions incorporating persuasive principles yield higher initial adherence rates compared to non-persuasive controls, as evidenced by a systematic review showing enhanced user retention and behavioral compliance in health contexts during active intervention phases. Long-term outcomes, however, reveal substantial attenuation, with effects frequently waning after 6-12 months due to user habituation, behavioral reversion, and reliance on continuous prompts rather than internalized change. In sustainability domains, a 2019 review of behavior change interventions highlighted that strategies reliant on nudges or gamification produce transient pro-environmental actions but fail to achieve enduring reductions in resource use, as users disengage post-intervention without sustained intrinsic drivers. Analogous patterns appear in mental health applications, where meta-analyses of standalone digital tools report initial gains (e.g., Hedges' g ≈ 0.3-0.5) that decline over time, underscoring the challenge of transitioning from externally cued behaviors to autonomous self-regulation. This disparity arises because persuasive technologies excel at leveraging motivation and ability in Fogg's behavior model for acute prompts but struggle with long-term scalability absent personalization or real-world integration, as critiqued in analyses of captology's limitations where non-self-sustaining changes predominate.
Interventions combining persuasive elements with evidence-based habit-building techniques, such as social accountability, show promise for bridging this gap, though comprehensive follow-ups remain scarce and effects vary by domain.

Factors Influencing Success Rates

The success of persuasive technologies hinges on the simultaneous alignment of user motivation, perceived ability to perform the target behavior, and timely triggers, as outlined in B.J. Fogg's Behavior Model (FBM). Motivation encompasses core drivers such as sensation (pleasure or pain), anticipation (hope or fear), and social acceptance or rejection, with empirical examples like social media platforms leveraging peer validation to encourage content uploads. Ability requires simplifying the behavior across dimensions including time, cognitive effort, physical effort, financial cost, social deviance, and routine disruption; technologies that reduce these barriers, such as one-click purchasing interfaces, demonstrate higher adoption rates. Triggers—signals, sparks to boost motivation, or facilitators to ease action—must coincide with peak motivation and ability for behavior to occur, as misalignment leads to failure despite strong design. Design adherence to the Persuasive Systems Design (PSD) framework further modulates outcomes, with primary task support (e.g., tailoring and reduction of complexity) and dialogue support (e.g., reminders and praise) most frequently implemented and linked to improved user adherence in web-based interventions. A systematic review of 83 health interventions found dialogue elements, such as tailored suggestions, explained 55% of variance in adherence, outperforming social or credibility supports alone. However, meta-analyses of mobile health apps reveal inconsistent associations between PSD principle count (typically 1–12 per app) and outcomes, attributing variability to measurement inconsistencies rather than design flaws, though tunneling (guided sequences) and rehearsal features appear in 84–88% of effective cases. Individual user traits significantly moderate effectiveness, with higher health consciousness, intrinsic motivation, and extraversion correlating to greater perceived persuasiveness in experimental evaluations of persuasive screens (explaining 40% of variance).
Demographics also play a role: women often report stronger responses to persuasive cues, while age shifts persuasiveness thresholds, with older users requiring greater simplicity in tailored designs. Personalization that amplifies these traits—via adaptive feedback or tailored content—yields superior results over generic approaches, as evidenced by moderated adherence effects in behavior change studies. Contextual factors, including perceived ease of use and usefulness from technology acceptance models, interact with these moderators, but empirical gaps persist in establishing long-term causal links beyond short-term engagement metrics.
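The FBM alignment described above can be caricatured as a threshold rule: a prompted behavior fires only when motivation and ability jointly clear an action line, so gains in ability can compensate for deficits in motivation and vice versa. A toy sketch—the numeric scales, product form, and threshold are illustrative assumptions, not part of Fogg's model:

```python
def behavior_occurs(motivation: float, ability: float,
                    prompt: bool, threshold: float = 1.0) -> bool:
    """Toy FBM rule: a behavior happens only when a prompt arrives
    while motivation x ability clears the activation threshold."""
    return prompt and (motivation * ability) >= threshold
```

For example, a low-motivation user (0.5) still acts when the behavior is made very easy (ability 3.0), mirroring how one-click interfaces succeed; the same user with ordinary ability (1.0) ignores the identical prompt, and without any prompt no level of motivation or ability produces the behavior.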

Ethical Concerns and Controversies

Distinction Between Persuasion and Manipulation

Persuasive technology, as defined by B.J. Fogg in his 2003 book, encompasses interactive computing systems designed to influence users' attitudes or behaviors without employing coercion or deception, thereby preserving voluntary engagement. This framework positions persuasion as a process that supplies information, cues, or simplified choices to facilitate informed decision-making, allowing users to deliberate and potentially resist the influence. In contrast, manipulation within technological contexts involves tactics that covertly exploit cognitive vulnerabilities, such as automatic biases or limited attention, to bypass deliberate reasoning and compel outcomes against the user's reflective preferences. A core ethical demarcation lies in the impact on user autonomy, defined as the capacity for self-governance—the ability to form and act on reasons aligned with one's values. Persuasion supports autonomy by enhancing access to relevant data or reducing decision friction transparently, as in visible reminders for habit formation that users can dismiss or customize. Manipulation erodes it by obscuring intent or creating false necessities, such as interface designs that steer users into unwanted subscriptions without clear reversal options, thereby disrupting rational evaluation. Ethicists like Blumenthal-Barby argue that while both may leverage heuristics, influence qualifies as manipulation when it systematically impairs the target's volitional control, distinguishing it from benign nudges that leave deliberation intact. Transparency and intent further delineate the boundary: persuasive systems disclose their persuasive elements, enabling users to assess and consent, whereas manipulative ones rely on opacity to evade scrutiny. For instance, ethical persuasive technologies in health apps might gamify adherence with explicit progress tracking, fostering self-motivation without guile. In manipulative applications, akin to "dark patterns" in user interfaces, elements like disguised confirmations or urgency prompts deceive users into unintended actions, prioritizing designer goals over user welfare.
Empirical analyses of such patterns reveal they succeed by subverting autonomy rather than enlightening choice, prompting calls for design standards that mandate opt-out mechanisms and intent revelation to avert ethical overreach.

Privacy, Data Exploitation, and Autonomy Erosion

Persuasive technologies rely on granular user data—such as browsing history, biometric inputs, and interaction patterns—to tailor interventions, often necessitating pervasive data collection that compromises privacy. Empirical analyses of behavior change applications, including those for health and fitness, demonstrate that many collect extensive personal information beyond what users anticipate or explicitly consent to, with data frequently shared with third-party advertisers or analytics firms. For example, a review of health apps found that over 60% transmitted sensitive user data to external servers without robust encryption or user notification, heightening risks of unauthorized profiling and data breaches. Similarly, studies on fitness apps employing persuasive designs reveal routine tracking of geolocation and motion sensors to infer habits, which can expose users to stalking or discriminatory targeting if breached. These mechanisms prioritize efficacy over privacy safeguards, as developers exploit data asymmetries to optimize metrics like retention rates, which reached 40-50% higher levels under personalized nudges per industry benchmarks from 2020-2023. Data exploitation in persuasive systems extends to commodifying behavioral surpluses for profit, akin to models described in surveillance capitalism, where firms extract and predict user actions to sell influence as a product. Shoshana Zuboff contends that platforms like Facebook and app ecosystems convert free behavioral data into proprietary "behavioral futures markets," enabling advertisers to manipulate choices through micro-targeted prompts, as evidenced by the 2018 Cambridge Analytica scandal involving 87 million users' data for electoral persuasion. This practice, documented in peer-reviewed examinations of digital economies, allows entities to bypass rational deliberation by leveraging inferred psychological states—such as fatigue or impulsivity—from aggregated datasets exceeding petabytes in scale for major tech firms by 2022.
Critics note that while proponents claim anonymization mitigates harms, re-identification attacks succeed in 99.98% of cases for datasets over 1,000 individuals, per 2019 computational research, underscoring the illusory nature of such protections. The cumulative effect erodes autonomy by supplanting informed consent with opaque algorithmic steering, where users perceive agency in decisions actually preconditioned by data-derived nudges. Ethical inquiries into persuasive technologies argue that this covert influence undermines volitional control, as users' choices become extensions of designers' objectives rather than endogenous preferences, supported by experimental evidence showing reduced self-reported autonomy in groups exposed to adaptive persuasion loops versus static interfaces. For instance, in health behavior apps, continuous data-driven feedback loops have been linked to diminished intrinsic motivation, with longitudinal studies from 2021 indicating a 25-30% drop in voluntary adherence post-initial gains when users recognize manipulative patterns. Zuboff further posits that such erosion scales societally, fostering "instrumentarian power" that prioritizes prediction over democratic self-governance, as seen in regulatory fines totaling €2.7 billion against tech giants for GDPR violations related to behavioral data misuse between 2018 and 2024. While some defend these tools as benign extensions of marketing, first-principles assessments reveal causal pathways from unchecked data leverage to habitual deference, paralleling addiction models where dopamine-driven loops override prefrontal deliberation.

Unintended Societal Consequences

Persuasive technologies, particularly in social media platforms, have been linked to heightened rates of compulsive use, driven by design elements such as variable reward schedules, infinite scrolling, and personalized notifications intended to maximize engagement time. A 2023 study found that these persuasive designs significantly contribute to problematic use, with users reporting increased dependency and difficulty disengaging, exacerbating broader societal patterns of digital overuse that displace sleep, exercise, and face-to-face interactions. Empirical data from global surveys indicate that heavy social media use correlates with a 20-30% higher likelihood of meeting clinical criteria for behavioral addiction, unintended by initial platform goals of connectivity but resulting from profit-driven retention metrics. Among adolescents, these mechanisms have coincided with sharp declines in mental health metrics, including a 145% increase in major depressive episodes among U.S. teen girls from 2010 to 2019, temporally aligned with the widespread adoption of smartphone-based social media following platform shifts like Instagram's visual feed in 2012. Longitudinal analyses of CDC data reveal parallel rises in self-harm hospitalizations (up 48% for girls) and anxiety disorders, attributed in part to persuasive features amplifying social comparison, cyberbullying, and sleep disruption via algorithmic prioritization of emotionally charged content. While correlational, experimental manipulations reducing exposure—such as school-wide smartphone bans—have demonstrated causal reductions in depressive symptoms of 25-30% over weeks, underscoring how engagement-optimizing algorithms inadvertently foster vulnerability in developing brains. Critics note confounding factors like economic pressures, yet the post-2010 inflection point in youth distress rates, absent in prior generations, supports a substantive role for these technologies beyond mere correlation. Algorithmic curation in persuasive systems has also fueled societal polarization by preferentially surfacing divisive content, with a randomized experiment on U.S.
social media users showing that engagement-maximizing feeds increased exposure to attitudinally extreme material by 15-20%, heightening animosity without boosting overall consumption. Meta-analyses of platform data indicate that while algorithms do not create filter bubbles in isolation, they amplify perceived polarization through recommendation of like-minded but increasingly radical sources, contributing to a 10-15% widening of affective divides between political groups from 2016 to 2020. This dynamic, unintended in designs focused on user retention, has manifested in real-world outcomes like eroded trust in institutions, as evidenced by surveys linking heavy algorithmic exposure to doubled rates of belief in election fraud among polarized cohorts. Broader societal ripple effects include diminished collective efficacy and interpersonal trust, as persuasive tech's emphasis on individualized nudges erodes shared civic norms; for instance, a 2021 analysis tied platform-induced outrage cycles to a 25% drop in willingness to engage across partisan lines since 2015. These consequences, emerging from the scalability of micro-persuasions at population levels, highlight causal pathways where profit-oriented engagement loops inadvertently undermine social cohesion, with longitudinal studies showing intergenerational declines in community participation mirroring rises in screen-mediated interaction.

Criticisms and Limitations

Empirical Shortcomings and Overhype

Despite promotional claims by pioneers like B.J. Fogg, who coined "captology" to describe computers as tools for influencing attitudes and behaviors, empirical evaluations of persuasive technology often reveal modest and inconsistent outcomes. Meta-analyses of nudge interventions, a core component of persuasive tech in the form of digital nudges, indicate small to medium effect sizes, with Cohen's d averaging 0.45 across studies promoting behavior change, though real-world applications by scaled nudge units yield effects as low as 1-2 percentage points in take-up rates, far below laboratory findings of 8.7 percentage points. This discrepancy suggests publication bias in academic research, where positive results are overstated, contributing to overhype relative to practical efficacy. Gamification elements, frequently integrated into persuasive apps for habit formation, similarly underperform expectations in sustaining long-term behavioral shifts. A meta-analysis of gamified interventions for physical activity found only trivial increases in daily steps (fewer than 500) and minor reductions in sedentary time, with effects diminishing over time due to user fatigue and waning motivation. In educational contexts, while gamification boosts short-term engagement, meta-analytic reviews highlight limitations in explaining deeper behavioral change, often relying on self-reported data prone to bias rather than objective measures. Fitness and health-tracking applications exemplify these shortcomings: initial hype around persuasive features like reminders and progress badges fails to translate into enduring adherence. Longitudinal studies show high dropout rates, with users experiencing demotivation and futility from unmet targets, leading to disengagement and counterproductive psychological effects such as increased frustration rather than sustained activity.
These findings underscore methodological flaws in early persuasive tech research, including small sample sizes and statistical power failures that inflate perceived effects, while overlooking contextual factors like the erosion of user interest that undermine causal impacts.

Risks of Abuse in Authoritarian Contexts

In authoritarian regimes, persuasive technologies can be weaponized by state actors to enforce ideological and behavioral conformity on a mass scale, leveraging data-driven nudges, surveillance, and algorithmic feedback loops to minimize overt repression while maximizing control. China's Social Credit System (SCS), piloted since 2014 and expanded through local implementations, integrates data from financial records, online activity, and public behavior to assign trustworthiness ratings, rewarding alignment with state priorities—such as timely bill payments or patriotic expressions—with benefits like expedited public services, while penalizing deviations, including criticism of the government, with restrictions on travel, employment, or education access. This mechanism operates as a form of engineered compliance, where visible scores and consequences subtly alter behavior without prohibiting choices outright, akin to behavioral nudges scaled to societal levels. The risks of abuse arise from the system's opacity and susceptibility to discretionary enforcement, enabling regimes to target perceived threats selectively; for instance, by 2021, blacklists under SCS frameworks had restricted millions from rail and air travel for infractions ranging from minor debts to spreading "rumors" that challenge official narratives, fostering preemptive self-censorship among citizens. In corporate variants, such as the Corporate Social Credit System formalized in 2018, algorithms evaluate business compliance with party directives, nudging firms toward policy adherence through score-linked penalties like license or credit denials, which reinforces economic dependence on state approval and discourages diverging from regime goals. While proponents, including some Chinese officials, frame these tools as promoting societal trust, independent analyses highlight how they erode individual agency by tying everyday actions to perpetual evaluation, potentially entrenching authoritarian stability through normalized compliance rather than genuine consent.
Export of such technologies amplifies global risks, as Chinese firms develop persuasive platforms—incorporating AI personalization and gamified loyalty mechanics—that authoritarian allies could adapt for domestic control, such as tailoring propaganda feeds to preempt dissent or incentivize informant networks. Reports from strategic analysts warn that unchecked proliferation, as seen in commercial apps mandated to align with Chinese Communist Party objectives, could enable "digital authoritarianism" where subtle influences supplant democratic deliberation, with empirical precedents in SCS pilots demonstrating compliance rates exceeding 90% in monitored locales through iterative feedback loops. This asymmetry of power, unmitigated by independent oversight, underscores causal vulnerabilities: regimes with monopolistic data access can iteratively refine manipulative algorithms, sidelining first-principles accountability in favor of outcome-engineered obedience.

Resistance and Backlash from Users

Users exhibit resistance to persuasive technologies through psychological reactance, a motivational state triggered when individuals perceive threats to their behavioral freedoms, leading to efforts to restore those freedoms such as counter-arguing, dismissal, or avoidance of the technology. Psychological reactance theory posits that overt persuasive tactics, including directive notifications or gamified rewards perceived as controlling, amplify this response and diminish compliance; for instance, forceful language in app prompts like "you must complete this task" heightens reactance compared to subtle cues. Empirical studies in digital contexts confirm that reactance manifests physiologically as increased arousal and negative cognitions, reducing engagement with persuasive apps or interfaces. Backlash intensifies when users detect manipulative intent, prompting disengagement or active countermeasures like disabling features or uninstalling applications. In gamified systems, such as productivity apps employing streaks and badges, overuse leads to perceived manipulation, fostering fatigue and reversion to non-gamified alternatives; surveys indicate users abandon these when rewards feel extrinsic and autonomy-eroding rather than intrinsically motivating. Countermeasure strategies include selective exposure avoidance, where users curate feeds or install blockers to evade algorithmic nudges, as documented in analyses of persuasive app interactions. Digital nudges, such as default settings or timed prompts, provoke backlash when viewed as profit-driven manipulations, with evidence showing users revert to prior behaviors to counteract perceived overreach, particularly on e-commerce or social platforms. This user-driven resistance underscores limitations in long-term efficacy, as repeated exposure erodes trust and amplifies skepticism toward technology-mediated influence attempts.

Integration with AI, VR, and Advanced Tech

Artificial intelligence systems have demonstrated superior persuasive capabilities compared to human interactants in controlled debates, with models like OpenAI's GPT-4 achieving higher rates of viewpoint acceptance among participants. This stems from such models' ability to generate tailored arguments leveraging vast datasets on cognitive biases and rhetorical strategies, enabling scalable persuasion that adapts in real time to responses. For instance, experiments show large language models producing attitude-shifting messages more effectively than generic or human-crafted ones, particularly when fine-tuned on individual profiles, raising prospects for applications in health nudges and education but also amplifying risks of manipulation through propaganda-like content. Peer-reviewed analyses indicate AI can exploit heuristics such as authority cues at individual levels, potentially eroding autonomy when deployed in recommendation algorithms or chat interfaces. Virtual reality (VR) extends persuasive technology via immersive simulations that foster presence and empathy induction, outperforming traditional media in altering behaviors like phobia mitigation or prosocial actions. In health domains, VR environments simulate scenarios to reinforce habits, such as virtual exposure therapy yielding measurable reductions in anxiety symptoms through repeated, controlled persuasive cues. Educational applications integrate captology principles by embedding behavioral prompts within gamified VR, where users' actions trigger feedback loops modeled on Fogg's Behavior Model, enhancing retention and motivation as evidenced by improved performance in anatomy training sessions. These integrations leverage VR's sensory fidelity to bypass rational resistance, creating "presence" effects that amplify persuasive depth, though empirical studies note variability based on user immersion levels and hardware accessibility. Advanced technologies like ambient computing and neurointerfaces are poised to deepen persuasive impacts by embedding nudges into everyday environments, with generative AI interfacing with neurotech for bias-targeted interventions.
Projections from 2023-2025 highlight hybrid systems combining AI analytics with overlays in augmented reality (AR), enabling context-aware persuasion—such as real-time habit prompts via wearables—that could scale ethical applications like sustainability campaigns while necessitating safeguards against coercive uses. Empirical validations remain nascent, with trials showing 20-30% uplifts in compliance rates for personalized nudges, underscoring a trajectory toward "hypersuasion" frameworks that prioritize oversight over unchecked deployment.

Policy and Regulatory Debates

Regulatory efforts targeting persuasive technology have intensified in response to concerns over manipulative user interfaces, often termed "dark patterns," which exploit cognitive biases to influence decisions such as subscriptions or data sharing. In the United States, the Federal Trade Commission (FTC) has led enforcement, issuing a 2022 staff report documenting dark patterns' prevalence and deceptive impacts, followed by actions against companies for complicating cancellations, including a $30.8 million settlement in 2023. The FTC's 2021 policy statement explicitly warns against dark patterns that trick consumers into unintended actions, emphasizing violations of Section 5 of the FTC Act prohibiting unfair or deceptive practices. Internationally, the FTC collaborated with networks like ICPEN and GPEN in 2024 to review subscription dark patterns across 30 jurisdictions, identifying widespread issues like hidden fees and cancellation hurdles. In the European Union, the Digital Services Act (DSA), effective from 2024, mandates that platforms mitigate systemic risks from persuasive designs, prohibiting practices that impair user autonomy, such as those nudging minors toward excessive engagement. Article 28(1) requires age-appropriate interfaces free of manipulative elements like infinite scrolling or urgency prompts, with 2025 Commission guidelines clarifying enforcement for very large online platforms (VLOPs). Proposals for a Digital Fairness Act, under consultation in 2025, extend scrutiny to addictive designs beyond economic harms, aiming to curb data exploitation and compulsive engagement. These measures build on the Unfair Commercial Practices Directive, which deems dark patterns unfair if they materially distort consumer conduct. Debates persist over distinguishing harmful dark patterns from benign nudges, with critics arguing that broad regulations risk conflating legitimate persuasion—such as health app reminders—with manipulation, potentially stifling innovation without clear evidence of net harm.
Proponents, including regulators, cite empirical studies showing dark patterns increase unwanted subscriptions by up to 200% in lab settings, justifying intervention to preserve consumer autonomy, though enforcement challenges arise from subjective interpretations and platform circumvention. Opponents highlight paternalistic overreach, arguing that user education or market competition may suffice, although the failure of voluntary industry codes to curb abuses undercuts that position. In authoritarian contexts, similar tools raise fears of amplified state control, prompting calls for tech-neutral rules over sector-specific bans.

Opportunities for Individual Empowerment

Persuasive technologies empower individuals by providing interactive systems that facilitate self-monitoring, goal-setting, and personalized nudges to align behaviors with personal objectives, thereby enhancing autonomy in habit formation and decision-making. B.J. Fogg's Behavior Model posits that behavior change requires the simultaneous convergence of motivation, ability, and effective prompts, allowing users to leverage technology for targeted self-persuasion rather than external coercion. Empirical reviews indicate that such tools, including mobile apps and wearables, enable users to track progress and receive timely feedback, fostering intrinsic motivation for sustained improvement. In physical activity promotion, persuasive strategies like reminders, rewards, and self-tracking have demonstrated high efficacy; a review of 170 studies reported an 80% success rate in increasing activity or reducing sedentary behavior, with some individual strategies succeeding in 79% and 90% of their applications. For instance, wearable interventions have boosted weekly physical activity by approximately one hour on average, equipping users with data-driven insights to refine their routines independently. These outcomes underscore PT's role in democratizing access to behavior change tools, particularly for adults, where success rates reached 87%. Digital mental health applications further illustrate empowerment through persuasive design, incorporating principles such as tunneling (guiding users through sequential actions) and rehearsal (simulating target behaviors) to support self-management. A meta-analysis of 92 randomized controlled trials involving 16,728 participants found these apps yielded moderate improvements in depression, anxiety, and stress symptoms (Hedges' g = 0.43), with engagement metrics like module completion rates ranging from 30.95% to 98%. By enabling personalized interventions, PT thus aids individuals in cultivating self-efficacy and adaptive habits without reliance on traditional care structures.
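The Fogg Behavior Model's core claim—that a behavior occurs only when motivation, ability, and a prompt coincide above an activation threshold—can be sketched as a simple predicate. This is an illustrative sketch, not Fogg's formal specification: the threshold value and the multiplicative combination of motivation and ability are assumptions chosen to show the compensatory trade-off the model describes.

```python
from dataclasses import dataclass

@dataclass
class BehaviorContext:
    """Snapshot of a user's state at the moment a prompt might fire."""
    motivation: float  # 0.0 (none) to 1.0 (high)
    ability: float     # 0.0 (very hard) to 1.0 (trivially easy)
    prompted: bool     # was a trigger/prompt delivered at this moment?

# Illustrative activation threshold; not a value from Fogg's work.
ACTIVATION_THRESHOLD = 0.25

def behavior_occurs(ctx: BehaviorContext) -> bool:
    """Fogg-style check: the behavior fires only if a prompt arrives while
    the product of motivation and ability clears the activation threshold.
    The multiplicative form (an assumption here) models the trade-off:
    high ability (a tiny, easy habit) can offset low motivation, and
    vice versa, but neither matters without a well-timed prompt."""
    if not ctx.prompted:
        return False  # no prompt, no behavior, regardless of motivation
    return ctx.motivation * ctx.ability >= ACTIVATION_THRESHOLD

# A timely prompt for an easy action succeeds despite modest motivation...
print(behavior_occurs(BehaviorContext(motivation=0.4, ability=0.9, prompted=True)))   # True
# ...while the same prompt for a harder action fails.
print(behavior_occurs(BehaviorContext(motivation=0.4, ability=0.3, prompted=True)))   # False
```

In this framing, the "simplify actions" advice from the empirical literature corresponds to raising `ability` so that even weak motivation clears the threshold when a prompt arrives.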

References

  1. [1]
    Persuasive Technologies - Communications of the ACM
    May 1, 1999 · A persuasive computing technology is a computing system, device, or application intentionally designed to change a person's attitudes or behavior in a ...
  2. [2]
    Persuasive Technology - 1st Edition - Elsevier Shop
    Free delivery 30-day returnsFogg has coined the phrase "Captology"(an acronym for computers as persuasive technologies) to capture the domain of research, design, and applications of ...
  3. [3]
    Fogg Behavior Model - BJ Fogg
    The Fogg Behavior Model shows that three elements must converge at the same moment for a behavior to occur: Motivation, Ability, and a Prompt.Motivation · Ability · References · PromptsMissing: official | Show results with:official
  4. [4]
    Do Persuasive Technologies Persuade? - A Review of Empirical ...
    Aug 7, 2025 · In broad terms, the theory is found to be well supported by empirical evidence. ... research and design of ambient persuasive technology. View.
  5. [5]
    Behavior Design Lab
    ### Extracted Information
  6. [6]
    Two ethical concerns about the use of persuasive technology ... - NIH
    The two ethical concerns are: not adequately considering users' interests and needs, and securing user autonomy and respecting their right to make their own ...
  7. [7]
    The Invisible, Manipulative Power of Persuasive Technology
    May 14, 2014 · Persuasive technology, which can take the form of apps or websites, marries traditional modes of persuasion—using information, incentives, and ...Missing: definition | Show results with:definition
  8. [8]
    The Ethical Use of Persuasive Technology - Behavior Design Lab
    The lab emphasizes an ethical approach to persuasive tech, teaching about it since 1997, and hopes designers focus on positive change.Missing: controversies | Show results with:controversies
  9. [9]
    Persuasive Technology: Using Computers to Change What We ...
    Persuasive technology is defined as the use of interactive systems to influence beliefs, attitudes, and behaviours without coercion (Fogg 2002) .
  10. [10]
    Persuasive Technologies - ACM Digital Library
    This triad is helpful for understanding persuasive tech- nologies because it shows how computers use various techniques to influence people's attitudes and ...<|separator|>
  11. [11]
    How Scholars Define Captology - Google Sites
    Fogg, B. (1997). Captology: the study of computers as persuasive technologies. Paper presented at the CHI'97 extended abstracts on Human factors in computing ...<|control11|><|separator|>
  12. [12]
    Definition of captology | PCMag
    (Computers As Persuasive TechnOLOGY) Captology refers to using computers to change people's attitudes and behavior. With regard to e-commerce, for instance, ...
  13. [13]
    The Functional Triad of Persuasion - ResearchGate
    The so- called functional triad is one way of classifying the viewing or responding to computing technologies: as tools, media, or social actors.
  14. [14]
    Chapter 2: The Functional Triad Computers in Persuasive Roles
    The functional triad is a framework that illustrates the three roles computing technology can play: tool, media, and social actor.
  15. [15]
    The Power of Persuasion (“Captology”) in the Age of AI and ...
    May 21, 2023 · In this book, Professor Fogg used the term “Captology” (which stands for computers as persuasive technologies) to describe an ecosystem in which ...
  16. [16]
    Captology: the study of computers as persuasive technologies
    Captology: the study of computers as persuasive technologies. Author: BJ Fogg. BJ Fogg Sun Microsystems, 901 San Antonio Road, MPK17-105, Palo Alto, CA.
  17. [17]
    [PDF] Persuasive Computers: Perspectives and Research Directions
    Functional Triad: What functional areas will the persuasive technology leverage? Should it focus on computer as tool, medium, social actor, or combinations ...
  18. [18]
    BJ Fogg - Behavior Design Lab - Stanford University
    BJ wrote a seminal book, Persuasive Technology: Using Computers to Change What We Think and Do, about how computers can be designed to influence attitudes ...
  19. [19]
    Tapping the Powers of Persuasion - MIT Technology Review
    Oct 4, 2010 · The Captologist: Stanford's B. J. Fogg coined the term “captology,” for the study of “computers as persuasive technology.” In his role as ...Missing: definition key
  20. [20]
    Persuasive Technology 101 | Time To Log Off
    Apr 4, 2022 · Persuasive technology was pioneered largely by one man, Professor BJ Fogg, at Stanford University in the late 1990s. He began formulating ...
  21. [21]
    Persuasive Design: New Captology Book - NN/G
    Mar 2, 2003 · Fogg has now opened the field's next frontier with his work on "captology" -- computers as persuasive technologies. Persuasion in itself is ...
  22. [22]
    Do persuasive designs make smartphones more addictive?
    This study provides evidence to argue that persuasive designs contribute to problematic smartphone use, potentially making smartphones more addictive.
  23. [23]
    Hypersuasion – On AI's Persuasive Power and How to Deal with It
    May 17, 2024 · Through such an approach, the power of AI as a persuasive technology can be used to support better views, decisions, and behaviours, while ...
  24. [24]
    A meta-analysis of persuasive design, engagement, and efficacy in ...
    Apr 29, 2025 · This systematic review and meta-analysis examined the efficacy of digital mental health apps and the impact of persuasive design principles on intervention ...
  25. [25]
    Persuasive technologies in China: Implications for the future ... - ASPI
    Nov 26, 2024 · Emerging persuasive technologies such as generative artificial intelligence (AI), ambient technologies and neurotechnology interact with the ...
  26. [26]
    Industry News 2024 Persuasive Technology The ... - ISACA
    Sep 4, 2024 · While persuasive technology can lead to unintended and unethical consequences if misused, it can offer numerous benefits when used appropriately ...Missing: controversies | Show results with:controversies<|separator|>
  27. [27]
    Fogg Behavior Model
    The Fogg Behavior Model shows that three elements must converge at the same moment for a behavior to occur: Motivation, Ability, and a Prompt.
  28. [28]
    BJ Fogg, PhD - BJ Fogg - Behavior Scientist & Author of Tiny Habits
    In 2002, I published a book entitled, Persuasive Technology, about how computers can be designed to influence attitudes and behaviors. At the time of ...Expert Training · Online Course · Stanford · Use My WorkMissing: PDF official
  29. [29]
    An empirical evaluation of a persuasive mobile interface: a case ...
    A growing number of mobile interfaces are now designed to be persuasive with respect to the Fogg Behaviour Model (FBM). This work carried out a structured ...
  30. [30]
    Use of a Practitioner-Friendly Behavior Model to Identify Factors ...
    We illustrate the use of the Fogg Behavior Model (FBM), a model identified as being easy for practitioners to adopt in low-resource settings.
  31. [31]
    Behavioral science meets public health: a scoping review of the fogg ...
    Oct 14, 2025 · Among these, the Fogg Behavior Model (FBM), developed by Dr. BJ Fogg, offers a specific and valuable framework for facilitating behavior change.
  32. [32]
    Theory-based habit modeling for enhancing behavior prediction in ...
    A similar idea was presented in B.J. Fogg's behavior model for persuasive design that even with sufficient motivation and ability to perform a behavior, a ...
  33. [33]
    Optimizing COVID Vaccination Rates - Behavior Design Lab
    All human behaviors are a function of three components: Motivation, Ability, and Prompts. These comprise the Fogg Behavior Model. B MAP. Like all behaviors, the ...Missing: MAT explanation
  34. [34]
    The outcomes and the mediating role of the functional triad: The ...
    Mar 23, 2018 · B.J. Fogg's Functional Triad shows the manner in which computing technologies can persuade people by playing 3 different functional roles ...
  35. [35]
    The Functional Triad Computers in Persuasive Roles - Flylib.com
    The functional triad is a framework that illustrates the three roles computing technology can play: tool, media, and social actor.<|separator|>
  36. [36]
    "Persuasive Systems Design" by Harri Oinas-Kukkonen and Marja ...
    It discusses the process of designing and evaluating persuasive systems and describes what kind of content and software functionality may be found in the final ...Missing: classification | Show results with:classification
  37. [37]
    Persuasive technologies design for mental and behavioral health ...
    May 16, 2024 · Find relevant examples of persuasive technology: Studying established examples provides insights into effective strategies and minimizes ...
  38. [38]
    ComTech: Towards a unified taxonomy of persuasive techniques for ...
    The objective of this conceptual paper is to propose a unified taxonomy of persuasive techniques for persuasive technology design, called “ComTech,”
  39. [39]
    A systematic literature review on persuasive technology at the ...
    Aug 12, 2022 · This article provides a structured literature review based on recent research on persuasive technology in the workplace.
  40. [40]
    Persuasive system design: state of the art and future directions
    This paper provides an overview of the current state of the art in persuasive systems design. All peer-reviewed full papers published at the first three ...
  41. [41]
    (PDF) Persuasive Systems Design: Key Issues, Process Model, and ...
    Aug 9, 2025 · The Persuasive Systems Design (PSD) framework is a key approach for designing and evaluating BCSS (Oinas-Kukkonen & Harjumaa, 2009 ). These ...
  42. [42]
    Persuasive systems design: Key issues, process model, and system ...
    It discusses the process of designing and evaluating persuasive systems and describes what kind of content and software functionality may be found in the final ...
  43. [43]
    [PDF] Review and Discussion of the Persuasive Systems Design Model
    The Persuasive Systems Design (PSD) model by Oinas-Kukkonen and Harjumaa (2009) is the most referenced model for designing Behavior Change Support Systems ...
  44. [44]
    [PDF] Persuasive Systems Design and Self Determination Theory
    May 5, 2025 · These categories are Primary Task Support (including persuasive software features such as Reduction, Self- monitoring and Rehearsal), Dialogue ...
  45. [45]
    Persuasive System Design Does Matter: A Systematic Review of ...
    This study aims to review the literature on web-based health interventions to investigate whether intervention characteristics and persuasive design affect ...
  46. [46]
    A TAXONOMY OF STRATEGIES FOR MULTIMODAL PERSUASIVE ...
    Feb 14, 2007 · In this article, a taxonomy of persuasive strategies and the meta-reasoning model that works on this taxonomy is described. The taxonomy is ...
  47. [47]
    Healthy Living with Persuasive Technologies: Framework, Issues ...
    This paper reviews persuasive technologies, an area of research that has the potential to assist in health related prevention services.
  48. [48]
    Tunneling - Isis Information Services
    Fogg's Principle of Tunneling: “Using computing technology to guide users through a process or experience provides opportunities to persuade along the way.”.
  49. [49]
    [PDF] An Exploration of the Ethics of Persuasive Technology
    The pervasiveness of technology comes with an increased risk for negative outcomes. Responsibility must fall upon designers to anticipate problems and ...Missing: controversies | Show results with:controversies
  50. [50]
    Persuasive Design Solutions for a Sustainable Workforce - NIH
    Feb 7, 2022 · Tailoring (57%), self-monitoring (43%), and tunneling (14%) were found to be used as persuasion strategies in studies included in the review and ...
  51. [51]
    Arranging to Persuade: Tailoring or Persuasion Through ... - IsisInBlog
    Jan 5, 2011 · Fogg's Principle of Tailoring: Information provided by computing technology will be more persuasive if it is tailored to the individual's needs, ...
  52. [52]
    Tailoring design pattern
    Tailor information to users individually. Content will be more persuasive if it is tailored to the individual needs, interests, personality, or usage context.
  53. [53]
    Persuasive System Design Principles and Behavior Change ...
    Jun 21, 2019 · The identified core techniques and principles mentioned were ... Persuasive Technology: Using Computers To Change What We Think And Do.
  54. [54]
    [PDF] Selecting effective persuasive strategies in Behavior Change ...
    Nakata, "Bibliographic Analysis of Persuasive Systems: Techniques, Methods and Domains of Application," in 7th International Conference on Persuasive Technology ...
  55. [55]
    [PDF] Toward a Systematic Understanding of Suggestion Tactics in ...
    Fogg identifies seven strategies for persuasive technology tools: reduction, tunneling, tailoring, suggestion, self-monitoring, surveillance, and conditioning ...<|separator|>
  56. [56]
    [PDF] Evaluating the use of persuasive technology for guidance and ...
    Dec 15, 2020 · These include the persuasive design strategies. (reduction, tunneling, tailoring ... The strategies from persuasive technology used in this ...
  57. [57]
    [PDF] Systematic Literature Review on Persuasive System Design ... - JOIV
    Jan 31, 2025 · tunneling is a successful technique for assisting people in changing ... Oinas-Kukkonen, "Tailoring persuasive technology: A systematic ...<|separator|>
  58. [58]
    Persuasive Technology [Book] - O'Reilly
    Fogg reveals how Web sites, software applications, and mobile devices can be used to change people's attitudes and behavior. Technology designers, marketers, ...Missing: core terminology
  59. [59]
    [PDF] How to Motivate & Persuade Users B.J. Fogg - CHI 2003
    Apr 6, 2003 · conditioning, such as reinforcement and shaping, to change behaviors ... Mobile Persuasion. Page 22. CHI 2003. 68. BJ Fogg. #68. What's ...
  60. [60]
    [PDF] Gamification, interdependence, and the moderating effect of ...
    Reinforcement mechanisms such as points used in gamification are found in design frameworks for persuasive technology, the use of digital technologies to modify.
  61. [61]
    Towards understanding the mechanism through which reward and ...
    Apr 10, 2023 · ... Persuasive Technology (PT) is any tech-. nology, such as ... Research shows that encouragement and reinforcement mechanisms can ...
  62. [62]
    Persuasive Performance Feedback: The Effect of Framing on Self ...
    In this section, we provide background on self-monitoring technologies designed for health behavior change, with an emphasis on the feedback they provide. In ...
  63. [63]
    Persuasive Performance Feedback: The Effect of Framing on Self ...
    Nov 16, 2013 · This work provides empirical guidance for creating persuasive performance feedback, thereby helping people designing self-monitoring ...
  64. [64]
    Impact of feedback generation and presentation on self-monitoring ...
    Jan 4, 2024 · The primary aim of the current study was to systematically review and, if possible, meta-analyze self-monitoring interventions that use feedback ...
  65. [65]
    Using feedback through digital technology to disrupt and change ...
    Digital technologies offer an unprecedented chance to facilitate self-monitoring by delivering feedback on undesired habitual behavior. ... feedback AND ...
  66. [66]
    [PDF] Culturally-Relevant Persuasive Technology
    Persuasive technology (PT) has been defined by B. J. Fogg as “any interactive com- puting system designed to change people's attitudes or behaviors”.<|control11|><|separator|>
  67. [67]
    Persuasive System Design: Social Support Elements to Influence ...
    Five PSD elements in social support were identified. They are: social learning, social facilitation, social comparison, recognition and normative influence. The ...
  68. [68]
    Susceptibility to social influence strategies and persuasive system ...
    This study examines how these influence strategies relate to the persuasive systems design (PSD) model constructs implemented in commercial mobile fitness ...
  69. [69]
    (PDF) Persuasive software design patterns for social influence
    Aug 6, 2025 · This article describes software design techniques for social influence as software design patterns, instantiating social influence features ...
  70. [70]
    Social Proof in the User Experience - NN/G
    Oct 19, 2014 · Social proof is a psychological phenomenon where people reference the behavior of others to guide their own behavior.Examples of Social-Proof... · Background of the Social...
  71. [71]
    Dr. Robert Cialdini's Seven Principles of Persuasion | IAW
    The First Universal Principle of Influence is Reciprocity. Simply put, people are obliged to give back to others the form of a behavior, gift, or service that ...
  72. [72]
    The Reciprocity Principle: Give Before You Take in Web Design
    Feb 16, 2014 · The reciprocity principle says that people respond in kind to nice behavior. If you want your users to trust you with their information and come ...
  73. [73]
    Persuasive technology for health and wellness: State-of-the-art and ...
    Persuasive Technology (PT) are interactive systems designed to aid and motivate people to adopt behaviors that are beneficial to them and their community while ...
  74. [74]
    Trends in Persuasive Technologies for Physical Activity ... - Frontiers
    Persuasive technology (PT) is increasingly being used in the health and wellness domain to motivate and assist users with different lifestyles and ...
  75. [75]
    Efficacy of interventions that use apps to improve diet, physical ...
    Dec 7, 2016 · This review provided modest evidence that app-based interventions to improve diet, physical activity and sedentary behaviours can be effective.
  76. [76]
    Behavior Change Effectiveness Using Nutrition Apps in People With ...
    The results suggest that mHealth apps involving nutrition can significantly improve health outcomes in people with chronic diseases.
  77. [77]
    The Effectiveness of Smartphone App–Based Interventions for ...
    Apr 20, 2023 · Objective: The purpose of this study was to synthesize the evidence for the effectiveness of smartphone app–based interventions for smoking ...
  78. [78]
    Digital technologies for behavioral change in sustainability domains
    Jan 2, 2024 · Digital tools, like serious games, mobile apps, and smart meters, can have the potential to inspire sustainable behaviors and maintain them over ...Introduction · Method · Results and discussion · Conclusion
  79. [79]
    Persuasive Apps for Sustainable Waste Management - Frontiers
    This article provides a systematic evaluation of mobile apps for sustainable waste management to deconstruct and compare the persuasive strategies employed and ...<|separator|>
  80. [80]
    A survey of empirical studies on persuasive technologies to promote ...
    Persuasive technology is the application of technology to change human behavior or attitude or both. As applied to sustainable Human Computer Interaction ...
  81. [81]
    Sustainable mobility persuasion via smartphone apps
    Most of such apps adopt a gamified approach and motivate behaviour change through external extrinsic motivational factors such as real-life prizes, that are ...
  82. [82]
    Developing persuasive systems for marketing: the interplay of ...
    Aug 12, 2023 · The study provides a condensed taxonomy of techniques and offers examples to guide the development of effective persuasive messages. Furthermore ...
  83. [83]
    How Persuasive Is Personalized Advertising? A Meta-Analytic ...
    Personalized advertising is generally more effective than generic, non-personalized advertising in influencing overall persuasion, consumer attitudes and ...
  84. [84]
    (PDF) Persuasive Technology and Users Acceptance of E-commerce:
    Mar 6, 2015 · The study proposes a theoretical model that considers the effect of persuasive technologies on consumer acceptance of e-commerce websites; with ...
  85. [85]
    Addictive Features of Social Media/Messenger Platforms and ... - NIH
    Jul 23, 2019 · Many design elements can be found in social media apps and Freemium games prolonging app usage. It is the aim of the present work to analyze several prominent ...
  86. [86]
    exploring the persuasive design elements of instagram
    This study aims to fill this gap by assessing the integration of persuasive technological strategies within Instagram, focusing on behaviour change and user ...
  87. [87]
    (PDF) The design of social media platforms—Initial evidence on ...
    Nov 13, 2022 · Initial evidence on relations between personality, fear of missing out, design element-driven increased social media use, and problematic social media use.
  88. [88]
    Persuasive technology for enhanced learning behavior in higher ...
    Apr 30, 2019 · This study uses Web 2.0 features and a persuasive system to improve learning behavior in online learning, aiming to enhance user intention.
  89. [89]
    5 App Gamification Examples You Must Copy Today - BlueThrone
    Duolingo, Habitica, Nike Run Club, Forest, and Snapchat use: XP, badges, avatars, streaks, leaderboards. Emotional triggers like progress, social proof, and ...
  90. [90]
    Stanford's School Of Persuasion: BJ Fogg On How To Win Users ...
    Dec 4, 2012 · As the founder of Stanford's Persuasive Technology Lab, he focuses on “methods for creating habits, showing what causes behavior, automating behavior change, ...Missing: history | Show results with:history
  91. [91]
    Goldilocks conditions for workplace gamification: how narrative ...
    Goldilocks conditions for workplace gamification: how narrative persuasion helps manufacturing workers create self-directed behaviors.
  92. [92]
    Game on: Can gamification enhance productivity? - PMC
    Jul 12, 2023 · Research suggests that gamification can increase work engagement by providing employees with a sense of autonomy, competence, and relatedness.Missing: persuasive | Show results with:persuasive
  93. [93]
    The relationship between persuasive technology principles ...
    There is a relationship between the number of persuasive technology principles and the effectiveness of web-based interventions concerning mental health, ...
  94. [94]
    Persuasive Technology | Guide books - ACM Digital Library
    B.J. Fogg, director of the Persuasive Technology Lab at Stanford University. Fogg has coined the phrase "Captology"(an acronym for computers as persuasive ...Missing: origins | Show results with:origins
  95. [95]
    A Systematic Review of Digital Behaviour Change Interventions for ...
    May 8, 2019 · It can be argued that digital behaviour change interventions are not enough to achieve reduced environmental impact. Many strategies are based ...
  96. [96]
    Systematic review and meta analysis of standalone digital behavior ...
    Jul 14, 2025 · A growing body of evidence suggests that DBCIs are effective in promoting PA, reducing SB, and improving overall health outcomes. By providing a ...<|separator|>
  97. [97]
    [PDF] A Behavior Model for Persuasive Design
    As I see it, persuasive technology is fundamentally about learning to automate behavior change.Missing: terminology | Show results with:terminology
  98. [98]
    The Intersection of Persuasive System Design and Personalization ...
    Sep 14, 2022 · Persuasive technology is an umbrella term that encompasses any software (eg, mobile apps) or hardware (eg, smartwatches) designed to influence ...Consumer And Patient... · Methods · Discussion
  99. [99]
    A systematic literature review on persuasive technology at the ...
    This article provides a structured literature review based on recent research on persuasive technology in the workplace.
  100. [100]
    Patient education, nudge, and manipulation: defining the ethical ...
    Apr 4, 2016 · This distinction between persuasion and manipulation was also demonstrated by Dubov; persuasion is a form of influence where one person ...
  101. [101]
    [PDF] The Ethics of Persuasion in Technology
    Persuasive tech, designed to maximize user engagement, degrades freedom and autonomy, causing distraction and undermining well-being.Missing: controversies | Show results with:controversies
  102. [102]
    [PDF] When Persuasive Technology Gets Dark? - DiVA portal
    Fogg defined persuasion in the context of persuasive computers as “an attempt to shape, reinforce, or change behaviors, feelings, or thoughts about an issue, ...
  103. [103]
    [PDF] Layered Analysis of Persuasive Designs: A Framework for ...
    dark patterns, persuasive design, user experience design, user autonomy, ethics 1 ... Neuenschwander, Toward an ethics of persuasive technology,. Communications ...
  104. [104]
    [PDF] Surveillance based Persuasion: The Good, the Bad and the Ugly
    Keywords: Persuasive Technology, Surveillance, Ethics, Freedom, Autonomy, Policy. Abstract: Surveillance-based persuasive technologies have become ...
  105. [105]
    On the privacy of mental health apps: An empirical investigation and ...
    This paper reports an empirical study aimed at systematically identifying and understanding data privacy incorporated in mental health apps.
  106. [106]
    How private is your mental health app data? An empirical study of ...
    The purpose of this project was to identify salient consumer issues related to privacy in the mental health app market and to inform advocacy efforts.
  107. [107]
    Shoshana Zuboff: 'Surveillance capitalism is an assault on human ...
    Oct 4, 2019 · What began as advertising is now a threat to freedom and democracy argues the author and scholar. Time to wake up - and fight for a different digital future.
  108. [108]
    Computational persuasion technologies, explainability, and ethical ...
    This paper conducts a systematic literature review (SLR) to evaluate the effectiveness of computational persuasion technology (CPT) in the eHealth domain.Missing: 2020s | Show results with:2020s
  109. [109]
    Social Media and Mental Health: Benefits, Risks, and Opportunities ...
    In this commentary, we consider the role of social media as a potentially viable intervention platform for offering support to persons with mental disorders, ...
  110. [110]
    [PDF] Teen Mental Health Is Plummeting, and Social Media is a Major ...
    May 4, 2022 · Teen Mental Health Is Plummeting, and. Social Media is a Major Contributing Cause. Testimony of Jonathan Haidt. Professor of Ethical Leadership ...
  111. [111]
    The Evidence - The Anxious Generation
    Is heavy smartphone and social media use during adolescence harmful to adolescent mental health and results in increased incidence of internalizing disorders, ...
  112. [112]
  113. [113]
    Is the social media creating an anxious youth?—international call for ...
    Aug 28, 2024 · The author observed a sudden increase in mental illness among youth, with the rise showing up in change in behavior as well, including self-harm ...
  114. [114]
    How do social media feed algorithms affect attitudes and behavior in ...
    Jul 27, 2023 · The notion that such algorithms create political “filter bubbles” (1), foster polarization (2, 3), exacerbate existing social inequalities (4, 5) ...
  115. [115]
    Social Drivers and Algorithmic Mechanisms on Digital Media - PMC
    Current evidence is consistent with the view that digital media as a whole, including algorithms, fuels perceived polarization by making extremist voices more ...
  116. [116]
    How social media shapes polarization - ScienceDirect.com
    This article reviews the empirical evidence on the relationship between social media and political polarization. We argue that social media shapes ...
  117. [117]
    How tech platforms fuel U.S. political polarization and what ...
    Sep 27, 2021 · Platforms like Facebook, YouTube, and Twitter likely are not the root causes of political polarization, but they do exacerbate it.
  118. [118]
    The Algorithmic Management of Polarization and Violence on Social ...
    Aug 22, 2023 · There is now good evidence, from multiple methods and perspectives, that social media platforms have had negative effects on societal conflict ...
  119. [119]
    (PDF) When Persuasive Technology Gets Dark? - ResearchGate
    In theory, unintended consequences of persuasive systems use may seem like a marginal error, but in reality, such consequences can cause long-term and harmful ...
  120. [120]
    The effectiveness of nudging: A meta-analysis of choice architecture ...
    Our results show that choice architecture interventions overall promote behavior change with a small to medium effect size of Cohen's d = 0.45.Missing: digital | Show results with:digital
  121. [121]
    [PDF] RCTs to Scale: Comprehensive Evidence from Two Nudge Units
    In papers published in academic journals, the average impact of a nudge is very large – an 8.7 percentage point take-up effect, a 33.5% increase over the ...Missing: digital | Show results with:digital
  122. [122]
    a systematic review and meta-analysis of randomized controlled trials
    Sep 25, 2024 · Gamified interventions compared to a non-gamified control resulted in trivial increases in steps, and reductions in BMI and body weight, and ...
  123. [123]
    Effects of Gamification on Behavioral Change in Education - NIH
    Mar 29, 2021 · The meta-analysis has shown that gamification affects learners' positive behavioral change, but there are limitations explaining its impact on ...
  124. [124]
  125. [125]
    The fitness of apps: a theory-based examination of mobile ... - NIH
    Jan 30, 2017 · By understanding app usage, guidelines can be developed to create apps based in health behavior research to promote long-term physical activity.
  126. [126]
    Message Design Choices Don't Make Much Difference to ...
    Jun 29, 2021 · ... failures should be expected to occur routinely in persuasion message effects research. ... Power failure: Why small sample size undermines ...
  127. [127]
    The Social Credit System: Not Just Another Chinese Idiosyncrasy
    May 1, 2020 · China, which is planning to launch the social credit system (SCS), a data-powered project to monitor, assess, and shape the behavior of all citizens and ...
  128. [128]
    [PDF] How to Make the Perfect Citizen? Lessons from China's Social ...
    Nov 5, 2021 · China's Social Credit System is a unique case as it represents one of the most ambitious attempts in history to use sociotechnical means to ...
  129. [129]
    A Comparative Analysis of Two Data-Driven Mechanisms for ...
    Mar 12, 2025 · This paper aims to examine how nudging and the SCS are strategically designed in different ways to influence individuals' behavior in the public sector.
  130. [130]
    Social Engineering & Political Nudges - Changing the social ...
    Dec 16, 2022 · The Social Credit System is an example of how technology can be used for social engineering, nudging, and the use of behavioral economics.
  131. [131]
    China's Corporate Social Credit System: The Dawn of Surveillance ...
    Jun 21, 2023 · Our analysis underscores the potential of the CSCS to nudge corporate fealty to party-state policy and provides an early window into the far- ...
  132. [132]
    China's Corporate Social Credit System and Its Implications | FSI
    China's corporate social credit system (CSCS) is a data-driven scoring system to rate the “trustworthiness” of all business entities registered in China.
  133. [133]
    China's social credit score – untangling myth from reality | Merics
    The idea that China gives every citizen a “social credit score” continues to capture the horrified imagination of many. But it is more bogeyman than reality.
  134. [134]
    It's not too late to regulate persuasive technologies | The Strategist
    Nov 27, 2024 · The persuasive technologies they use are digital systems that shape users' attitudes and behaviours by exploiting physiological and cognitive ...
  135. [135]
    Psychological Reactance and Persuasive Health Communication
    Psychological reactance theory is a commonly relied upon framework for understanding audience members' resistance to persuasive health messages.
  136. [136]
    Understanding Psychological Reactance: New Developments and ...
    Persuasive messages arouse reactance especially by using forceful and controlling language, such as the terms should, ought, must, and need. This language has ...
  137. [137]
    Psychological Reactance and Persuasive Message Design
    Apr 17, 2020 · Psychological reactance theory (PRT) provides a framework for understanding what not to do when seeking to motivate, influence, and persuade people.
  138. [138]
    Persuasion and Psychological Reactance: the Effects of Explicit ...
    Psychological Reactance Theory (Brehm 1966) accounts for how individuals become aversively aroused when perceived freedoms are threatened by overtly persuasive ...
  139. [139]
    Psychophysiological Measures of Reactance to Persuasive ... - MDPI
    Persuasive interventions can lose their effectiveness when a person becomes reactant to the persuasive messages—a state identified by feelings of anger and ...
  140. [140]
    The perils of gamification: Does engaging with gamified services ...
    Users of gamified apps are more likely to share private information with firms. · Motivational experiences caused by gamified apps cause cognitive absorption.
  141. [141]
    The Power and Pitfalls of Gamification - WIRED
    May 4, 2021 · Today, thanks to science, we know a lot more about when gamification really works, and what its boundaries seem to be. Beyond the gamified apps ...
  142. [142]
    [PDF] Understanding User Resistance Strategies in Persuasive ...
    Nov 16, 2020 · Furthermore, we analyze the relationships between persuasion strategies and persuasion resistance strategies. Our work lays the ground for ...
  143. [143]
    [PDF] An Exploration of When Digital Nudges Unethically Depart
    Therefore, this paper has shown early empirical evidence of the effects that DDNs are having in forcing consumers to revert to a status quo. Contrary to nudge ...
  144. [144]
    The continued controversy of nudging - The Decision Lab
    Nudging is controversial due to its libertarian paternalist philosophy, manipulation concerns, and the field's replication crisis and media scandals.
  145. [145]
    AI can do a better job of persuading people than we do
    May 19, 2025 · OpenAI's GPT-4 is much better at getting people to accept its point of view during an argument than humans are—but there's a catch.
  146. [146]
    The potential of generative AI for personalized persuasion at scale
    Feb 26, 2024 · Specifically, we show that Open AI's ChatGPT is capable of generating personalized persuasion that is effective in shaping people's attitudes ...
  147. [147]
    How Persuasive Is AI-generated Propaganda? - Stanford HAI
    We found that GPT-3 can create highly persuasive text as measured by participants' agreement with propaganda theses.
  148. [148]
    Persuasive Technology and computational manipulation
    Jul 4, 2023 · This term is a portmanteau, standing for “Computers As Persuasive Technology”. It refers to studies regarding the area in which computing, ...
  149. [149]
    (PDF) Video games and virtual reality as persuasive technologies ...
    This paper considers persuasive technology in the form of video games and virtual reality as a means to change attitudes and/or behaviors in the area of health ...
  150. [150]
    [PDF] Video Games and Virtual Reality as Persuasive Technologies for ...
    Captology is an area of research that explores challenges and opportunities of using computing technology for persuasive purposes. Persuasive technology has ...
  151. [151]
    Immersive collaborative virtual reality for case-based graduate ...
    This study pilots a collaborative VR system using real-time CT data for thoracic surgery teaching. Students found it intuitive, effective, and improved anatomy ...
  152. [152]
    Captology in game-based education: a theoretical framework for the ...
    This paper provides a theory-driven framework to guide instructional and game designers through the process of persuasive game design in education.
  153. [153]
    The Art of Persuasion in the Age of AI - Method1
    Jun 24, 2024 · AI can be more persuasive than humans by using data to identify patterns, deep data analysis, audience modeling, and custom messaging.
  154. [154]
    [PDF] Bringing Dark Patterns to Light - Federal Trade Commission
    Sep 1, 2022 · The FTC has been addressing dark patterns through privacy cases and policy work for many years. Workshop panelists noted that dark patterns that ...
  155. [155]
    FTC to Ramp up Enforcement against Illegal Dark Patterns that Trick ...
    Oct 28, 2021 · The Federal Trade Commission issued a new enforcement policy statement warning companies against deploying illegal dark patterns that trick or trap consumers ...
  156. [156]
    FTC, ICPEN, GPEN Announce Results of Review of Use of Dark ...
    Jul 10, 2024 · In 2022, the FTC released a staff report, Bringing Dark Patterns to Light, which detailed a wide range of dark patterns. The Federal Trade ...
  157. [157]
    The EU's Digital Services Act - European Commission
    Oct 27, 2022 · The DSA regulates online intermediaries and platforms such as marketplaces, social networks, content-sharing platforms, app stores, ...
  158. [158]
    The long-awaited EU Guidelines on Article 28(1) DSA - Hogan Lovells
    Aug 5, 2025 · Interfaces must be age-appropriate and avoid persuasive design features such as infinite scroll or urgency cues. Platforms are also required ...
  159. [159]
    [PDF] Regulating dark patterns in the EU: Towards digital fairness
    According to the DSA, these practices are to be prohibited. The recent fitness check of EU consumer law on digital · fairness defines dark patterns as 'unfair ...
  160. [160]
    Digital Fairness Act Unpacked: Addictive Design - Osborne Clarke
    Aug 5, 2025 · In current European regulation, the rules on dark patterns mainly focus on unfair or deceptive practices with a direct economic or data ...
  161. [161]
    “Dark” and “Light” Patterns: When is a Nudge a Problem? - NAI
    Mar 29, 2021 · The challenging task in assessing a dark pattern is differentiating from a nudge seeking to achieve a business purpose that is consistent with the objectives ...
  162. [162]
    [PDF] Dark Patterns as Disloyal Design - Digital Repository @ Maurer Law
    Jul 7, 2025 · Lawmakers have started to regulate “dark patterns,” understood to be design practices meant to influence technology users' decisions through ...
  163. [163]
    [PDF] REGULATING PRIVACY DARK PATTERNS IN PRACTICE ...
    The regulatory objective lens “uses democratically created rules and standards to view when dark patterns cause individual and collective harms such as ...
  164. [164]
    Digital Behavior Change Intervention Designs for Habit Formation
    The results show that the most applied behavior change techniques were the self-monitoring of behavior, goal setting, and prompts and cues.
  165. [165]
    Trends in Persuasive Technologies for Physical Activity and ... - NIH
    Persuasive technology (PT) is increasingly being used in the health and wellness domain to motivate and assist users with different lifestyles and ...