
Center for Humane Technology

The Center for Humane Technology (CHT) is an American nonprofit organization founded in 2018 by Tristan Harris, Aza Raskin, and Randima Fernando to mitigate the unintended harms of digital technologies driven by attention-maximizing business models. Harris, a former Google design ethicist, initiated early efforts through internal memos and the "Time Well Spent" campaign highlighting persuasive design techniques that exploit human psychology for prolonged engagement. Raskin, known for software innovation and advocacy against manipulative interfaces, and Fernando, with experience in tech ethics and mindfulness applications, co-established CHT to institutionalize these concerns beyond individual companies. CHT's core focus involves analyzing misaligned incentives in tech ecosystems—where algorithms prioritize metrics like time spent over user autonomy—and proposing interventions across design, policy, and education to foster technologies that enhance rather than erode human agency and collective decision-making. The group has influenced public discourse through outputs like the podcast Your Undivided Attention, co-hosted by Harris and Raskin, which has exceeded 25 million downloads by examining technology's societal ripple effects. It also supported the 2020 Netflix documentary The Social Dilemma, featuring CHT founders and former tech insiders, which spotlighted mechanisms of behavioral manipulation in platforms and reached tens of millions of viewers. In policy realms, CHT has testified before lawmakers and advocated for regulations targeting addictive features and risks, including recent involvement in litigation exposing designs linked to youth harms. The organization claims contributions to industry shifts, such as adjustments reducing infinite scrolling or notifications to curb compulsive use, though causal attribution remains debated amid broader pressures.
Critics, including tech analysts, contend that CHT's insider-driven reforms overlook deeper structural incentives favoring profit over restraint, potentially yielding superficial changes without enforceable external constraints. Empirical studies on tech's harms, such as correlations between heavy use and adolescent anxiety, support CHT's warnings but often fail to isolate causation from confounding factors like pre-existing vulnerabilities.

Founding and Early History

Origins and Founders

The Center for Humane Technology was established in 2018 as a nonprofit dedicated to mitigating the harmful societal impacts of technology. It was co-founded by Tristan Harris, Aza Raskin, and Randima Fernando, who shared expertise in technology ethics. The organization's origins trace back to Harris's observations during his tenure as a design ethicist at Google in the early 2010s, where he identified how platforms employed "attention-harvesting" techniques that prioritized engagement metrics over user autonomy, focus, and well-being. Harris's pivotal contribution was an internal Google presentation in 2013 titled "A Call to Minimize Distraction & Respect Users' Attention," which argued for redesigning digital products to respect users' time and psychological well-being rather than exploiting vulnerabilities for profit. This document circulated widely within tech circles, sparking the "Time Well Spent" movement, an advocacy effort to shift industry incentives toward technologies that enhance human flourishing. Harris departed Google in December 2015 to formalize these ideas through a nonprofit precursor to CHT, which evolved into the Center for Humane Technology by 2018 with the addition of Raskin and Fernando as co-founders to broaden its scope and operational capacity. Aza Raskin, an interface designer and entrepreneur, brought experience in innovative interface design and critiques of surveillance capitalism to the founding team. Randima Fernando contributed insights into the ethical implications of digital systems. Together, the founders positioned CHT to address systemic flaws in the "attention economy," drawing on empirical evidence of technology's role in eroding attention spans and social cohesion, as evidenced by rising mental health issues correlated with increases in digital media use.

Initial Launch and Time Well Spent Movement

The Time Well Spent movement emerged from Tristan Harris's efforts as a design ethicist at Google, where he developed a presentation in the early 2010s titled "A Call to Minimize Distraction & Respect Users’ Attention." This document critiqued the attention economy's reliance on maximizing user engagement at the expense of well-being and went viral within tech circles, sparking broader discussions on redesigning technology metrics to prioritize beneficial time usage over total screen time. Harris formally launched the movement in 2013, positioning it as an advocacy framework to shift industry incentives away from addictive features toward humane alternatives. The initiative influenced policy and product adjustments, including features at companies like Facebook, Apple, and Google aimed at reducing compulsive use, though its impact on systemic change remained limited. The Center for Humane Technology (CHT) was established in 2018 as a nonprofit extension of these ideas, co-founded by Harris, Aza Raskin (inventor of browser features like infinite scroll), and Randima Fernando to institutionalize advocacy against technology-induced harms. The organization formally announced its launch on February 4, 2018, via its website and social media, declaring a mission to realign digital systems with human values. Initial activities centered on amplifying Time Well Spent principles through research, public campaigns, and partnerships, targeting the "race for attention" in social media that exacerbates distraction, polarization, and mental health issues. CHT's early output included frameworks for ethical design and calls for regulatory reforms, drawing directly from Harris's prior movement to build coalitions among technologists, policymakers, and users. 
While the Time Well Spent movement had operated informally and achieved some industry acknowledgments, CHT's structured launch marked a pivot to nonprofit operations with dedicated funding and staff, enabling scaled efforts like educational resources and testimony before lawmakers on persuasive technologies. Critics have noted that both initiatives primarily reflect insider perspectives from former tech employees, potentially overlooking deeper economic drivers of the attention model, but their emphasis on empirical user data—such as studies linking notifications to dopamine-driven habits—provided an evidence-based foundation for reform proposals. By mid-2018, CHT had begun collaborating with allies to prototype "humane" alternatives, solidifying its role as the movement's primary institutional heir.

Organizational Structure and Leadership

Key Personnel

Tristan Harris, Aza Raskin, and Randima Fernando co-founded the Center for Humane Technology in 2018. Harris, a former design ethicist at Google where he developed early critiques of persuasive technology, remains a prominent figure in the organization, co-hosting its podcast Your Undivided Attention and leading public advocacy on technology's societal impacts. Raskin, an entrepreneur and ethicist with prior roles at Mozilla and as founder of the Earth Species Project, contributed foundational ideas on humane design principles, though his primary current focus is on AI ethics beyond CHT. Fernando, who served as the organization's initial executive director, shifted to specializing in AI policy briefings for governments and corporations following the nonprofit's early growth. Daniel Barcay assumed the role of executive director in subsequent years, overseeing operational strategy and expansion of CHT's initiatives in policy, media, and research. The leadership draws on a network of advisors including Tim Wu, author of The Attention Merchants, and James Williams, a former Google strategist and co-founder of the Time Well Spent movement, who provide expertise on attention economics and ethical tech design. This core group emphasizes first-hand experience from Silicon Valley to inform critiques of the attention economy, though their perspectives reflect a shared insider-outsider viewpoint shaped by direct involvement in tech product development.

Funding and Financial Overview

The Center for Humane Technology (CHT) is a 501(c)(3) nonprofit organization with its fiscal year ending June 30. For the fiscal year ending June 2024, CHT reported total revenue of $3,581,637, predominantly from contributions amounting to approximately 92% of revenue, alongside expenses of $6,329,475, net assets of $6,014,566, total assets of $6,174,703, and liabilities of $160,137. Historical financials reflect growth and fluctuation, with revenue increasing from $894,347 in fiscal year 2018 to a peak of $9,457,658 in fiscal year 2023 before declining in 2024; expenses have similarly scaled, reaching $6.33 million in 2024 amid program expansion.
Fiscal Year (Ending June)    Revenue       Expenses
2018                         $894,347      $570,886
2019                         $2,932,088    $1,934,121
2020                         $5,246,727    $3,395,097
2021                         $3,495,036    $3,027,848
2022                         $2,332,710    $3,358,006
2023                         $9,457,658    $3,319,704
2024                         $3,581,637    $6,329,475
CHT's funding derives primarily from philanthropic foundations and donor-advised funds, including the Lavin Family Foundation ($1,000,000 for science and technology initiatives) and other major gifts of $752,912 and $512,950 routed through donor-advised charitable gift funds. Notable foundation grants include a $600,000 award, a cumulative $500,000 commitment ($250,000 approved in November 2020 and increased by $250,000 in June 2022, following an earlier $100,000 in August 2018), and a $500,000 grant in 2023 for AI-related projects. Additional support comes from a range of other foundations and philanthropies, many of which align with progressive philanthropic priorities. Executive compensation for fiscal year 2024 totaled $577,394 for key personnel, including co-founder and president Tristan Harris at $249,891 (base salary plus benefits), co-founder Randima Fernando at $236,923, and advisor Aza Raskin at $210,346. CHT has also disbursed grants to aligned organizations, such as $220,000 to the Hopewell Fund for Lights on Lab and $75,000 each to the Campaign for Accountability and Mothers Against Media Addiction. Public Form 990 filings provide transparency into these figures, though detailed contributor lists for amounts under reporting thresholds remain undisclosed.

Mission, Philosophy, and Core Arguments

Definition of Humane Technology

Humane technology, as defined by the Center for Humane Technology (CHT), constitutes a paradigm shift in technology design that prioritizes alignment with human values, including psychological well-being, democratic stability, and a robust shared information ecosystem, in opposition to extractive models that prioritize metrics like user attention and engagement at the expense of societal health. This approach seeks to mitigate the "wisdom gap"—the mismatch between rapidly advancing technological capabilities and human capacities for comprehension and ethical governance, as articulated by CHT co-founder Tristan Harris drawing on biologist E. O. Wilson's observation that humanity possesses "paleolithic emotions, medieval institutions, and God-like technology." Central to this definition is a commitment to respecting human vulnerabilities, such as cognitive biases like confirmation bias, rather than exploiting them for personalization or persuasion; instead, humane technology emphasizes fostering shared understanding and internalizing irreversible externalities, such as harms to social cohesion or mental health. CHT contrasts this with dominant persuasive design practices, which amplify complexity and fragment narratives, advocating for proactive measures to reduce harms beyond mere pros-and-cons evaluations. In application to fields like artificial intelligence, CHT outlines three foundational rules proposed by Harris and co-founder Aza Raskin: (1) each new technology imposes a novel class of responsibilities that cannot be deferred, such as expanded privacy rights enabled by digital persistence; (2) technologies granting power initiate competitive races with potentially destructive outcomes, necessitating preemptive safeguards; and (3) systemic tragedies, like unchecked AI proliferation, demand cross-entity coordination to enforce collective accountability. 
These principles underscore humane technology's pro-social orientation, aiming to elicit positive human traits like empathy and cooperation while averting zero-sum dynamics inherent in uncoordinated innovation.

Critiques of Persuasive Design and Attention Economy

The Center for Humane Technology (CHT) defines persuasive design as the application of behavioral science principles to digital interfaces, enabling platforms to shape user actions through techniques like variable rewards, social reciprocity, and personalized nudges, often without users' full awareness or consent. Co-founder Tristan Harris, formerly a Google design ethicist, describes these methods as exploiting innate human vulnerabilities—such as the dopamine response to unpredictable feedback, mirroring slot machine mechanics—to drive prolonged engagement. In a 2016 essay, Harris outlined 11 specific "hijacks," including bundling apps with habitual triggers (e.g., email checks prompting social media scrolls) and fear-of-missing-out interfaces that interrupt focus with real-time updates. CHT argues that persuasive design, when unchecked, erodes personal agency by prioritizing corporate metrics like session time over human flourishing, leading to outcomes such as fragmented attention and habitual overuse. Harris has testified to the U.S. Senate that these technologies "restructure two billion people's attention, wellbeing, and behavior" by learning user data to refine manipulative prompts, potentially threatening democratic processes through amplified outrage and echo chambers. The organization links these designs to empirical harms, citing correlations between heavy platform use and rising adolescent mental health issues, though it emphasizes causal pathways via design incentives rather than mere correlation. On the attention economy, CHT critiques the underlying ad-driven model where platforms compete for finite user time, fostering a race toward ever more addictive features like infinite scrolls and algorithmic feeds optimized for retention rather than utility.
This structure, per Harris, inverts human goals—treating users as means to profit ends—resulting in societal costs including shortened attention spans (e.g., average focus dropping from 2.5 minutes in 2004 to 47 seconds by 2015 in some productivity studies referenced by advocates) and polarization via content that maximizes emotional reactivity. In congressional testimony, Harris warned that such economics amplify the spread of misinformation, as seen in events like the 2016 U.S. election where platforms prioritized virality over veracity. CHT's position holds that these dynamics are not inevitable but stem from misaligned incentives, advocating for alternatives like competition on healthy metrics (e.g., user satisfaction post-use) and tools for self-imposed limits, as prototyped in the earlier Time Well Spent initiative. Critics within CHT, including Harris, contend that without regulatory or design reforms, persuasive elements in emerging AI—such as chatbots mimicking empathy—will exacerbate these issues by scaling influence at unprecedented speeds.

Activities and Initiatives

Public Awareness and Media Efforts

The Center for Humane Technology conducts public awareness efforts through media production, partnerships, and expert appearances to highlight risks from persuasive technology designs. These initiatives emphasize educating audiences on attention economics and behavioral manipulation without endorsing unsubstantiated alarmism. A core component is the podcast Your Undivided Attention, launched by CHT and co-hosted by Tristan Harris, Aza Raskin, and Daniel Barcay, which airs episodes every other Thursday featuring interviews with technologists, policymakers, and researchers on technology's societal influences and reform strategies. The series aims to equip listeners with insights for personal and collective action against tech-induced harms like distraction and polarization. In February 2018, CHT collaborated with Common Sense Media to initiate the "Truth About Tech" campaign, targeting awareness of digital media's effects on young people's health and development. The effort involved conferences, such as the inaugural event in February 2018, and campaigns to pressure tech firms toward less intrusive designs, engaging stakeholders including former industry executives. CHT also facilitates youth-led expression via the #MySocialTruth platform, enabling young users to document personal experiences with social media and contribute to broader discourse on humane alternatives. Complementing this, CHT experts participate in high-profile media, including Tristan Harris's appearances on 60 Minutes in November 2022 discussing humane technology principles and a February 2024 appearance addressing tech's role in political discourse. These activities are supplemented by curated content from CHT's media team, focusing on key conversations about tech interventions, though specific reach metrics remain undisclosed in public reports.

Policy Advocacy and Regulatory Influence

The Center for Humane Technology (CHT) pursues policy advocacy aimed at imposing legal duties of care on technology companies, realigning corporate incentives toward user safety rather than engagement maximization, and addressing power asymmetries between tech firms and individuals. This includes providing expertise to lawmakers, supporting state-level legislation, and engaging in federal discussions to modernize regulatory frameworks for emerging technologies like AI and social media platforms. CHT positions itself as nonpartisan, collaborating across political lines to promote reforms such as enhanced whistleblower protections and liability for foreseeable harms from persuasive design practices. A primary focus of CHT's efforts has been child online safety, where it has joined coalitions like the Kids Code Coalition to advance age-appropriate design laws. By 2024, these advocacy initiatives contributed to the enactment of such measures in five states: California, Colorado, Connecticut, Maryland, and New York, requiring platforms to prioritize children's well-being in design choices. CHT actively supported the federal Kids Online Safety Act (KOSA), co-signing letters with over 50 organizations in March 2024 urging Senate action and mobilizing public calls for its passage, which the Senate approved on August 1, 2024, with bipartisan backing from 86 senators before it advanced to the House. In February 2025, CHT staffer Lizzie Irwin provided testimony to Vermont lawmakers on Senate Bill S.69, advocating for restrictions on addictive features and data practices targeting minors, which culminated in Governor Phil Scott signing the state's Age-Appropriate Design Code into law on June 13, 2025. CHT has also influenced AI-related policy by partnering with groups like the American Psychological Association in July 2025 to defend state-level AI regulations against federal preemption efforts, emphasizing protections against harms such as deepfakes and fraud. 
The organization advocates for ex ante regulatory measures, including support for the ACCESS Act to facilitate data portability and interoperability, and critiques Section 230 of the Communications Decency Act for shielding platforms from liability over user-generated harms. Additionally, CHT engages in strategic litigation to establish precedents for AI governance and has registered for domestic lobbying activities, reporting expenditures in federal disclosures as of 2025. These efforts reflect CHT's broader push against surveillance capitalism, though outcomes remain contested amid industry opposition and debates over regulatory overreach.

Research and Educational Programs

The Center for Humane Technology provides educational resources targeting professionals and younger audiences to foster awareness of technology's societal impacts. Its flagship offering, the Foundations of Humane Technology, is a free, self-paced online course comprising eight modules averaging one hour each, available in written, video, and slide formats. Aimed at individuals shaping technology—such as employees at organizations like Meta, Apple, and the United Nations—the course examines failures of prior technology paradigms and promotes humane alternatives, including value-centered design and cultural shifts within tech ecosystems. Launched prior to 2022, it has enrolled over 17,000 participants from more than 130 countries, with 99% recommending it based on feedback; the program is currently paused pending development of a second version. For young people aged 13-25, CHT's Youth Toolkit delivers interactive guides suitable for self-directed study, group activities, or classroom integration, blending technology analysis with mindfulness practices. Core topics include persuasive design tactics, the attention economy's incentives, social media's neurological effects, and strategies for working toward sustainable tech reforms. Designed for educators as facilitators but accessible to users without formal guidance, the toolkit seeks to empower participants to question platform business models and demand equitable digital environments. Complementary resources draw from peer-reviewed studies to underscore risks like heightened self-harm vulnerability among adolescents. CHT's research efforts emphasize synthesis of existing empirical data rather than primary experimentation, producing tools like the Ledger of Harms—a curated repository aggregating dozens of studies on technology-induced harms, including social media's links to developmental delays, mental health declines, and behavioral changes in young users.
For instance, it highlights findings from sources such as Nature: Scientific Reports on gender-differentiated exposure effects, where young men show increased internalizing issues from platform use. Additional outputs include AI-focused analyses citing studies on emergent risks such as deceptive and power-seeking behaviors in advanced models, alongside fact sheets for policymakers and technologists that reference verified data on persuasive interfaces. These initiatives support broader educational goals, with community forums and podcasts like Your Undivided Attention—featuring expert interviews on systemic design flaws—extending research insights to public discourse. The organization's interdisciplinary approach prioritizes actionable interventions over novel data collection, often critiquing incentive structures in tech ecosystems.

Key Projects and Outputs

The Social Dilemma Documentary

The Social Dilemma is a documentary-drama hybrid film directed by Jeff Orlowski and released on Netflix on September 9, 2020, following its premiere at the Sundance Film Festival on January 26, 2020. The production, handled by Exposure Labs with producers including Larissa Rhodes, combines interviews with former tech executives and experts alongside a fictional narrative depicting a family's struggles with social media addiction and manipulation. It examines the mechanisms of social media platforms, portraying them as engineered for maximizing user engagement through psychological targeting. The Center for Humane Technology (CHT) played a significant role through its co-founders and personnel, who provided core analysis on the competition for user attention and featured prominently as interviewees. Tristan Harris, CHT co-founder and president, serves as a central figure, drawing on his experience as a former Google design ethicist to critique platform incentives that prioritize surveillance and behavioral prediction over user well-being. Other CHT contributors include co-founder Randima Fernando, then the organization's executive director, and co-founder Aza Raskin, whose insights underscore the organization's emphasis on redesigning technology to serve human values rather than extractive business models. The film's arguments align closely with CHT's mission, highlighting how algorithms harvest personal data to exploit vulnerabilities, fostering addiction, mental health declines, social polarization, and the amplification of disinformation for profit. Harris specifically contends that platforms detect states like loneliness or depression to tailor content, effectively turning users into products in an attention economy that erodes societal cohesion. It frames these issues as stemming from persuasive design techniques, such as infinite scrolls and variable rewards, which mimic slot machines to retain engagement, rather than inherent user flaws.
The documentary concludes with advocacy for regulatory and design reforms, urging lawmakers, companies, and individuals to prioritize humane technology principles that align incentives with long-term human flourishing over short-term metrics like time spent on apps. CHT has positioned the film as a catalyst for public discourse, crediting it with sparking global conversations on social media's psychological harms and boosting awareness of their efforts to counter persuasive technologies.

AI-Focused Initiatives and The AI Dilemma

The Center for Humane Technology (CHT) has broadened its scope to address artificial intelligence (AI), identifying parallels between AI's incentive structures and those of social media, which prioritize extraction over human well-being. This expansion includes efforts to mitigate AI's potential to exacerbate societal harms in areas such as relationships, work dignity, power centralization, shared understanding, and human control. CHT advocates for AI designs, deployments, and regulations that align with human values, emphasizing public awareness, policy reforms, and alternative tech incentives to prevent rapid, untested rollouts driven by profit motives. A central element of CHT's AI work is "The AI Dilemma," a 2023 presentation and podcast episode by co-founders Tristan Harris and Aza Raskin, which warns that existing AI capabilities—such as large language models deployed in consumer products like chatbots—already threaten societal stability through inadequate guardrails and safety testing. In the March 24, 2023, podcast episode, they highlight risks from a "race to recklessness," including AI's integration into platforms like Snapchat without sufficient oversight, where commercial pressures eclipse safety research. Harris and Raskin cite a 2022 survey of AI researchers indicating that 50% estimate at least a 10% probability of human extinction due to failure to control AI systems, framing this as underscoring the urgency of misalignment between AI development and societal needs. CHT positions "The AI Dilemma" as distinct from narrow existential risk debates, focusing instead on near-term societal breakdowns from AI's persuasive and extractive features, such as amplifying polarization or eroding trust in information ecosystems. 
To counter these, the organization calls for coordinated public and media pressure on AI developers to demonstrate safety empirically, rather than relying on self-regulation, and promotes nonpartisan policy interventions to avert a "tragedy of the commons" in AI racing. Complementing these awareness efforts, CHT engages in AI policy advocacy, including commendation of the bipartisan AI Labeling for Education and Detection Act (AI LEAD Act) introduced by Senators Dick Durbin and Josh Hawley on September 30, 2025, which aims to require transparency in AI-generated content to combat deception. In a June 2025 Q&A, CHT representatives stressed building AI policies that prioritize public interest over industry capture, through systemic incentives for humane design and cross-partisan collaboration. These initiatives build on CHT's podcast series, such as follow-up episodes debunking AI myths, and public talks by Harris at events like the AI for Good Global Summit, reinforcing the need for AI to augment rather than undermine human agency.

Reception, Impact, and Criticisms

Empirical Assessments of Influence

The Center for Humane Technology (CHT) has claimed contributions to product design changes at major technology firms, including features aimed at reducing screen time and notifications at companies such as Apple, Google, and Facebook, stemming from the "Time Well Spent" movement initiated by co-founder Tristan Harris in the mid-2010s. However, independent empirical analyses verifying causal attribution to CHT's advocacy remain absent, with such developments often aligning temporally with broader industry responses to public scrutiny rather than direct organizational influence. For instance, Apple's Screen Time feature, introduced in iOS 12 on September 17, 2018, coincided with heightened discussions on addictive design but predated or paralleled similar self-initiated adjustments by platforms facing regulatory pressures in Europe. A primary vehicle for CHT's outreach, the 2020 documentary The Social Dilemma, achieved significant viewership, with producers reporting over 100 million viewers across 190 countries by November 2021. This exposure correlated with temporary spikes in public discourse on social media harms, as measured by increased media coverage and social media mentions of related terms in the weeks following its Netflix release on September 9, 2020. Yet, surveys and behavioral studies have not demonstrated sustained reductions in user screen time or engagement metrics attributable to the film; for example, global social media usage continued to rise post-release, with average daily use increasing by approximately 10-15% year-over-year through 2021 in major markets. In policy domains, CHT's advocacy has informed congressional testimonies and reports, such as Harris's appearances before U.S. Senate committees in 2018 and 2023 on technology's societal impacts. Nonetheless, no peer-reviewed or governmental evaluations have quantified CHT's role in enacted legislation, such as the EU's Digital Services Act (effective 2024) or U.S.
state-level age verification laws, which draw from wider coalitions including regulators and other NGOs rather than CHT-specific inputs. Self-assessments by CHT highlight shifts in "policy discourse," but these lack metrics like citation frequency in bills or pre/post advocacy polling data to substantiate influence beyond correlational awareness gains. Critics have noted the absence of rigorous outcome evaluations, arguing that CHT's efforts may amplify moral panic without disrupting core profit-driven incentives in the attention economy, as evidenced by persistent growth in social media ad revenues—reaching $230 billion globally in 2023 despite reform calls. Longitudinal studies on awareness campaigns akin to CHT's, such as those on media literacy, show modest short-term attitude shifts but negligible long-term behavioral changes without enforcement mechanisms. Overall, while CHT has elevated ethical concerns in tech debates, empirical evidence of transformative influence on user autonomy, corporate conduct, or regulatory outcomes remains anecdotal and unverified by third-party data.

Achievements and Positive Outcomes

The Center for Humane Technology (CHT) co-produced the 2020 documentary The Social Dilemma, which featured former tech executives discussing persuasive design's societal effects and achieved 38 million household views in its first four weeks, alongside two Emmy Awards, for Outstanding Documentary or Special and Outstanding Cinematography for a Program. This exposure contributed to broader discourse on social media's role in polarization and misinformation, with CHT reporting subsequent shifts in polls toward greater concern over tech accountability. CHT's policy advocacy has aligned with legislative advancements, including support for the Kids Online Safety Act (KOSA), which the U.S. Senate passed on August 1, 2024, imposing duties of care on platforms to mitigate harms to minors, including exploitation. The organization participated in coalitions urging its enactment, testified in related congressional hearings, and highlighted cases of youth harm to underscore the need for design reforms. Similarly, CHT commended the introduction of the AI LEAD Act in September 2025, advocating for labeling requirements on AI-generated content to enhance transparency. Early efforts by co-founder Tristan Harris, including his 2013 internal Google presentation "A Call to Minimize Distraction & Respect Users' Attention," which circulated widely within the industry, preceded features like Apple's Screen Time tools, announced on June 4, 2018, and released with iOS 12 that September, enabling user limits on app usage. CHT's resources, such as its Policy Reforms Toolkit released in the early 2020s, have informed stakeholder discussions on incentive structures, with the group testifying before the U.S. Senate Committee on Commerce, Science, and Transportation in 2018 to promote humane design principles. By 2025, CHT had expanded to AI governance, publishing analyses like "The Narrow Path" in April 2025, which outlined risk mitigation frameworks adopted in select policy briefs, while maintaining programs empowering users through guides on tech habit reduction.
These outputs have reached hundreds of millions via media amplification, per self-reported metrics, fostering incremental industry self-regulation amid persistent challenges.

Skepticism, Counterarguments, and Debunkings

Critics of the Center for Humane Technology (CHT) contend that its narrative overemphasizes the manipulative aspects of digital platforms while downplaying user agency and the tangible benefits of technology, such as the enhanced connectivity seen during the COVID-19 pandemic, when social media facilitated real-time information sharing and community support. For instance, assertions in CHT-backed projects like The Social Dilemma (2020) portraying algorithms as "digital slot machines" engineering addiction have been challenged for conflating engagement metrics with clinical addiction, since robust causal evidence linking platform design directly to substance-like dependencies is lacking. Counterarguments note that CHT's case for harms, including mental health declines among youth, often rests on correlational data rather than established causality, amid confounders such as economic pressures and pre-existing societal trends; longitudinal studies of self-reported well-being find that heavy social media use correlates with lower happiness but do not establish platforms as the primary driver, with effects varying by individual usage patterns and offline context.

Skeptics argue that this selective emphasis risks promoting paternalistic interventions that undermine personal responsibility, pointing to debates in which viewers of The Social Dilemma are cast as passive victims rather than active participants capable of self-moderation. Debunkings of specific CHT claims, such as social media's role in eroding democracy through polarization, cite insufficient causal attribution: while platforms amplify divisive content, empirical analyses show that partisan divides predate widespread social media adoption, with factors like cable news and cultural shifts playing larger roles in ideological entrenchment.
Moreover, CHT's advocacy for redesigning technology to prioritize "human values" over profit has been critiqued as philosophically naive for ignoring how economic incentives drive innovation that has democratized access to education and global discourse, and for risking stifled progress if such values are enforced through regulation without market-tested alternatives. Founders such as Tristan Harris, despite their insider credentials, face accusations of selective contrition, having themselves contributed to persuasive design during their tech tenures, which raises questions about whether the movement addresses systemic incentives or merely applies superficial ethical overlays.
