
Reliability of Wikipedia

The reliability of Wikipedia concerns the accuracy, verifiability, and neutrality of information in its articles, which are collaboratively authored and editable by volunteers under policies emphasizing cited sources and a neutral point of view. Empirical assessments, such as a 2005 Nature investigation of science entries, have found Wikipedia's serious error rate (approximately four per article) comparable to that of Encyclopædia Britannica (three per article), suggesting reasonable factual reliability in non-contentious domains with robust editing. However, reliability diminishes in ideologically sensitive topics due to documented political biases, including a left-leaning tendency to portray right-leaning figures and concepts with greater negative sentiment, as quantified in recent computational analyses of article language and structure. Studies indicate this bias exceeds that observed in Britannica, arising not primarily from overt revision wars but from the composition of the editor community and its enforcement of content norms. Wikipedia's open model also exposes it to vandalism and hoaxes—deliberate fabrications that can persist if undetected—though community tools and patrols typically revert such alterations swiftly, with analyses showing most hoaxes surviving mere hours to days before removal. These vulnerabilities, combined with uneven article quality across the site's more than six million English-language entries, highlight that while Wikipedia excels in breadth and timeliness, its dependability varies markedly by subject, demanding user caution and cross-verification, especially in controversial areas.

Fundamental Mechanisms Affecting Reliability

Editing Model and Core Policies

Wikipedia's editing model operates on an open-editing principle, allowing any internet user—registered or anonymous—to add, modify, or remove content in real time, with oversight provided through watchlists and recent-changes patrols, reversion of problematic edits, and discussion on article talk pages. This decentralized system enables swift incorporation of new information and collective error correction, as demonstrated by analyses showing that article quality typically improves through iterative edits, with many articles reaching a stable, high-quality state after sufficient revisions. However, the model is vulnerable to vandalism, hoaxes, and coordinated manipulation, as low barriers to entry permit bad-faith actors to introduce falsehoods that may persist briefly before detection, and contentious topics often devolve into prolonged edit wars where wholesale reversions hinder progress. The core content policies—neutral point of view (NPOV), verifiability, and no original research (NOR)—form the foundational guidelines for maintaining reliability. NPOV mandates that articles represent all significant viewpoints from reliable sources in proportion to their prominence, aiming to avoid advocacy or undue weight; verifiability requires all material to be attributable to sources deemed reliable by consensus, generally secondary rather than primary; and NOR bars unpublished analyses or syntheses, confining contributions to summarization of existing knowledge. These policies theoretically promote factual integrity by prioritizing evidence over opinion, with verifiability serving as a check against unsubstantiated claims. In practice, enforcement of these policies depends on volunteer consensus, which introduces variability and potential for bias, as disputes are resolved through discussion and majority sentiment rather than impartial adjudication. Empirical studies reveal inconsistent compliance, particularly with NPOV, where political articles often exhibit left-leaning slants due to selective sourcing and editor demographics favoring progressive viewpoints, undermining the policy's neutrality ideal—for example, a 2012 analysis of U.S. political entries found deviations from balance aligning more with liberal media patterns. Verifiability, while strengthening scientific and historical coverage through citation requirements, falters when "reliable" sources are drawn disproportionately from ideologically aligned institutions such as mainstream media or academia, whose systematic biases render them non-neutral in social and political domains. Overall, while the model and policies enable broad coverage and self-correction in neutral topics, their reliance on community goodwill amplifies reliability risks in polarized areas, where ideological clustering and participation imbalances distort outcomes.

Editor Demographics and Motivations

Wikipedia editors exhibit a pronounced demographic imbalance, characterized by overwhelming male dominance, with a 2011 survey of 5,073 respondents reporting 91% male participation. Subsequent Wikimedia Foundation data from the 2020s maintains this skew at approximately 87% male among contributors, alongside underrepresentation of racial minorities, such as fewer than 1% identifying as Black or African American in the U.S. editor base. Age demographics center on younger adults, with an average of 32 years in the 2011 study, while educational attainment skews high, as 61% held at least an associate or bachelor's degree. Geographically, editing activity concentrates in Western nations, with the U.S. (20%), Germany (12%), and the U.K. (6%) comprising a significant share of respondents in early surveys, reflecting limited input from the Global South. This profile persists despite diversity initiatives, contributing to coverage gaps in non-Western and minority perspectives. Motivations for editing primarily stem from intrinsic and altruistic impulses. In the 2011 survey, 69% initiated contributions to volunteer and to disseminate knowledge, a factor sustaining 71% of ongoing participation, while 60% cited enjoyment as a key driver. Peer-reviewed analyses reinforce this, identifying self-concept enhancement—such as bolstering personal identity through communal knowledge-building—as a dominant force surpassing extrinsic incentives like recognition. Other research highlights task-oriented drives, including correcting errors and exchanging information, alongside social rewards from community interaction. For specialized contributors like scientists, motivations include leveraging expertise to counter misinformation, viewing Wikipedia as a public good warranting voluntary upkeep. However, demographic homogeneity intersects with motivations in ways that foster potential bias. Direct surveys of political affiliations remain limited, but editing patterns reveal ideological clustering: contributors to politically slanted articles on U.S. topics cluster by partisan leanings, with left-leaning content drawing distinct editor networks from right-leaning content. This suggests that for some contributors, editing serves advocacy purposes, advancing preferred narratives under the guise of neutrality—a dynamic critiqued by co-founder Larry Sanger, who attributes systemic left-wing bias to editors' skewed credibility assessments favoring progressive viewpoints over conservatism or traditionalism. Such motivations, amplified by low accountability, can prioritize worldview alignment over empirical detachment, as evidenced by content analyses showing disproportionate negativity toward right-leaning figures. Wikimedia's self-reported data, while useful, may underemphasize these tensions due to institutional incentives for portraying inclusivity. Overall, while core drives emphasize altruistic knowledge sharing, the editor pool's composition incentivizes selective emphasis of facts, undermining comprehensive reliability.

Governance Structures and Dispute Resolution

Wikipedia employs a decentralized, peer-governed model in which content decisions emerge from consensus among volunteer editors, without a central editorial authority or formal hierarchy. Administrators, elected by the community through requests for adminship, hold privileges such as page protection, deletion, and user blocking to enforce policies, but their actions are subject to oversight via community review processes. This adhocratic model, blending anarchic participation with bureaucratic elements like peer review, aims to distribute authority but has evolved into informal hierarchies influenced by editor experience and tenure. Dispute resolution follows a multi-tiered escalation process: initial discussions on article talk pages, followed by informal mechanisms like third opinions or Requests for Comments (RfCs), specialized noticeboards, volunteer mediation, and ultimately the Arbitration Committee (ArbCom) for conduct violations. RfCs solicit broader input to gauge consensus on contentious issues, while ArbCom, comprising 7-15 elected arbitrators serving staggered terms, issues binding remedies in severe cases, such as topic bans or indefinite blocks. These processes prioritize the verifiability, neutrality, and no-original-research policies to maintain content integrity. Empirical analyses reveal inefficiencies in dispute resolution that undermine reliability. A study of approximately 7,000 RfCs from 2011-2017 found that about 33% remained unresolved due to vague proposals, protracted arguments, and insufficient third-party participation, allowing contentious disputes to persist without closure. Qualitative insights from frequent closers highlighted deviations from deliberative norms, such as biased framing or disinterest in niche topics, exacerbating delays and leaving potential errors unaddressed. ArbCom decisions exhibit influences from disputants' social capital, defined by edit histories and community ties, which correlates with lighter sanctions to preserve social cohesion over strict norm enforcement. Analysis of 524 cases from 2004-2020 showed a negative relationship between an editor's Wikipedia-related edit volume and sanction severity, with well-connected individuals leveraging preliminary statements for favorable outcomes, often at the expense of newcomers or outsiders. Interviews with 28 editors underscored factional dynamics and "power plays," suggesting that dispute resolution favors entrenched networks—predominantly experienced, Western editors—potentially perpetuating ideological imbalances in content control. Critics argue that inconsistent policy application and power concentration in a small administrative corps enable capture by motivated subgroups, hindering neutral resolution and allowing biases to embed in articles. While rapid consensus-building works for uncontroversial topics, high-stakes disputes risk stalemates or rulings that prioritize harmony over factual rigor, contributing to variable article quality and vulnerability to systemic skews.

Empirical Evaluations of Factual Accuracy

Comparisons with Traditional Encyclopedias

A 2005 investigation by Nature compared the accuracy of 42 science articles from Wikipedia and Encyclopædia Britannica, enlisting experts to identify factual errors, omissions, or misleading statements. The review found 162 such issues in Wikipedia entries versus 123 in Britannica, yielding averages of 3.9 and 2.9 errors per article, respectively, suggesting comparable overall reliability despite Wikipedia's collaborative model. Britannica disputed the methodology, claiming Nature inflated errors through inconsistent criteria, such as counting disputed interpretations as mistakes and overlooking Britannica's corrections, and argued that under consistent standards Wikipedia's entries contained both more numerous and more serious flaws than the published figures implied. Subsequent empirical work has reinforced similarities in raw factual accuracy while highlighting differences in systematic biases. For instance, analyses of historical and political topics, such as national histories, reveal Wikipedia entries occasionally omitting contentious events (e.g., wars) more frequently than Britannica equivalents, potentially due to editorial blind spots in crowd-sourced environments lacking centralized expertise. Traditional encyclopedias like Britannica employ paid subject-matter experts and rigorous fact-checking, minimizing factual deviations through controlled revision cycles, whereas Wikipedia's volunteer-driven process enables quicker factual corrections but exposes content to transient inaccuracies during disputes. Quantitative assessments of ideological slant further differentiate the platforms. Greenstein and Zhu's 2012–2014 studies, examining U.S. political biographies and election coverage, measured slant via linguistic markers (e.g., partisan word usage) and found Wikipedia articles exhibited 25% greater bias toward Democratic viewpoints than Britannica's, with Wikipedia's median slant score at 1.96 versus Britannica's 1.62 on a normalized scale; however, articles with higher edit volumes trended toward neutrality over time. These findings attribute Wikipedia's advantage in volume (several million articles by 2014 versus Britannica's curated ~65,000) to its scalability, but underscore traditional encyclopedias' advantage in consistent neutrality via gatekeeping, reducing vulnerability to demographic skews among contributors. Overall, while factual error rates align closely, Wikipedia's reliability lags in bias-prone domains due to its decentralized governance compared to Britannica's professional curation.
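
The dictionary-based slant measurement described above can be illustrated with a short sketch: count occurrences of partisan-coded phrases in an article's text and normalize the difference into a single score. This is a minimal illustration only; the phrase lists and scoring formula below are invented placeholders, not Greenstein and Zhu's actual dictionary or weighting.

```python
# Minimal sketch of a dictionary-based slant score, in the spirit of the
# Greenstein-Zhu approach described above. The phrase lists and the
# normalization are illustrative placeholders, not the studies' actual method.
import re

DEM_PHRASES = ["estate tax", "universal health care", "civil rights"]  # assumed examples
REP_PHRASES = ["death tax", "tax relief", "illegal immigration"]       # assumed examples

def count_phrases(text: str, phrases: list[str]) -> int:
    """Count total occurrences of each phrase in lower-cased article text."""
    text = text.lower()
    return sum(len(re.findall(re.escape(p), text)) for p in phrases)

def slant_score(article_text: str) -> float:
    """Return a score in [-1, 1]: negative means Republican-coded language dominates,
    positive means Democratic-coded language dominates, 0 means balanced or no matches."""
    dem = count_phrases(article_text, DEM_PHRASES)
    rep = count_phrases(article_text, REP_PHRASES)
    total = dem + rep
    return 0.0 if total == 0 else (dem - rep) / total

if __name__ == "__main__":
    sample = "Critics called it a death tax, while supporters framed the estate tax as civil rights policy."
    print(f"slant score: {slant_score(sample):+.2f}")
```

Averaging such scores across many articles, and tracking how they shift with edit volume, is the kind of aggregate comparison the studies above report.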

Quantitative Studies on Error Rates

A 2005 comparative study published in Nature examined the accuracy of 42 Wikipedia articles on scientific topics by having experts review them alongside corresponding Encyclopædia Britannica entries. The analysis identified 162 factual errors, omissions, or misleading statements in Wikipedia compared to 123 in Britannica, yielding an average of approximately four errors per Wikipedia article and three per Britannica article. Britannica contested the methodology, arguing that the error count was inflated by including minor issues and that their own review found Wikipedia's inaccuracies to be about a third higher when applying stricter criteria. In a 2006 evaluation of historical content, Roy Rosenzweig assessed 25 Wikipedia biographies of notable Americans against Encarta and American National Biography Online. Wikipedia achieved an 80% factual accuracy rate, lower than the 95-96% for the professional sources, with errors primarily in minor details and more frequent omissions of nuance. The study attributed this partly to Wikipedia's reliance on volunteer editors without specialized historical training, though it noted comparable rates to other encyclopedias in broad factual claims. A 2012 analysis in Public Relations Review reviewed 60 Wikipedia articles on companies, cross-checked against official filings and websites. It found factual errors in 60% of entries, including inaccuracies in founding dates, financial figures, and executive names, suggesting vulnerabilities in coverage of self-interested or commercial topics due to unverified edits. Focusing on medical content, a 2014 study by Hasty et al. in the Journal of the American Osteopathic Association compared Wikipedia articles on the 10 most costly U.S. health conditions (e.g., coronary artery disease, hypertension) to peer-reviewed literature. Nine out of 10 articles contained errors, defined as contradictions or inconsistencies with evidence-based sources, with issues in treatment recommendations, risk factors, and diagnostic criteria. The authors highlighted Wikipedia's limitations for clinical decision-making, as errors persisted despite citations to primary research.
Study | Domain | Sample Size | Error Rate in Wikipedia | Comparison
Nature (2005) | Science | 42 articles | ~4 errors/article (162 total) | Britannica: ~3 errors/article (123 total)
Rosenzweig (2006) | U.S. history biographies | 25 articles | 80% accuracy (20% error/omission) | Encarta/ANBO: 95-96% accuracy
Public Relations Review (2012) | Companies | 60 articles | 60% with factual errors | Official sources (e.g., filings)
Hasty et al. (2014) | Medical conditions | 10 articles | 90% with errors/inconsistencies | Peer-reviewed medical literature
These studies indicate error rates varying by domain, generally comparable to traditional encyclopedias in neutral scientific or historical facts but higher in specialized fields like medicine and commerce, where expert verification is sparse. Factors such as edit volume and topic contentiousness influence how long errors persist, with peer-reviewed assessments underscoring the need for caution in high-stakes applications.

Domain-Specific Accuracy Assessments

Assessments of Wikipedia's factual accuracy reveal variability across domains, with stronger performance in empirical sciences and medicine compared to humanities and politically sensitive topics. A 2005 comparative study by Nature evaluated 42 science articles, identifying 162 factual errors in Wikipedia entries versus 123 in Encyclopædia Britannica, concluding that Wikipedia's science coverage approached professional encyclopedia standards despite occasional minor inaccuracies. Similarly, the 2012 EPIC-Oxford study, involving expert evaluations of articles in English, Spanish, and Arabic across disciplines including biology and physics, found Wikipedia scoring higher on accuracy (mean 3.6 out of 5) than competitors like Citizendium in several scientific categories, though it lagged in depth for specialized subfields. In medicine and pharmacology, Wikipedia demonstrates high factual reliability, particularly for drug monographs and disease descriptions, though completeness remains a limitation. A 2014 analysis of 100 drug articles rated them 99.7%±0.2% for factual accuracy against professional references, with errors primarily in omissions rather than fabrications. A 2020 review of health-related content corroborated this, noting that 83% of articles achieved "good" or "very good" quality ratings by experts, outperforming non-medical entries due to stricter sourcing norms enforced by domain-specific volunteer editors and citation of peer-reviewed journals. However, studies highlight gaps in more nuanced clinical topics, where accuracy averaged 78% in a 2014 evaluation, often due to oversimplification of conflicting evidence. Historical articles exhibit lower accuracy, attributed to interpretive disputes and reliance on secondary sources prone to bias. A comparative analysis of historical entries reported Wikipedia's accuracy at 80%, compared to 95-96% for established reference works, with errors stemming from uncited claims or selective framing of events. The EPIC-Oxford evaluation echoed this, assigning history articles a mean accuracy score of 3.2 out of 5, below its science coverage but above popular online alternatives, due to challenges in verifying primary sources amid edit wars over contentious narratives. In politics and biographies, factual details on verifiable events and careers are generally reliable, especially for prominent figures, but prone to inconsistencies in lesser-covered topics. A 2011 study of U.S. congressional candidate biographies found Wikipedia provided accurate political-experience data in 100% of cases examined, sufficient for quantitative analysis. Brigham Young University research similarly validated its utility for political education, with error rates under 5% for election-related facts, though coverage completeness favored high-profile individuals over niche or historical politicians. These strengths derive from cross-verification by ideologically diverse editors, yet domain experts caution that factual precision does not preclude subtle distortions in the aggregation of sourced material.

Systemic Biases and Neutrality Challenges

Evidence of Ideological Left-Leaning Bias

A 2024 computational analysis by Rozado examined over 1,000 Wikipedia articles on public figures and U.S. political topics, using large language models to annotate sentiment and emotional associations. The study found that content tends to link right-of-center figures and terms with more negative sentiment and with emotions such as anger or disgust compared to left-of-center equivalents, indicating a mild to moderate left-leaning bias. Similar patterns emerged in assessments of political terminology, where right-leaning concepts received disproportionately negative framing. Earlier quantitative research by Shane Greenstein and Feng Zhu compared Wikipedia's coverage of U.S. political topics to Encyclopædia Britannica's across thousands of articles from 2008 to 2017. Their findings revealed that Wikipedia exhibited greater left-leaning slant in phrasing and emphasis, particularly on contentious issues such as civil rights, with the gap relative to Britannica documented in their 2016 and 2018 analyses. A 2012 precursor study by the same authors measured slant in 28,000 U.S. political articles via dictionary-based methods, confirming Wikipedia's entries leaned left on average, though revisions reduced but did not eliminate the disparity. Wikipedia co-founder Larry Sanger has publicly asserted since 2020 that the platform's articles reflect a systemic left-wing bias, citing examples such as the deprecation of conservative-leaning sources like the Daily Wire while permitting left-leaning outlets, and skewed portrayals of topics like socialism and gender issues. Sanger attributes this to editor demographics and enforcement of neutrality policies that favor "establishment" views, a claim echoed in analyses showing persistent ideological asymmetry in high-controversy articles despite policy guidelines. These patterns align with broader empirical observations of content divergence: for instance, articles on right-leaning politicians often emphasize controversies with greater frequency and intensity than analogous left-leaning profiles, as quantified through computational analysis of revision histories. While Wikipedia's neutral-point-of-view policy aims for balance, studies indicate it fails to fully counteract the aggregate effect of editor incentives and sourcing preferences, resulting in measurable left-leaning distortions in political coverage.
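
The aggregation step in an LLM-based audit of this kind can be sketched as follows. The annotate_sentiment helper is a stand-in for an actual model call, and the figures and scores are invented for illustration; Rozado's pipeline and prompts are not reproduced here.

```python
# Illustrative sketch of the aggregation step in an LLM-based sentiment audit:
# given per-article sentiment scores for public figures, compare average
# sentiment across ideological groups. annotate_sentiment() is a placeholder
# for a real model call; the data below are invented for demonstration.
from statistics import mean

def annotate_sentiment(article_text: str) -> float:
    """Placeholder: a real pipeline would query a language model and map its
    label (negative/neutral/positive) to a number in [-1, 1]."""
    raise NotImplementedError("stand-in for an LLM annotation call")

# Hypothetical pre-annotated results: (figure, orientation, sentiment score)
annotations = [
    ("Figure A", "right", -0.4),
    ("Figure B", "right", -0.1),
    ("Figure C", "left", 0.2),
    ("Figure D", "left", -0.1),
]

by_group: dict[str, list[float]] = {}
for _figure, orientation, score in annotations:
    by_group.setdefault(orientation, []).append(score)

gap = mean(by_group["left"]) - mean(by_group["right"])
print(f"mean sentiment, left-of-center figures:  {mean(by_group['left']):+.2f}")
print(f"mean sentiment, right-of-center figures: {mean(by_group['right']):+.2f}")
print(f"left-minus-right sentiment gap:          {gap:+.2f}")
```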

Coverage Gaps and Selection Biases

Wikipedia's article coverage reveals systematic gaps, particularly in topics aligned with the interests and demographics of its predominantly Western, male, and left-leaning editor base, resulting in underrepresentation of non-Western perspectives, female-centric subjects, and conservative viewpoints. A 2025 global analysis of over 6 million articles identified disparities tied to factors including regional wealth and political alignment, with coverage skewed toward contributors from high-income, urbanized regions and excluding events or figures from lower-wealth or peripheral areas. Similarly, a 2023 study of event articles on Wikipedia quantified regional biases, showing that events in wealthier nations receive disproportionate attention relative to their global occurrence, while those in developing regions face coverage shortfalls of up to 50% compared to population-adjusted expectations. These gaps stem from self-selection among editors, who prioritize familiar subjects, amplifying imbalances in notability assessments under Wikipedia's guidelines. Gender-related coverage exhibits pronounced selection biases, with biographies of women comprising fewer than 20% of total entries and receiving shorter treatments alongside reduced visual elements. A controlled analysis of over 1,000 biographies confirmed that articles on racial and gender minorities are systematically shorter and employ less favorable language patterns, attributing this to editor demographics in which women constitute under 20% of active contributors. In scholarly citations, publications by female authors are cited 10-15% less frequently in Wikipedia than expected based on academic impact metrics, reflecting sourcing preferences that favor male-dominated fields. Such patterns indicate not mere oversight but structural selection against topics perceived as niche by the core editing community. Ideological selection biases manifest in the deprecation of conservative-leaning sources and undercoverage of right-of-center figures or events, as mainstream media—often cited as "reliable" under sourcing guidelines—exhibits documented left-leaning tilts that influence content decisions. A 2025 report documented instances in which conservative outlets were blacklisted as unreliable, limiting their use in verifying alternative narratives and contributing to deletions or stubs on topics like U.S. conservative policy debates. In coverage of academics, entries disproportionately feature male scholars, with living female and non-U.S. researchers underrepresented by factors of 2-3 relative to their field prominence, per a 2022 assessment of over 500 academics. A 2024 causal analysis of 1,399 articles further linked these gaps to editor ideological clustering, where left-leaning majorities enforce sourcing norms that marginalize dissenting views, reducing overall neutrality in topic selection. This reliance on ideologically aligned secondary sources perpetuates exclusion, as empirical reviews show Wikipedia's political entries lag in breadth compared to balanced encyclopedic benchmarks.

Conflict-of-Interest and Advocacy Editing

Conflict-of-interest (COI) editing on Wikipedia occurs when individuals or groups edit articles to promote external interests, such as financial gains, corporate reputations, or ideological agendas, rather than adhering to neutral-point-of-view principles. This practice undermines the encyclopedia's reliability by introducing biased content that may persist due to inconsistent enforcement by volunteer moderators. Academic analyses have identified thousands of such articles; for instance, one study compiled 3,280 COI-affected entries through content-based detection methods, highlighting patterns like promotional language and self-referential sourcing. Paid editing services represent a prominent form of COI editing, often involving firms hired to enhance client pages without disclosure. In 2013, the firm Wiki-PR was exposed for using hundreds of undisclosed accounts to edit on behalf of paying clients, leading to widespread blocks after community investigations revealed systematic manipulation. Similarly, a medical device company employed staff and consultants to make favorable edits promoting kyphoplasty procedures, attempting to alter articles on related treatments despite lacking independence. More recently, as of 2025, large law firms have been documented paying undisclosed editors to excise mentions of scandals from their entries, violating transparency rules and prioritizing reputational control over factual completeness. Advocacy-driven editing further exacerbates reliability issues, particularly in politically charged topics, where coordinated groups advance preferred narratives. A 2025 investigation identified at least 30 editors collaborating on over 1 million edits across more than 10,000 articles related to Israel and the Israeli-Palestinian conflict, with activity spiking after October 7, 2023. These editors, linked to advocacy efforts like Tech for Palestine recruitment, removed citations documenting violent acts (e.g., Samir Kuntar's murder convictions) and reframed events to minimize Palestinian violence while amplifying anti-Israel framing, such as portraying Zionism as colonialist. Such coordination—evidenced by 18 times more inter-editor communications than in comparable editor groups—evades detection tools, allowing biased content to influence high-traffic pages. Enforcement challenges compound these problems, as declining volunteer numbers (e.g., a 40% drop in medical-topic editors from 2008 to 2013) limit oversight, enabling undisclosed edits to proliferate. Machine-learning approaches for early detection of undisclosed paid editors have shown promise, outperforming baselines in identifying anomalous patterns, yet widespread adoption remains limited. Consequently, COI and advocacy editing contribute to systemic distortions, where external incentives override empirical sourcing, eroding trust in Wikipedia as a verifiable reference.

Error Propagation and Correction Processes

Vandalism and Rapid Reversion Mechanisms

Vandalism refers to deliberate edits intended to disrupt or degrade content quality, including insertions of falsehoods, obscenities, or nonsensical material. The English Wikipedia encounters roughly 9,000 such malicious edits each day. These acts constitute a small but persistent fraction of overall activity, estimated at about 2% of edits in sampled periods. To counter vandalism, Wikipedia relies on rapid detection and reversion through layered mechanisms combining human oversight and automation. Human patrollers, including several thousand editors with rollback rights and roughly 1,400 administrators, monitor recent-changes feeds to identify and undo suspicious edits via tools that restore prior versions en masse. Assisted tools like Huggle and STiki queue potentially problematic edits for review using algorithms that analyze editor reputation, language patterns, and edit characteristics. Automated bots form the frontline for swift intervention, scanning edits within seconds of submission. Prominent examples include ClueBot NG, which employs neural networks trained on human-classified data to detect anomalies in edit behavior, achieving reversions in as little as 5 seconds and accumulating over 3 million such actions since 2011. These bots revert approximately one edit per minute on average and eliminate about 50% of vandalistic contributions autonomously. Edit filters, numbering around 100, preemptively block or warn on high-risk edits from new or unregistered users based on predefined rules. The combined efficacy of these systems ensures most obvious vandalism is corrected rapidly, often within minutes, contributing to reverts comprising up to 10% of total edits by 2010. Vandalism prevalence fluctuates, reaching one in six edits during off-peak hours and one in three during high-activity periods, yet evaluations of reversion confirm high precision (82.8%) and recall (84.7%) in identifying damaging changes after the fact. While effective against blatant disruptions, these mechanisms are less adept at subtle or coordinated efforts, allowing some vandalism to persist until manual review.
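
A toy version of the rule-based screening that edit filters and assisted patrolling tools perform might look like the following; the features, weights, and threshold are invented for illustration, and production systems such as ClueBot NG rely on trained models rather than hand-set rules.

```python
# Toy rule-based scorer in the spirit of the edit filters and assisted tools
# described above. Features, weights, and the threshold are invented for
# illustration; real systems (e.g., ClueBot NG) use trained models instead.
import re

PROFANITY = {"idiot", "stupid", "sucks"}  # assumed tiny word list

def score_edit(added_text: str, is_anonymous: bool, chars_removed: int) -> float:
    """Return a heuristic 0-1 suspicion score for a single edit."""
    score = 0.0
    words = set(re.findall(r"[a-z']+", added_text.lower()))
    if words & PROFANITY:
        score += 0.5                      # obvious abusive language
    if re.search(r"(.)\1{5,}", added_text):
        score += 0.2                      # long repeated-character runs ("aaaaaa")
    if added_text.isupper() and len(added_text) > 20:
        score += 0.1                      # shouting in all caps
    if is_anonymous:
        score += 0.1                      # unregistered edits are damaged more often
    if chars_removed > 2000:
        score += 0.2                      # large unexplained blanking
    return min(score, 1.0)

edit = {"added_text": "THIS ARTICLE SUCKS aaaaaaa", "is_anonymous": True, "chars_removed": 0}
s = score_edit(**edit)
print(f"suspicion score: {s:.2f} -> {'queue for review' if s >= 0.5 else 'ignore'}")
```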

Circular Referencing and Information Loops

Circular referencing occurs when Wikipedia articles cite external sources that, in turn, derive their information directly or indirectly from Wikipedia itself, forming interdependent loops that mask the absence of independent verification. This practice violates Wikipedia's verifiability policy, which requires claims to be supported by reliable, published sources rather than self-reinforcing cycles. Such loops often arise when unverified or fabricated details are added to Wikipedia, subsequently copied by secondary sources like news outlets or blogs, which then serve as citations back to the original article, creating an illusion of multiple corroborating references. The term "citogenesis," describing this process of fact generation through circular reporting, was coined by xkcd cartoonist Randall Munroe in his October 25, 2011, comic, which depicted a sequence in which a false claim enters Wikipedia, propagates to external publications, and returns as "sourced" validation. In practice, this enables the persistence of misinformation, as editors and readers perceive looped citations as evidence of independent corroboration rather than tracing them back to potentially dubious origins. For instance, niche historical or biographical claims lacking primary documentation can gain entrenched status when outlets, seeking quick references, reproduce Wikipedia content and are cited in return, amplifying errors across the information ecosystem. These loops exacerbate reliability challenges by eroding traceability to empirical or authoritative primary sources, particularly in under-sourced topics where volunteer editors prioritize apparent sourcing over provenance. Critics, including academic guides, warn that such cycles facilitate "fact-laundering," whereby low-quality or invented information acquires undue legitimacy, complicating efforts to correct or debunk it once embedded. Wikipedia acknowledges the risk through guidelines prohibiting self-citation and templates flagging circular sources, yet detection relies on vigilant oversight, which is inconsistent for obscure entries. Documented cases show that once loops form, reversion requires dismantling multiple interdependent references, often delaying accurate updates for months or years. The causal mechanism stems from Wikipedia's open-editing model intersecting with the web's copy-paste culture: initial insertions evade scrutiny due to sheer volume, secondary sources prioritize speed over verification, and the returning citation reinforces the original claim, perpetuating inaccuracies until challenged by external fact-checkers or primary documentation. This dynamic disproportionately affects obscure or evolving subjects, where source scarcity invites speculation disguised as fact, undermining the platform's claim to encyclopedic neutrality and verifiability.
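
Conceptually, citogenesis can be modeled as a cycle in a directed graph of "derives information from" relations, which suggests one way such loops could be flagged. The sketch below uses invented edges and a hypothetical article name purely for illustration.

```python
# Minimal sketch of detecting a potential citogenesis loop: model "cites" /
# "derives from" relations as a directed graph and search for a cycle that
# passes back through the Wikipedia article. The example edges are invented.
from collections import defaultdict

edges = [
    ("wikipedia:Foo", "news_site_A"),   # article cites a news story
    ("news_site_A", "wikipedia:Foo"),   # the story was itself sourced from the article
    ("wikipedia:Bar", "journal_B"),     # independent sourcing, no loop
]

graph: dict[str, list[str]] = defaultdict(list)
for src, dst in edges:
    graph[src].append(dst)

def in_cycle(start: str) -> bool:
    """Depth-first search: is there a directed path that returns to `start`?"""
    stack, seen = list(graph[start]), set()
    while stack:
        node = stack.pop()
        if node == start:
            return True
        if node not in seen:
            seen.add(node)
            stack.extend(graph[node])
    return False

for article in ("wikipedia:Foo", "wikipedia:Bar"):
    flag = "possible circular sourcing" if in_cycle(article) else "no loop found"
    print(f"{article}: {flag}")
```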

Persistence of Misinformation Despite Policies

Despite Wikipedia's core policies requiring verifiability from independent sources and adherence to a neutral point of view, misinformation has repeatedly endured for years without detection or correction. The "Bicholim Conflict," a fabricated account of an undeclared 1640–1641 war between Portuguese colonial forces and the Indian Maratha Empire, persisted as a detailed article from 2007 until its deletion in January 2013, evading scrutiny for over five years despite multiple citations to nonexistent sources. Empirical analyses of hoaxes underscore this vulnerability: a comprehensive study of 496 detected hoaxes on English Wikipedia revealed that while 90% were flagged within one hour of creation, 1% endured for more than one year, and another 1% garnered over 1,000 page views or citations in other articles, amplifying their reach before removal. Subtle fabrications, mimicking legitimate content with plausible but invented references, often bypass community patrols and revert mechanisms, as policies depend on editor vigilance rather than automated enforcement. Deliberate insertion experiments further quantify persistence risks. In a 2015 test embedding false but innocuous claims across articles, 63% of the misinformation remained uncorrected for weeks or months, highlighting delays in challenging entries lacking immediate controversy. A 2023 replication and extension of an earlier false-claims study found that while over one-third of added disinformation was reverted within hours, the majority lingered longer, particularly on low-traffic pages where policy enforcement is sporadic. Scientific misinformation exhibits similar inertia: a September 2025 investigation of citations to retracted papers showed that invalid research continues to be referenced in articles long after retraction, with incomplete updates failing to mitigate propagation to readers. This persistence stems from the decentralized editing model, in which addressing policy violations requires consensus among volunteers, often delaying action on entrenched or unchallenged content until external verification intervenes.

Notable Incidents and Case Studies

High-Profile Hoaxes and Fabricated Content

One of the most elaborate fabrications on Wikipedia was the "Bicholim conflict," an invented article detailing a supposed 17th-century war between Portuguese colonial forces and the Maratha Empire in the Bicholim and Satari regions of Goa, India. Created on November 2, 2007, by an anonymous editor, the 4,500-word entry described fictional battles, diplomatic maneuvers, and a fabricated peace treaty, complete with citations to non-existent historical sources. It evaded detection for over five years until December 20, 2012, when an editor identified inconsistencies in the referenced materials, leading to its deletion on January 4, 2013. The hoax ranked among Wikipedia's longer-running deceptions, highlighting how detailed fabrications can persist amid limited expert oversight of obscure topics. Another record-setting hoax, "Jar'Edo Wens," claimed to describe an ancient Australian Aboriginal deity of earthly knowledge and physical might. Added on May 29, 2005, by an anonymous editor from an IP address—later identified as "John666"—the article survived nine years, nine months, and five days before deletion on March 3, 2015, after an editor questioned its legitimacy on a community noticeboard. It achieved apparent credibility through mutual citations with other entries on Aboriginal mythology, such as "Dilga," and through edits by unwitting contributors, exemplifying circular reinforcement where fabricated content bolsters itself. The perpetrator, a serial hoaxer, had inserted similar fabrications elsewhere, underscoring systemic challenges in verifying culturally niche claims without expert review. In a case of rapid misinformation spread, Irish university student Shane Fitzgerald inserted a fabricated quotation into the Wikipedia page of film composer Maurice Jarre within hours of Jarre's death in March 2009; the invented line was reproduced in obituaries by newspapers worldwide, including The Guardian, before Fitzgerald revealed the experiment weeks later as a test of media verification practices. This incident demonstrated Wikipedia's potential as an unvetted source for secondary dissemination, with outlets failing to independently confirm the quote despite the absence of any prior published attribution. These hoaxes reveal persistent vulnerabilities in Wikipedia's model, where anonymous edits on under-monitored subjects can endure through superficial plausibility and lack of contradictory evidence, even as policies mandate reliable sourcing. In 2021, investigations uncovered a long-running effort by a single editor who had fabricated numerous articles on invented historical events, further illustrating how state-influenced or prank-driven deceptions exploit low-traffic pages. Such cases, often uncovered only by vigilant users rather than proactive checks, call into question the encyclopedia's safeguards against deliberate invention.

Political and Ideological Editing Scandals

In 2006, an investigation tracing edits to official IP addresses revealed that staffers from congressional offices had made over a thousand edits to Wikipedia articles, often to remove embarrassing details or insert favorable information about politicians. For instance, edits from Senator Joe Biden's office altered his biography to downplay a plagiarism controversy involving his 1988 presidential campaign speeches. Similar changes were traced to the offices of other members of Congress, including efforts to delete references to scandals or ethical issues, prompting administrators to block edits from certain government IPs and sparking debates over conflict-of-interest policies. By 2014, persistent disruptive edits from US House of Representatives computers—totaling hundreds annually and focusing on political topics—led to a formal ban on editing from those IPs, as administrators deemed them violations of neutrality guidelines. Analysis showed patterns of whitewashing controversies, such as softening criticisms of lawmakers' voting records or ethics issues, highlighting how institutional access enabled ideological or self-serving manipulation despite Wikipedia's volunteer oversight. More recent scandals involve coordinated ideological campaigns by clusters of editors. A 2025 Anti-Defamation League report documented at least 30 Wikipedia editors collaborating over years to insert anti-Israel narratives into articles on the Israeli-Palestinian conflict, circumventing policies by using sockpuppet accounts and selectively citing sources to amplify biased framing while suppressing counterviews. This included systematic downgrading of pro-Israel perspectives as "propaganda" and elevation of contentious claims without balanced sourcing, illustrating how small, ideologically aligned groups can dominate contentious topics. Wikipedia co-founder Larry Sanger has publicly attributed such incidents to systemic left-leaning bias among editors, arguing in 2024-2025 statements that the platform's reliance on a self-selecting volunteer base—predominantly holding progressive views on political, social, and cultural issues—fosters "capture" by ideologues who enforce viewpoints through blacklists of conservative sources and reverts of edits challenging dominant narratives. A 2024 Manhattan Institute study empirically supported this, finding Wikipedia articles on political topics more likely to incorporate Democratic-aligned language (e.g., "civil rights") over Republican-aligned equivalents, with conservative topics showing higher rates of negative framing based on citation patterns. These cases underscore vulnerabilities in Wikipedia's decentralized model, where ideological editing scandals erode claims of neutrality without robust external verification.

Scientific and Medical Misinformation Events

In a 2014 analysis published in the Journal of the American Osteopathic Association, researchers compared Wikipedia entries on the ten most costly medical conditions in the United States—such as ischemic heart disease, hypertension, and diabetes—with information from peer-reviewed literature and a clinical decision support resource. The study identified factual errors or omissions in 90% of the articles, including misleading statements on diagnosis (e.g., implying hypertension could be diagnosed from a single elevated blood-pressure reading without specifying confirmatory protocols) and treatment (e.g., incomplete or inaccurate guidance on managing chronic conditions). These discrepancies arose from reliance on secondary sources, outdated data, or unsourced edits, leading the authors to recommend caution in using Wikipedia for medical information, particularly by patients and trainees. A 2025 study on Wikipedia's handling of retracted scientific papers revealed persistent citation of invalid research, with 71.6% of 1,181 analyzed instances deemed problematic: many citations were added before retraction but not removed afterward, while others were introduced post-retraction without noting the invalidation. For example, approximately 50% of retracted papers cited in articles lacked any indication of their retraction status, allowing flawed scientific claims—such as fabricated data in biomedical studies—to propagate despite Wikipedia's verifiability policies and tools like Citation Bot. This issue spans fields from pharmacology to genetics, where retracted papers on topics from drug efficacy to genetic mechanisms continued influencing article content years later, highlighting gaps in editor vigilance and automated detection. The analysis, drawing on retraction databases and edit histories, underscored how collaborative editing fails to systematically purge discredited findings, potentially misleading readers on empirical validity. Chemical inaccuracies provide another documented case of enduring scientific errors. A 2017 letter in the Journal of Chemical Education detailed multiple instances of glaring structural errors in Wikipedia chemistry articles, such as incorrect depictions of molecular bonds and functional groups in entries for compounds including certain neurotransmitters, which persisted for years despite reports to editors. One example involved a misdrawn Kekulé structure violating basic valence rules, while another featured erroneous structural formulas in related pages; these flaws remained uncorrected even after direct notifications, attributed to editor inexperience in specialized domains and resistance to non-consensus changes. Such errors, often sourced from unreliable images or unverified uploads, undermine Wikipedia's utility for scientific education and research reference. During the COVID-19 pandemic, Wikipedia's coverage of the lab-leak hypothesis exemplified delayed correction of scientific narratives amid ideological editing pressures. Until mid-2021, the platform's articles frequently framed the hypothesis—supported by U.S. intelligence assessments and virological analyses of the furin cleavage site—as a "conspiracy theory" or "debunked," citing early consensus statements in outlets like The Lancet while downplaying counter-evidence concerning research at the Wuhan Institute of Virology. Editing wars, documented through talk-page disputes and revert logs, involved blocks of pro-lab-leak edits as "misinformation," and the hypothesis was reclassified as a viable origin scenario only after FBI and Department of Energy assessments became public in 2023.
This persistence reflected broader challenges in neutral sourcing for contentious scientific questions, where reliance on mainstream outlets—often aligned with natural-origin advocacy—delayed updates despite accumulating empirical indicators such as the outbreak's proximity to high-risk laboratories and the deletion of early viral sequence databases.

Expert and Institutional Perspectives

Academic and Research Evaluations

A 2005 comparative analysis published in Nature examined 42 science articles from Wikipedia and Encyclopædia Britannica, finding that Wikipedia contained on average four serious errors and omissions per article, compared to three in Britannica, leading to the conclusion that Wikipedia approached Britannica's accuracy in scientific entries. However, Britannica contested the methodology, arguing that the study's expert reviewers were not blinded to article sources, potentially introducing bias, and that Wikipedia had 162 factual errors versus Britannica's 123 across the reviewed content. Subsequent pilot studies, such as a 2012 multilingual comparison, echoed similar findings of parity in select topics but highlighted variability by language edition and subject depth. In medical and health domains, evaluations have yielded mixed results; a 2014 review of Wikipedia's coverage of mental disorders found it comparable to professional sources in accuracy but often lacking in completeness and clinical nuance. A 2011 assessment of pharmacological articles reported high factual overlap with textbooks, yet a broader 2016 analysis of medication information revealed gaps in completeness and readability relative to official guides. These inconsistencies underscore that while Wikipedia performs adequately on verifiable scientific facts, it frequently underperforms in synthesizing complex, evidence-based recommendations, with accuracy rates varying from 70-90% depending on the metric and topic. Research on ideological bias has identified systematic left-leaning slants, particularly in political and biographical content; a 2012 econometric analysis of over 28,000 articles developed a slant index based on partisan phrasing, revealing a leftward lean stronger than that observed in Britannica. More recent computational analyses, including a 2024 examination of sentiment associations in articles on public figures, found Wikipedia more likely to link right-of-center terms and individuals with negative connotations, with effect sizes indicating mild to moderate asymmetry not fully mitigated by editor diversity. Field experiments, such as a Yale University study randomizing edits to political stubs, confirmed that crowd-sourced contributions exhibit detectable biases mirroring contributors' ideologies, persisting despite reversion policies. These findings suggest that while factual reliability holds in neutral domains, interpretive neutrality falters under open editing, influenced by editor demographics skewed toward progressive viewpoints. Overall, academic consensus acknowledges Wikipedia's utility for broad overviews and as a starting point for research, with error rates often comparable to traditional encyclopedias in scientific fields, but cautions against reliance in contentious or specialized areas due to bias propagation and incomplete sourcing. Longitudinal metrics from multilingual assessments further indicate that reliability correlates positively with edit volume and citation density, yet systemic underrepresentation of conservative perspectives raises questions about the causal mechanisms shaping content.

Journalistic Reliance and Internal Critiques

Journalists frequently consult Wikipedia for background information and quick reference during reporting, despite guidelines from news organizations advising against direct citation due to its editable nature and potential for transient errors. This reliance can propagate inaccuracies when unverified content from the encyclopedia is incorporated into news articles without independent fact-checking. A notable 2009 experiment by Irish student Shane Fitzgerald illustrated this vulnerability: within hours of composer Maurice Jarre's death, he inserted a fabricated quotation into Jarre's Wikipedia entry; the invented line was reproduced without verification in obituaries by newspapers worldwide, including The Guardian, which acknowledged the lapse after Fitzgerald disclosed the hoax weeks later. Such incidents underscore how Wikipedia's open-editing model, while enabling rapid updates, exposes journalistic workflows to risks of "citation pollution," in which media reports citing erroneous Wikipedia content create circular validation loops. Internal critiques of Wikipedia's reliability have emerged prominently from within its founding and editing community, highlighting systemic issues in editorial control and bias mitigation. Larry Sanger, who co-founded Wikipedia in 2001 alongside Jimmy Wales, has been a leading voice, arguing since his departure in 2002 that the platform's volunteer-driven model has devolved into ideological capture by anonymous activists prioritizing agendas over neutrality. In a May 2020 essay, Sanger detailed how Wikipedia exhibits "serious bias problems" on politically charged topics, such as conservative figures and events, where sourced facts are downplayed or omitted in favor of narratives aligned with the left-leaning viewpoints prevalent among editors. By September 2025, in an op-ed for The Free Press, he proposed reforms including stricter expert verification and reduced anonymity to restore reliability, claiming the site's current state renders it untrustworthy for contentious subjects due to unchecked manipulation by a small cadre of ideologically motivated contributors. These concerns align with broader internal acknowledgments of uneven enforcement of neutral-point-of-view policies, as evidenced by Sanger's observation that Wikipedia's reliance on secondary sources from biased institutions amplifies distortions rather than correcting them through first-hand scrutiny. While Sanger's critiques, informed by his foundational role, emphasize causal failures in Wikipedia's decentralized governance—such as the dominance of unaccountable editors over credentialed experts—defenders within the community often counter that aggregate editing corrects errors over time, though empirical cases like prolonged hoaxes suggest otherwise. This internal debate reflects deeper tensions between Wikipedia's aspirational openness and the practical realities of human biases influencing content persistence, with Sanger attributing much of the degradation to a shift from collaborative knowledge-building to factional control after the site's post-2000s expansion. Courts in the United States have cited Wikipedia in over 400 judicial opinions, sometimes taking judicial notice of its content or basing legal reasoning on it.
A 2022 study by researchers at MIT and Maynooth University analyzed the impact of Wikipedia articles on judicial behavior, finding that the creation of a Wikipedia entry on a specific court decision increased its citations in subsequent court opinions by more than 20 percent, suggesting the encyclopedia shapes judges' awareness and application of precedents. This influence persisted even after controlling for other factors, with a randomized control trial confirming that exposure to Wikipedia articles affects judicial decision-making in experimental settings. Despite such usage, numerous courts have explicitly criticized Wikipedia's reliability for legal purposes, emphasizing its editable nature and potential for anonymous alterations. In a 2008 Texas appellate decision, the court deemed Wikipedia entries "inherently unreliable" because they can be modified by anyone without verification, rejecting their use as evidence. The Texas Supreme Court in 2017 similarly disfavored reliance on Wikipedia, advising against it in formal legal analysis due to risks of inaccuracy. Federal courts have issued parallel warnings, with some circuits holding it to be an unreliable source and cautioning against giving it evidentiary weight. In the United Kingdom, a 2023 analysis highlighted concerns that senior judges' frequent reference to Wikipedia could propagate erroneous information, potentially undermining judgment quality amid unverified edits. Policy contexts reflect similar skepticism; for instance, many academic and professional guidelines prohibit citing Wikipedia in formal submissions, viewing it as insufficiently authoritative for policy formulation or regulatory reliance. Government entities have occasionally monitored or sought to influence Wikipedia editing rather than adopting it as a policy tool, underscoring doubts about its stability for official use. Overall, while Wikipedia permeates informal judicial research, explicit policy discourages its standalone role in binding decisions to mitigate risks of factual distortion.

Views from Traditional Encyclopedia Producers

Robert McHenry, former editor-in-chief of Encyclopædia Britannica, critiqued Wikipedia in a 2004 essay titled "The Faith-Based Encyclopedia," arguing that its reliance on anonymous volunteer editors without verifiable expertise fosters a system akin to communal faith rather than scholarly accountability, where errors persist due to the absence of identifiable authorship and pre-publication review. He illustrated this by examining Wikipedia's article on a historical figure, noting multiple factual inaccuracies and speculative content that remained uncorrected despite the platform's purported self-correcting mechanism. Encyclopædia Britannica Inc. further challenged Wikipedia's reliability in a 2006 response to the Nature study that purported to find comparable error rates in science articles between the two (3.9 errors per Wikipedia article versus 2.9 for Britannica). The company deemed the study "fatally flawed," citing methodological issues such as undisclosed reviewer identities, inconsistent error classification (e.g., counting reviewer misinterpretations as encyclopedia errors), and selective article sampling that overlooked Britannica's broader editorial standards, which include commissioning named experts and rigorous fact-checking. Britannica maintained that its professional processes ensure higher factual precision and depth, contrasting them with Wikipedia's vulnerability to vandalism, bias from unvetted contributors, and incomplete sourcing. These views underscore a core contention from traditional producers: encyclopedic reliability demands hierarchical expertise and gatekeeping, absent in Wikipedia's decentralized model, which prioritizes volume and timeliness over sustained accuracy. While acknowledging Wikipedia's utility for broad overviews, such critiques emphasize its inadequacy as an authoritative reference, particularly for complex or contentious topics where anonymous edits can propagate without institutional recourse.

Tools, Metrics, and Recent Developments

Internal and External Reliability Tools

Wikipedia maintains internal tools primarily designed to detect vandalism, assess edit quality, and support editorial workflows that bolster article reliability. The Objective Revision Evaluation Service (ORES), launched in 2015, employs machine learning models to score revisions for potential damage and evaluate overall article quality, enabling editors to prioritize problematic edits. These models are trained on human-labeled data from tools like Wiki Labels, achieving high precision in identifying revertable edits across languages. Automated bots complement ORES by rapidly reverting vandalism; for instance, systems using statistical language models and metadata features detect subtle disruptions like sneaky vandalism, reducing response times compared to manual patrols. Internal quality assessment frameworks, such as those rating articles from stub to featured class, further guide improvements by evaluating factual completeness and sourcing, though these rely on community consensus rather than automated metrics alone. Externally, third-party tools like WikiTrust provide independent reliability indicators by analyzing revision history and author reputation to color-code text based on trustworthiness. Introduced in the late 2000s, WikiTrust highlights words from anonymous or low-reputation contributors in orange, with the highlighting fading as content persists, aiming to alert readers to potentially unreliable passages without altering Wikipedia's core editing process. Evaluations of WikiTrust demonstrated its utility in surfacing vandalism-prone revisions, though adoption waned as it required dedicated browser extensions. Recent external efforts include datasets like Wiki-Reliability for training models on content accuracy, facilitating broader research into the propagation of errors.
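
The survival-based reputation idea behind WikiTrust can be illustrated with a simplified sketch: an author's score is the fraction of their contributed text that survives later revisions, and recent additions by low-scoring authors would be highlighted. The revision data and threshold below are invented, and the actual WikiTrust algorithm is substantially more elaborate.

```python
# Simplified illustration of the survival-based reputation idea behind
# WikiTrust: an author's reputation rises with the fraction of their added
# text that survives subsequent revisions. Revision data here are invented,
# and the real algorithm is considerably more sophisticated.
revisions = [
    # (author, words_added, words_still_present_after_later_revisions)
    ("alice", 400, 380),
    ("alice", 120, 115),
    ("anon_192.0.2.7", 60, 5),    # mostly reverted contributions
    ("bob", 200, 150),
]

def reputations(revs):
    added, surviving = {}, {}
    for author, w_added, w_surviving in revs:
        added[author] = added.get(author, 0) + w_added
        surviving[author] = surviving.get(author, 0) + w_surviving
    # Reputation = survival fraction of everything the author has contributed.
    return {a: surviving[a] / added[a] for a in added}

for author, rep in sorted(reputations(revisions).items(), key=lambda kv: -kv[1]):
    flag = "low-trust (would be highlighted)" if rep < 0.5 else "established"
    print(f"{author:<16} reputation={rep:.2f}  {flag}")
```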

Quantitative Metrics for Article Quality

Wikipedia maintains an internal content assessment system that categorizes articles into quality classes ranging from stub (minimal content) to featured article (highest standard, requiring comprehensive sourcing, neutrality, and polished prose). This system, applied by volunteer editors, provides a quantitative distribution metric: as of October 2023, the English Wikipedia's approximately 6.7 million articles included roughly 6,800 featured articles and 35,000 good articles, representing less than 0.1% and about 0.5% of the total, respectively, while over 80% were stub- or start-class with limited depth and verification. Featured articles demonstrate measurably higher stability, maintaining high-quality content for 86% of their lifecycle compared to 74% for non-featured articles, as measured by edit reversion rates and content persistence in a statistical analysis. Expert-reviewed studies yield error-rate metrics that often reveal variability by topic. A 2005 blind peer review by Nature of 42 science articles identified 162 factual errors, omissions, or misleading statements in Wikipedia entries (averaging 3.9 per article) versus 123 in Encyclopædia Britannica (averaging 2.9 per article), indicating comparable but slightly higher error density in Wikipedia. Britannica disputed the findings, claiming that methodological flaws, such as selective error counting, skewed the comparison in Wikipedia's favor. Subsequent domain-specific assessments show higher precision in technical fields; for instance, a 2014 evaluation of pharmacology articles found Wikipedia's drug information accurate in 99.7% ± 0.2% of cases against expert consensus. Automated predictive models offer scalable metrics for quality estimation. The Objective Revision Evaluation Service (ORES), deployed by the Wikimedia Foundation, uses machine learning to classify articles into six quality tiers, achieving up to 64% accuracy in multi-class prediction and an error of 0.09 in regression-based scoring on held-out datasets. Systematic reviews of such models indicate classifiers reach 51-60% accuracy using features like reference count, edit history, and structural elements, though performance drops for lower classes like stubs due to sparse data. These metrics correlate positively with manual assessments: articles with more references and edits (e.g., over 100 revisions) are 2-3 times more likely to reach B-class or higher, per lifecycle analyses.
Metric Type | Example Value | Domain/Notes | Source
Featured Article Proportion | <0.1% of total | English Wikipedia, 2023 | Internal content assessment statistics
Error Rate (Errors per Article) | 3.9 (Wikipedia) vs. 2.9 (Britannica) | Science topics, 2005 | Nature expert comparison
Accuracy in Specialized Topics | 99.7% ± 0.2% | Drug information, 2014 | Pharmacology evaluation
ML Prediction Accuracy | 64% (6-class) | Article quality models, 2023 | ORES quality model evaluation
High-Quality Lifetime | 86% (featured) vs. 74% (non-featured) | Edit stability, 2010 | Statistical lifecycle analysis
These metrics underscore that while aggregate quality improves with editor investment, the preponderance of low-assessed articles, predominantly stubs, limits overall reliability, as confirmed by correlations between class ratings and verifiability scores in large-scale datasets.
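To make the feature-based approach behind these quality models concrete, here is a minimal, illustrative sketch of a classifier trained on the kinds of signals mentioned above (reference count, revision count, structural elements). This is not the actual ORES model; the feature choices and the tiny synthetic training set are invented solely for demonstration.

```python
# Illustrative sketch of a feature-based article-quality classifier.
# Features and training rows are synthetic placeholders, not real Wikipedia data.
from sklearn.ensemble import RandomForestClassifier

# Each row: [reference_count, revision_count, section_count]
X_train = [
    [0, 5, 1],      # stub-like article
    [2, 30, 3],     # start-class-like article
    [15, 120, 8],   # B-class-like article
    [60, 400, 14],  # featured-like article
]
y_train = ["Stub", "Start", "B", "FA"]

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Predict the class of a hypothetical article with 25 references,
# 150 revisions, and 10 sections.
print(clf.predict([[25, 150, 10]]))
```

Real systems train on thousands of human-assessed articles and many more features, which is why accuracy figures like the 51-64% reported above remain modest for the sparsely documented lower classes.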

Reform Efforts and Ongoing Criticisms (2020-2025)

In response to persistent concerns over content reliability and bias, the Wikimedia Foundation advanced several initiatives during the 2020-2025 period as part of its broader Movement Strategy, finalized in 2020, which emphasized structural changes for sustainability, inclusion, and equitable decision-making to indirectly bolster editorial quality. These included investments in tools for tracking information provenance and knowledge integrity, such as projects to curate information sources more efficiently and combat misinformation through community-driven processes. Additionally, the WikiCred program, ongoing since earlier years but intensified in this timeframe, hosted events like WikiCredCon 2025, which targeted improvements in the handling of reliable sources amid rising threats like editor harassment and doxxing attempts that undermine neutral editing.

Despite these measures, criticisms of Wikipedia's reliability intensified, particularly regarding ideological imbalances in article framing. Co-founder Larry Sanger, who has highlighted systemic left-leaning bias since a 2020 analysis citing uneven sourcing and editorial suppression of conservative viewpoints, reiterated in 2025 that anonymous editing and administrative overreach perpetuate censorship of dissenting perspectives, including on topics like COVID-19 origins and political figures. A 2024 Manhattan Institute study analyzing over 1,500 Wikipedia articles via computational sentiment analysis found statistically significant negative sentiment toward right-leaning terms and figures compared to left-leaning equivalents, attributing this to editor demographics skewed toward urban, progressive contributors.

External scrutiny escalated in 2025 with a U.S. House Oversight Committee investigation launched by Republicans in August of that year, probing allegations of coordinated manipulation by foreign actors and U.S. academics to insert anti-Israel and pro-Russia content, and demanding details on Wikipedia's detection of such "bad actors" and on editor identities. Critics, including Sanger and outlets documenting these issues, argued that Wikipedia's reliable-source policy favors establishment media, often critiqued for left-wing tilts, while marginalizing primary sources or data, exacerbating reliability gaps in politically charged topics. Concurrently, active editor numbers stagnated around 130,000 monthly contributors by mid-decade, with core experienced editors declining due to burnout and competition from AI tools like ChatGPT diverting research traffic, further straining quality oversight. These developments underscored unresolved tensions between Wikipedia's open model and demands for rigorous, unbiased curation.

References

  1. [1]
    Internet encyclopaedias go head to head - Nature
    Dec 14, 2005 · Jimmy Wales' Wikipedia comes close to Britannica in terms of the accuracy of its science entries, a Nature investigation finds.
  2. [2]
    [PDF] Is Wikipedia Politically Biased? | Manhattan Institute
    Jun 1, 2024 · Our results suggest that Wikipedia's NPOV policy is not achieving its stated goal of political-viewpoint neutrality in Wikipedia articles. • ...
  3. [3]
    Is Wikipedia Biased? - American Economic Association
    In its earliest years, Wikipedia's political entries lean Democrat on average. The slant diminishes during Wikipedia's decade of experience. This change does ...
  4. [4]
    [PDF] Impact, Characteristics, and Detection of Wikipedia Hoaxes
    A Wikipedia hoax is a deliberately fabricated falsehood made to masquerade as truth, often about nonexistent entities or events.
  5. [5]
    Comparison of Wikipedia and other encyclopedias for accuracy ...
Aug 6, 2025 · Overall, Wikipedia's accuracy rate was 80 percent compared with 95‐96 percent accuracy within the other sources. This study does support the ...
  6. [6]
    [PDF] Statistical Measure of the Effectiveness of the Open Editing Model of ...
    The goal of our study is to analyze the evolution of content in Wikipedia articles over time and estimate the fraction of time that articles are in high–quality ...
  7. [7]
    A simple model of edit activity in Wikipedia - ScienceDirect.com
    The model characterizes editors by edit ability and maintenance tendency, and articles by their potential and current qualities. It models content and ...
  8. [8]
    Wikipedia and the Meaning of Truth | MIT Technology Review
    Oct 20, 2008 · To be fair, Wikipedia's verifiability policy states that “articles should rely on reliable, third-party published sources” that themselves ...
  9. [9]
    Is Wikipedia Politically Biased? - Manhattan Institute
    Jun 20, 2024 · Wikipedia's neutral point of view (NPOV) policy aims for articles in Wikipedia to be written in an impartial and unbiased tone. Our results ...
  10. [10]
    Wikipedia's Neutrality: Myth or Reality? - City Journal
    Jun 24, 2024 · Central to that purpose is the site's neutral point of view (NPOV) ... ” My new computational analysis of Wikipedia's content, however ...
  11. [11]
    [PDF] Wikipedia Editors Study: Results From The Editor Survey, April 2011
    1. Demographic characteristics of Editors: Since we haven't had good demographic data about Wikipedia editors, a caricatured profile of Wikipedia editors has ...
  12. [12]
    Change the Stats - Wikimedia Foundation
Wikimedia contributors are 87% male. · Fewer than 1% of Wikipedia's editor base in the U.S. identify as Black or African American.
  13. [13]
    Demographic disparity in Wikipedia coverage: a global perspective
    Feb 21, 2025 · We study global gender and citizenship disparities in Wikipedia coverage. We measure global coverage in several ways, including the number of languages in ...
  14. [14]
    Look It Up: Humanities Students are Filling Wikipedia's Content Gaps
Nov 12, 2024 · There's something like 1,400 people who edit the Wikipedia page for hip hop a month, which is to say more people have seen her scholarly work ...
  15. [15]
    Motivations of Wikipedia content contributors - ScienceDirect.com
    Motivations of Wikipedia content contributors ... The aim of this study was to evaluate how motivation affects individual knowledge sharing behavior in Wikipedia.
  16. [16]
    Motivations of contributors to Wikipedia - ACM Digital Library
    This paper aims to explain why people are motivated to contribute to the Wikipedia project. A comprehensive analysis of the motivations of Wikipedians is ...
  17. [17]
    Why these scientists devote time to editing and updating Wikipedia
    Feb 19, 2025 · Researchers can recognize a reliable source of information, making them ideal contributors to the free global online encyclopaedia.
  18. [18]
    [PDF] Ideological Segregation among Online Collaborators: Evidence from ...
    We analyze the contributors of biased and slanted content in Wikipedia articles about U.S. politics, and focus on two research questions: (1) Do contributors ...
  19. [19]
    Chairman Cruz Sounds Alarm Over Left-Wing Ideological Bias on ...
Oct 7, 2025 · ' As Sanger tells it, most Wikipedia editors' 'notion of what is credible' biases them against 'conservatism, traditional religiosity, and ...
  20. [20]
  21. [21]
    (PDF) Adhocratic Governance in the Internet Age: A Case of Wikipedia
Aug 6, 2025 · As the evidence will show, while Wikipedia's governance shows elements common to many traditional governance models, it appears to be closest to ...
  22. [22]
    Reflections on the English Wikipedia Arbitration Committee
    Jun 5, 2013 · The Arbitration Committee (ArbCom) on the English Wikipedia is a group of editors that rules on disputes between editors related to user ...
  23. [23]
    Why some Wikipedia disputes go unresolved | MIT News
    Nov 6, 2018 · The study sheds light on how some RfC processes “deviate from established norms, leading to inefficiencies and biases,” says Dario Taraborelli, ...
  24. [24]
    How Social Capital Affects the Arbitration of Disputes on Wikipedia
    May 4, 2023 · This article examines how social capital affects the resolution of disputes by focusing on English Wikipedia's Arbitration Committee, ...
  25. [25]
    [PDF] Fatally Flawed - Britannica
    Mar 7, 2006 · Britannica refutes Nature's study, claiming the investigation was flawed and misleading, and that Britannica was more accurate than Wikipedia.
  26. [26]
    (Don't) Mention the War: A Comparison of Wikipedia and Britannica ...
    Both datasets show very low expected error rates (0.01 per dataset). For Wikipedia, we estimate the highest error rate in the 11th century (.24), since a ...
  27. [27]
    Wikipedia Or Encyclopædia Britannica: Which Has More Bias?
Jan 20, 2015 · “There has been lots of research on the accuracy of Wikipedia, and the results are mixed—some studies show it is just as good as the experts ...
  28. [28]
    Do Experts Or Crowd-Based Models Produce More Bias? Evidence ...
    We find that Wikipedia articles are more slanted toward Democratic views than are Britannica articles, as well as more biased. The difference in bias between a ...
  29. [29]
    [PDF] Do Experts or Crowd-Based Models Produce More Bias? Evidence ...
    Greenstein and Zhu (2016) examine the evolution of political articles at Wikipedia over ten years to answer questions such as whether the Internet has a ...
  30. [30]
    Can History be Open Source? Wikipedia and the Future of the Past
    This article seeks to answer some basic questions about history on Wikipedia. How did it develop? How does it work? How good is the historical writing?
  31. [31]
    Wikipedia vs peer-reviewed medical literature for information about ...
Most Wikipedia articles representing the 10 most costly medical conditions in the United States contain many errors when checked against standard ...
  32. [32]
    [PDF] Assessing the accuracy and quality of Wikipedia entries compared ...
Introduction. Previous studies, most notably the one carried out by the journal Nature in 2005, have sought to compare the quality of Wikipedia articles ...
  33. [33]
    Accuracy and completeness of drug information in Wikipedia - NIH
    Wikipedia's drug information is generally accurate, but often incomplete, lacking depth, and should not be the sole source of information.
  34. [34]
    Situating Wikipedia as a health information resource in various ...
    Feb 18, 2020 · A more recent investigation finds that Wikipedia's medical content is now consistently superior to its non-medical content with 83% of its ...
  35. [35]
    (PDF) How Accurate Are Wikipedia Articles in Health, Nutrition, and ...
    Aug 9, 2025 · Previous studies of Wikipedia have reported mixed results regarding the quality of information on health-related topics.
  36. [36]
    View of The visibility of Wikipedia in scholarly publications
    A comparison of Wikipedia and other encyclopedias in historical entries revealed that Wikipedia's accuracy was 80 percent compared with 95–96 percent accuracy ...
  37. [37]
    Wikipedia as a Data Source for Political Scientists: Accuracy and ...
    Apr 8, 2011 · In every case, Wikipedia provided sufficient detail in its biographies to identify every candidate's political experience at the time of each ...
  38. [38]
    Wikipedia a reliable source for political info, study says - BYU News
    Apr 13, 2011 · A peer-reviewed study by Brigham Young University political scientist Adam Brown validates Wikipedia as a reliable place to get a political education.
  39. [39]
    Wikipedia as a Data Source for Political Scientists: Accuracy and ...
    Aug 9, 2025 · When evaluating Wikipedia's accuracy, it has been observed to have more accurate and deep information about prominent leaders, long-term ...
  40. [40]
    New Study Finds Political Bias Embedded in Wikipedia Articles
    Jun 20, 2024 · Findings show that Wikipedia entries are more likely to attach negative sentiment to terms representative of right-leaning political orientation ...
  41. [41]
  42. [42]
    Is Wikipedia Politically Biased? - by David Rozado
Jun 20, 2024 · Wikipedia's editors could benefit from better tools to identify and reduce bias. Computational tools to detect biased content and adversarial ...
  43. [43]
    The wealth and regional gaps in event attention and coverage on ...
    Nov 8, 2023 · Overall, this work provides a nuanced understanding of attention and coverage on Wikipedia through event articles and adds new empirical ...
  44. [44]
    How article category in Wikipedia determines the heterogeneity of its ...
    Jan 7, 2024 · The quality of Wikipedia articles rises with the number of editors per article as well as a greater diversity among them.
  45. [45]
    [PDF] Visual Gender Biases in Wikipedia: A Systematic Evaluation across ...
    Men's Wikipedia biographies tend to have more images, and written and visual content trends are dissimilar. Less than 20% of biographies are about women.
  46. [46]
    [PDF] Controlled Analyses of Social Biases in Wikipedia Bios - arXiv
    Feb 9, 2022 · This study analyzes social biases in Wikipedia bios, focusing on gender and racial minorities, and differences in article length and language ...
  47. [47]
    Gender and country biases in Wikipedia citations to scholarly ...
    Nov 5, 2022 · We show that publications by women are cited less by Wikipedia than expected, and publications by women are less likely to be cited than those by men.
  48. [48]
    Wikipedia accused of blacklisting conservative US media - The Times
    Feb 6, 2025 · Wikipedia has been accused of blacklisting conservative media from being used as source material. The Media Research Center, a conservative ...
  49. [49]
    Reducing Bias in Wikipedia's Coverage of Political Scientists | PS
    Mar 31, 2022 · This article shows that Wikipedia's coverage of political scientists remains skewed by gender and nationality, and I suggest ways for political scientists to ...
  50. [50]
    The Covert World of People Trying to Edit Wikipedia for Pay
Aug 11, 2015 · Undisclosed paid edits, Heilman says, “often distract the core community of editors away from more important topics.” There is also a fear that ...
  51. [51]
    [PDF] Content-Based Conflict of Interest Detection on Wikipedia
We have built a CoI dataset which contains 3,280 CoI articles and 3,450 non-CoI articles, which could be used in future research on CoI detection. • We have ...
  52. [52]
    Is the PR Industry Buying Influence Over Wikipedia? - VICE
    Oct 18, 2013 · The king of these Wikipedia reputation managers is a company called Wiki-PR, that specializes in editing Wikipedia on behalf of their paying clients.
  53. [53]
    Scandals Erased, Editors Paid: How Big Law Firms Try to Control ...
    Sep 4, 2025 · A deep analysis by Law.com shows how some law firms pay editors, flout the rules, whether consciously or not, and remove controversies to ...
  54. [54]
    Editing for Hate: How Anti-Israel and Anti-Jewish Bias Undermines ...
    Mar 18, 2025 · ADL has found clear evidence that a group of at least 30 editors circumvent Wikipedia's policies in concert to introduce antisemitic narratives, anti-Israel ...
  55. [55]
    [PDF] DETECTING UNDISCLOSED PAID EDITING IN WIKIPEDIA
    As we can see from Table 5.2, our approaches (LSTM1 and LSTM2) to early detect Undisclosed Paid Editors easily outperform both AUROC and average precision ...
  56. [56]
    The use of software tools and autonomous bots against vandalism
    Sep 2, 2015 · These musings serve to underline that the trend in Wikipedia to include both bots and humans in patrolling is a fortunate one indeed—vandalism ...
  57. [57]
    Cross-Language Prediction of Vandalism on Wikipedia Using Article ...
Aug 7, 2025 · We propose detecting vandalism using a range of classifiers in a monolingual setting, and evaluated their performance when using them across ...
  58. [58]
    [PDF] How Reverts Affect the Quantity and Quality of Wikipedia Work
ABSTRACT. Reverts are important to maintaining the quality of Wikipedia. They fix mistakes, repair vandalism, and help enforce policy.
  59. [59]
    [PDF] Spatio-Temporal Analysis of Reverted Wikipedia Edits
We conduct the first systematic analysis of Wikipedia article revert graphs to identify vandalism and damaging edits after they have been undone as ground truth ...
  60. [60]
    Citogenesis: the serious circular reporting problem Wikipedians are ...
    Mar 7, 2019 · This circular referencing created a problem for proving the veracity of the information found on the Wikipedia page—where was the original ...
  61. [61]
    TechScape: When Wikipedia fiction becomes real life fact | Internet
Mar 30, 2022 · And while Wikipedia's systems, both formal and informal, do generally work to expel low-effort vandalism, falsehoods can stick around if they ...
  62. [62]
    Citogenesis - XKCD
Where Citations Come From: Citogenesis Step #1 Through a convoluted process, a user's brain generates facts. These are typed into Wikipedia.
  63. [63]
    Wikipedia hoax about a war that never happened deleted after 5 years
    Jan 5, 2013 · The Bicholim Conflict isn't the longest-running Wikipedia hoax, as The Daily Dot notes, but it still makes the top ten list. It was narrowly ...
  64. [64]
    'Bicholim Conflict' article on Wikipedia for 5 years a hoax - UPI.com
    Jan 4, 2013 · The "Bicholim Conflict" hoax ranks eighth on the list of the longest-running known hoaxes pulled off on Wikipedia, The Daily Dot said. The ...
  65. [65]
    Impact, Characteristics and Detection Of Wikipedia Hoaxes
We find that most hoaxes have negligible impact along all of these three dimensions, but that 1% of hoaxes survive for over a year, 1% receive significant ...
  66. [66]
    Experiment concludes: Most misinformation inserted into Wikipedia ...
    Apr 13, 2015 · Experiment concludes: Most misinformation inserted into Wikipedia may persist. by Gregory Kohs. A months-long experiment to deliberately insert ...
  67. [67]
    The Persistence of Retracted Papers on Wikipedia - arXiv
    Sep 22, 2025 · This paper investigates how citations to retracted research are handled on English Wikipedia. We construct a novel dataset that integrates ...
  68. [68]
    Fake Wikipedia entry on Bicholim Conflict finally deleted after five ...
    Jan 3, 2013 · Fake Wikipedia entry on Bicholim Conflict finally deleted after five years ... Wikipedia hoax, lasting just over eight years before being ...
  69. [69]
    Wikipedia's 'Goan war' unmasked as elaborate hoax - Phys.org
    Jan 8, 2013 · Wikipedia's 'Goan war' unmasked as elaborate hoax. Wikipedia is open for anyone to edit and therefore "can be abused to create hoaxes.
  70. [70]
    The story behind Jar'Edo Wens, the longest-running hoax in ...
    Apr 15, 2015 · Jar'Edo Wens was a blatant prank, a bald invention, dropped into Wikipedia nine years ago by some unknown and anonymous Australian.
  71. [71]
    Aussie's Jar'Edo Wens prank sets new record as Wikipedia's longest ...
    Mar 23, 2015 · The article, which has taken the record for the longest-lived Wikipedia hoax to date, remained unchallenged for nine years and nine months, ...
  72. [72]
    Jared Owens, God of Wikipedia - Wikipediocracy
    Mar 15, 2015 · On May 29, 2005, an anonymous editor using an Australian IP address added “Jar'Edo Wens” (an exotically punctuated “Jared Owens”) to Wikipedia's ...
  73. [73]
    Student hoaxes world's media on Wikipedia - NBC News
May 12, 2009 · Student hoaxes world's media on Wikipedia. When Dublin university student Shane Fitzgerald posted a poetic but phony quote on Wikipedia, he said ...
  74. [74]
    Wikipedia's longest hoax ever gets busted after more than 10 years
Feb 4, 2016 · Calamondin12 had stumbled on the longest-running hoax in Wikipedia history, lasting 10 years and 1 month by the time it was deleted on September 3, 2015.
  75. [75]
    The Hunt for Wikipedia's Disinformation Moles - WIRED
    Oct 17, 2022 · The research tracked 86 editors who are already banned from Wikipedia. The editors tried to sway narratives on the English-language Wikipedia ...
  76. [76]
    Top 10 Notorious Wikipedia Hoaxes - Listverse
    Nov 10, 2019 · 8Jar'Edo Wens ... This case shows just how easy it is to compose a faux article that slips through the cracks and stays on Wikipedia for years.
  77. [77]
    Congress Is Still Editing Wikipedia a Lot -- Here's How We Know
    Jul 15, 2014 · Ever since members of Congress got busted for rewriting Wikipedia articles in 2006, anonymous editors of the online encyclopedia on Capitol ...
  78. [78]
    Wikipedia blocks 'disruptive' page edits from US Congress - BBC
    Jul 25, 2014 · Wikipedia administrators have imposed a ban on page edits from computers at the US House of Representatives, following persistent disruptive editing.
  79. [79]
    Dozens of Wikipedia editors colluded on years-long anti-Israel ...
    Mar 18, 2025 · More than two dozen Wikipedia editors allegedly colluded in a years-long scheme to inject anti-Israel language on topics related to the Israeli-Palestinian ...
  80. [80]
    I Founded Wikipedia. Here's How to Fix It. - The Free Press
    Sep 30, 2025 · I launched the site in 2001. Today, it's been captured by anonymous editors who manipulate articles to fit their ideological biases.
  81. [81]
    Wikipedia's lefty bias measured in study — but I've felt it firsthand
Jun 25, 2024 · A new report from the Manhattan Institute confirms that perception, based on a computerized language study of thousands of Wikipedia articles.
  82. [82]
    Wikipedia has wrong information on top medical ailments, study finds
    May 30, 2014 · The Journal of the American Osteopathic Association compared Wikipedia articles on the 10 most expensive medical conditions to peer-reviewed ...
  83. [83]
    Study: 90 percent of health-related Wikipedia articles contain errors
    Jul 1, 2014 · A recent study published in the Journal of the American Osteopathic Association found significant errors in nine out of 10 Wikipedia ...
  84. [84]
    Measuring the quality of scientific references in Wikipedia
    When Wikipedia articles cite scientific papers that have been subsequently retracted, 50% of the retracted papers could not be identified as retracted ...
  85. [85]
    Glaring Chemical Errors Persist for Years on Wikipedia
    Jan 26, 2017 · In reality, this is not the case. Major chemical errors on Wikipedia persist for years, sometimes even after they have been reported.
  86. [86]
    Wikipedia is at war over the coronavirus lab leak theory - CNET
    Jun 27, 2021 · Wikipedia's policies and guidelines, strengthened by its two decades online, ensure misinformation and vandalism are snuffed out with great ...
  87. [87]
    Seven years after Nature, pilot study compares Wikipedia favorably ...
    Aug 2, 2012 · In 2005, Nature famously reported that Wikipedia articles on scientific topics contained just four errors per article on average, compared to ...
  88. [88]
    Why is the common knowledge resource still neglected by academics?
    Dec 3, 2019 · This article argues that it is high time not only to acknowledge Wikipedia's quality but also to start actively promoting its use and development in academia.
  89. [89]
    Wikipedia and Medicine: Quantifying Readership, Editors, and the ...
    Mar 4, 2015 · A narrow look at pharmacological articles assessed Wikipedia's accuracy to be high based on significant overlap with textbook sources [11].
  90. [90]
    Editorial Bias in Crowd-Sourced Political Information
Our results demonstrate that crowd-sourced information is subject to an editorial bias that favors the politically active.
  91. [91]
    ‪Włodzimierz Lewoniewski‬ - ‪Google Scholar‬
    Relative quality and popularity evaluation of multilingual Wikipedia articles ... 2015. Modeling popularity and reliability of sources in multilingual Wikipedia.
  92. [92]
    What journalists should know about Wikipedia – a primer - Poynter
    Jul 17, 2018 · A common question we often get is “Should I cite Wikipedia directly as a source?” The answer to that is a resounding no. Wikipedia is a resource ...
  93. [93]
    Journalists Fooled By Wikipedia Hoax. Oh Really? - NPR
May 8, 2009 · There's a really interesting article by John Timmer from Ars Technica about a Wikipedia hoax that fooled the likes of The Guardian.
  94. [94]
    Why did Wikipedia gain the reputation of an non credible source that ...
Apr 12, 2024 · ... error rate is still rather low (it was 2.2 on average for encyclopaedia Britannica, a well respected document vs 3.8 on Wikipedia, per article).
  95. [95]
    Wikipedia Is Badly Biased - LarrySanger.org
May 14, 2020 · It is in fact a controversial view that the historical accuracy of the Gospels is uncertain; others disagree, holding that, upon analysis, it is ...
  96. [96]
    Wikipedia co-founder says site has liberal bias — here's his plan to ...
    Oct 3, 2025 · Wikipedia co-founder Larry Sanger is alleging that the online encyclopedia has a left-leaning bias and has released a nine-point plan to address ...
  97. [97]
  98. [98]
    Wikipedia Is More One-Sided Than Ever - LarrySanger.org
Jun 30, 2021 · Wikipedia is supposed to be like Switzerland, proverbially speaking: not casting any side as the enemy, and certainly not taking pot-shots at one side.
  99. [99]
    THE CITATION OF WIKIPEDIA IN JUDICIAL OPINIONS
    Wikipedia has been cited in over four hundred American judicial opinions. Courts have taken judicial notice of Wikipedia content, based their reasoning on ...
  100. [100]
    Study finds Wikipedia influences judicial behavior | MIT News
    Jul 27, 2022 · A new study finds clear empirical evidence that Wikipedia influences judges' application of the law. The study was led by Neil Thompson of ...
  101. [101]
    User-Generated Content Shapes Judicial Reasoning: Evidence from ...
    Mar 19, 2024 · Wikipedia articles influence legal judgments, as we show using a first-of-its-kind randomized control trial on judicial decision making.
  102. [102]
    Court Holds That Wikipedia Entries Are “Inherently Unreliable”
Nov 24, 2008 · I think this Texas court was exactly right: “Wikipedia entries are inherently unreliable because they can be written and edited anonymously by ...
  103. [103]
    Can You Trust Wikipedia to Decide Your Courtroom Fate?
    and very advisedly — rejected any reliance on Wikipedia as a source of facts in the courtroom.
  104. [104]
    Is Wikipedia A Reliable Legal Authority? (2018 Update)
    Aug 2, 2018 · Courts in this circuit have held that Wikipedia is “an unreliable source of information” and have warned against reliance on it and other ...
  105. [105]
    Wikipedia information "undermining quality of judgments"
Feb 8, 2023 · The widespread use of online source Wikipedia by senior judges could mean fake information spreading, leading to bad judgments.
  106. [106]
    The Faith-Based Encyclopedia
The Faith-Based Encyclopedia. Away back about 1993, '94 -- in retrospect, the ... Robert McHenry is Former Editor in Chief, the Encyclopædia Britannica ...
  107. [107]
    Technology | Wikipedia study 'fatally flawed' - BBC NEWS
    Mar 24, 2006 · Encyclopaedia Britannica has hit back at research published in Nature, comparing its accuracy to Wikipedia.
  108. [108]
  109. [109]
    ORES - MediaWiki
Sep 17, 2025 · ORES (Objective Revision Evaluation Service) is a web service and API that provides machine learning as a service for Wikimedia projects ...
  110. [110]
    Artificial intelligence service "ORES" gives Wikipedians X-ray specs ...
Nov 30, 2015 · This service empowers Wikipedia editors by helping them discover damaging edits and can be used to immediately “score” the quality of any Wikipedia article.
  111. [111]
    [PDF] Detecting Wikipedia Vandalism with Active Learning ... - biz.uiowa.edu
This paper proposes an active learning approach using language model statistics to detect Wikipedia vandalism. Wikipedia is a popular and influential ...
  112. [112]
    [PDF] Context-aware Detection of Sneaky Vandalism on Wikipedia across ...
Our results show how context-aware detection techniques can become a new state-of-the-art counter-vandalism tool for Wikipedia that complements current feature ...
  113. [113]
    [PDF] On Measuring the Quality of Wikipedia Articles
ABSTRACT. This paper discusses an approach to modeling and measuring information quality of Wikipedia articles. The approach is based on the idea that the ...
  114. [114]
    Evaluating WikiTrust: A trust support tool for Wikipedia
    WikiTrust aims at helping readers to judge the trustworthiness of articles by coloring the background of less trustworthy words in a shade of orange. In this ...
  115. [115]
    Adding Trust to Wikipedia, and Beyond | MIT Technology Review
    Sep 4, 2009 · A tool called WikiTrust, which helps users evaluate information on Wikipedia by automatically assigning a reliability color-coding to text ...
  116. [116]
    Wikipedia to Color Code Untrustworthy Text - WIRED
    Aug 30, 2009 · An optional feature called “WikiTrust” will color code every word of the encyclopedia based on the reliability of its author and the length of time it has ...
  117. [117]
    [PDF] Detecting Wikipedia Vandalism using WikiTrust - Luca de Alfaro
    The goal of the automated tool is to find vandalized revisions wherever they may occur in the revision history of Wikipedia articles. This type of vandalism ...
  118. [118]
    A Large Scale Dataset for Content Reliability on Wikipedia
    Jul 11, 2021 · We provide an overview of the possible downstream tasks enabled by such data, and show that Wiki-Reliability can be used to train large-scale ...
  119. [119]
    Here are the English Wikipedia's ten longest featured articles – Diff
May 12, 2016 · The English Wikipedia has more than 4,700 featured articles at the time of writing—fewer than 0.1 percent of all articles. Featured articles ...
  120. [120]
    Wikipedia featured articles - Wikimedia Meta-Wiki
    On this page the number of featured articles on local Wikipedias together with their rating criteria are displayed. However, some wikis may have different ...
  121. [121]
    [PDF] Statistical Measure of Quality in Wikipedia
We show that non-featured articles tend to have high-quality content 74% of their lifetime and this is 86% for featured articles. Furthermore, we show that ...
  122. [122]
    [PDF] Quality Assessment of Wikipedia Articles without Feature Engineering
Jun 23, 2016 · Forest (RF) implemented by ORES, achieved the accuracy of 51%, 48% and 60% respectively. Using the feature set composed of 11 features ...
  123. [123]
    Assessing the quality of Wikipedia articles with lifecycle based metrics
    Oct 25, 2009 · In this paper we offer new metrics for an efficient quality measurement. The metrics are based on the lifecycles of low and high quality ...
  124. [124]
    A Large Scale Dataset for Content Reliability on Wikipedia
We provide an overview of the possible downstream tasks enabled by such data, and show that Wiki-Reliability can be used to train large-scale models for content ...
  125. [125]
    Movement Strategy/Recommendations - Meta-Wiki - Wikimedia.org
    Apr 23, 2025 · 1. Increase the Sustainability of Our Movement 2. Improve User Experience 3. Provide for Safety and Inclusion 4. Ensure Equity in Decision-making
  126. [126]
    Improve Knowledge Integrity - Wikimedia Research
    We have been leading projects to help our communities represent, curate, and understand information provenance in Wikimedia projects more efficiently.
  127. [127]
    WikiCredCon 2025 Tackles Credibility Threats to Wikipedia – Diff
Jan 23, 2025 · WikiCredCon 2025's theme is “Reliable Sources,” and will focus on themes including combating harassment and increasing attempts to dox editors.
  128. [128]
    Wikipedia's censorship is a threat to civilization itself - New York Post
Oct 2, 2025 · Wikipedia founder Larry Sanger made waves this week when he discussed the crowdsourced website's outright censorship of conservative voices ...
  129. [129]
    Republicans investigate Wikipedia over allegations of organized bias
    Aug 27, 2025 · Republicans on the House Oversight Committee are investigating alleged bias in Wikipedia entries.
  130. [130]
    Trump admin launches probe into Wikipedia over alleged 'bad actors'
    House Republicans launched an investigation this week into claims that US academics and foreigners are conspiring to make ...
  131. [131]
    Wikipedia co-founder Larry Sanger exposes ideological ... - Fox News
    Oct 9, 2025 · Wikipedia's co-founder on anonymous editors, why the site is biased against conservatives and how to fix it. Larry Sanger criticizes 'reliable ...
  132. [132]
    ChatGPT Is Stealing Readers From Wikipedia
    Aug 25, 2025 · Editing activity on these articles may also be declining, though the evidence there is less conclusive.