
Publish or perish

"Publish or perish" is an denoting the intense pressure on academics and researchers to continuously generate and publish scholarly outputs—primarily peer-reviewed articles—to advance their careers, obtain tenure, secure , and maintain standing, with insufficient records often resulting in stalled progress, job loss, or exclusion from the field. The phrase, which first appeared in academic discourse in the early but gained widespread recognition through sociologist Wilson's 1942 analysis of in The Academic Man: A Study in the Sociology of a Profession, reflects a systemic evaluation criterion where volume and citation metrics serve as primary proxies for scholarly merit. This imperative originated as a call to disseminate knowledge but evolved into a high-stakes survival mechanism amid expanding systems and competitive resource allocation post-World War II, where administrators increasingly relied on quantifiable outputs to assess . By the late , the advent of citation indexes and journal impact factors amplified the phenomenon, transforming it into a global norm across disciplines, though most pronounced in sciences requiring frequent incremental findings. Empirical models indicate that such incentives distort research priorities toward novel, positive results over replication or null findings, eroding the trustworthiness of published science; for instance, simulations show that pressures for positive outcomes can inflate false discovery rates even in rigorous fields. Critics highlight how "publish or perish" fosters misaligned incentives, prioritizing quantity—such as "salami slicing" minimal into multiple papers—over depth, contributing to the and a surge in retractions, with estimates suggesting up to 2% of recent papers may involve fabricated driven by publication demands. In fields like and sciences, this has tangible consequences, including biased clinical guidelines from underreported negative trials and slowed scientific progress, as underrepresented innovators producing integrative work receive fewer citations despite higher long-term impact. Despite calls for reform, such as valuing preregistration and , entrenched prestige hierarchies and funding ties to metrics perpetuate the cycle, underscoring a causal disconnect between tools and genuine advancement.

History and Origins

Etymology and Early Usage

The phrase "publish or perish" originated in the as an exhortation within circles to prioritize the timely of findings amid accelerating scientific . Earliest documented usages trace to the late , reflecting a positive imperative for scholars to document and share discoveries to ensure their integration into advancing collective knowledge, lest they risk irrelevance through delay or obscurity. For instance, a 1928 article in and invoked the expression to highlight publication's role in professional recognition and the progression of ideas in burgeoning disciplines. By 1932, historian Harold Jefferson Coolidge referenced the phrase in discussions of scientific imperatives, framing it as a rooted in the historical necessity of recording observations and experiments promptly to build upon prior work. This early connotation emphasized publication not as a burden but as an ethical duty for preservation, particularly in fields like and sciences where empirical data accumulation demanded swift communication to counter the fragmentation caused by disciplinary growth. Eugene Garfield's 1996 investigation into the phrase's "primordial reference" underscored its pre-World War II prevalence as a motivational , predating its later associations with jeopardy; Garfield, who had employed it in speeches for decades, sought its origins without pinpointing a singular source, attributing it to widespread oral and informal academic lore. These initial applications thus aligned with first principles of scientific progress: empirical validation through peer scrutiny required accessible records, fostering innovation by enabling iterative across specialized domains.

Evolution in the Post-War Era

The expansion of higher education after World War II, driven by the Servicemen's Readjustment Act of 1944 (the G.I. Bill), which provided benefits to over 2.2 million veterans by 1947 and boosted college enrollments from 1.5 million in 1940 to 2.7 million by 1950, created a surge in faculty positions and research opportunities. This growth intersected with Cold War research priorities, particularly following the Soviet Union's Sputnik launch on October 4, 1957, which catalyzed federal funding increases; the National Science Foundation's budget rose from $40 million in 1957 to $134 million by 1960, while the National Institutes of Health expanded grants for biomedical research, tying institutional support to demonstrable outputs. These investments formalized expectations that academic career progression, including tenure, hinged on publication records to justify ongoing funding amid limited permanent positions. Peer review processes, already emerging in agencies like the Office of Naval Research established in 1946, became more standardized in scientific journals post-war, with organizations such as the American Association for the Advancement of Science (AAAS) emphasizing expert evaluation to filter submissions for journals like Science, thereby elevating publications as proxies for scholarly productivity. This shift intensified competition in an era of tenure-track proliferation, where universities, facing ballooning graduate programs, used publication counts to differentiate candidates; by the late 1950s, federal mandates linked grants to peer-reviewed results, embedding output metrics into evaluations. Scientific publication volumes grew exponentially during this period, at an estimated annual rate of 5.6% from the mid-20th century, implying a doubling time of about 13 years and correlating with heightened pressures for career survival in expanded academic systems. By the 1960s, amid critiques in the academic literature of overreliance on publication counts for promotion, the "publish or perish" dynamic had solidified as a structural feature of academic careers, reflecting institutional adaptations to scarcity despite overall growth.
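As a consistency check on these figures (a standard exponential-growth calculation rather than a claim from the sources above), an annual growth rate of $r = 5.6\%$ implies a doubling time of

\[
T_2 = \frac{\ln 2}{\ln(1 + r)} = \frac{\ln 2}{\ln 1.056} \approx 12.7 \text{ years},
\]

consistent with the reported figure of roughly 13 years.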

Core Mechanisms and Incentives

Role in Tenure, Promotion, and Funding

Peer-reviewed publications constitute the principal metric for assessing research productivity in tenure evaluations at U.S. research universities, where committees review candidates' curricula vitae to quantify output such as the number of articles in high-impact journals. This emphasis emerged as a standardized practice in the latter half of the 20th century, coinciding with the post-war expansion of higher education and rising expectations for demonstrable scholarly impact, with publication requirements intensifying since the 1970s. For instance, a common benchmark in many fields remains approximately two peer-reviewed articles per year during the probationary period, though recent cohorts have averaged higher rates of 2.43 articles annually to meet escalating demands. Promotion to associate and full professor similarly hinges on sustained publication records, which serve as evidence of ongoing contributions beyond initial tenure. In a competitive landscape where full-time tenured or tenure-track positions declined from 53% of faculty roles in 1987 to 32% by 2021, this scarcity amplifies pressure on early-career academics, with tenure denial often correlating with insufficient output and leading to reliance on adjunct positions lacking job security. Publication history also informs funding allocation, as agencies evaluate principal investigators' prior records to gauge potential for successful project execution. The National Science Foundation's merit review process, centered on intellectual merit and broader impacts, incorporates assessments of the investigator's track record, including publications, as a criterion in ad hoc and panel reviews. Similarly, the National Institutes of Health considers prior publication productivity in grant review, where higher output correlates with better percentile rankings and greater subsequent citation impact of funded work. This linkage reinforces publication imperatives, as funders frequently require evidence of productivity to predict future deliverables amid finite resources.

Publication Metrics and Evaluation Systems

Publication metrics serve as standardized, quantifiable indicators of researchers' productivity and influence, embedding the "publish or perish" imperative into institutional decision-making processes such as hiring, tenure, and resource allocation. These systems emphasize countable outputs like the number of publications and citations received, often prioritizing metrics that are easily aggregated and compared over nuanced qualitative judgments. Citation counts, which tally how frequently a paper or author is referenced by others, form the foundational data for many derived indicators, reflecting perceived scholarly impact but remaining susceptible to manipulation through self-citation or coordinated referencing. Prominent among these is the journal impact factor, initially conceptualized by Eugene Garfield in 1955 as a means to rank journals by their citation influence, with systematic annual calculations beginning in 1975 via the Journal Citation Reports. The h-index, introduced by physicist Jorge E. Hirsch in 2005, complements this by measuring the largest number h of an author's publications that have each received at least h citations, aiming to balance productivity with impact while purportedly reducing the skew from outliers. These metrics operationalize evaluation by converting diverse scholarly activities into numerical scores, facilitating comparisons in departmental reviews where raw publication volumes and citation tallies frequently supersede assessments of originality or methodological rigor. In broader evaluation systems, such metrics heavily influence global university rankings: the QS World University Rankings allocate 20% of their score to citations per faculty member based on Scopus data, and Times Higher Education assigns up to 30% to research-quality components including normalized citation impact. This reliance incentivizes behaviors aligned with metric maximization, as administrators favor verifiable numbers for efficiency in high-volume assessments, though it structurally favors incremental, high-volume outputs reproducible across large teams over high-risk, paradigm-shifting work that may yield delayed or uneven citations. Empirical analyses reveal systemic inflation, with predatory journals exploiting open-access models through citation cartels and self-referencing to artificially boost scores; for instance, studies in the early 2020s documented networks of such journals engaging in reciprocal citations to infiltrate legitimate indexes like Scopus, distorting metric validity by upwards of 100% in uncorrected cases.
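These definitions are simple enough to express directly in code. The following sketch is illustrative only (the citation counts are hypothetical, and the impact-factor function implements the classic two-year formula rather than any vendor's exact procedure):

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that at least h papers each have at least h citations."""
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the top `rank` papers all have >= rank citations
        else:
            break
    return h


def impact_factor(citations_to_prior_two_years: int,
                  citable_items_prior_two_years: int) -> float:
    """Classic two-year journal impact factor: citations received in year Y to items
    published in years Y-1 and Y-2, divided by the number of citable items in those years."""
    return citations_to_prior_two_years / citable_items_prior_two_years


# Hypothetical author with seven papers: four papers have at least 4 citations each.
print(h_index([25, 8, 5, 4, 3, 1, 0]))  # -> 4
# Hypothetical journal: 1,200 citations to 400 citable items.
print(impact_factor(1200, 400))         # -> 3.0
```

Sorting once and scanning makes the h-index computation trivial; real evaluation systems differ mainly in where the citation counts come from, not in the arithmetic, which is part of why the numbers are so easily gamed.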

Positive Contributions

Incentives for Productivity and Innovation

The "publish or perish" incentive structure has driven marked increases in scientific output, correlating with the expansion of global publication volume from 2.0 million articles in to 3.3 million in 2022, reflecting heightened pressures for dissemination amid competitive evaluation systems. This growth, sustained at an average annual rate of approximately 5% over decades, has enabled faster empirical iteration in knowledge-intensive domains, where researchers under tenure and constraints prioritize timely outputs to advance incremental discoveries. In fields like , these pressures have amplified publication rates, fostering competitive environments that accelerate mapping and sequencing advancements; for instance, the average number of publications per has exhibited following initial linear phases, attributable to the imperative for rapid validation and extension of findings. Similarly, the urgency to publish under such dynamics played a pivotal role in the response, where servers facilitated unprecedented early dissemination of vaccine-related research, enabling collaborative refinements that hastened progress and regulatory approvals within months rather than years. By linking career progression to output metrics, the paradigm selectively allocates grants and resources to high-performing investigators, as longitudinal analyses of demonstrate that grant recipients produce significantly more publications and citations thereafter, optimizing returns on public investments in research productivity. This mechanism weeds out lower-output contributors through attrition, concentrating institutional support on those evidencing sustained empirical contributions and thereby enhancing overall discovery efficiency.

Acceleration of Knowledge Dissemination

The expansion of scientific journals following World War II, driven by increased federal funding for research in the United States and elsewhere, significantly accelerated the dissemination of knowledge by providing outlets for the burgeoning volume of scholarly output. This growth, which saw the number of journals multiply amid economic recovery and institutional investments in science, was intertwined with the emerging "publish or perish" imperative, as academics faced heightened pressure to document findings for tenure and grants. Prior to this era, idea suppression or prolonged delays in sharing results—often due to limited publication venues or less stringent output expectations—could sideline researchers, but the proliferation of journals democratized access, enabling broader scrutiny and validation of empirical claims across fields such as physics. In the contemporary landscape, the publish or perish culture has further hastened information flow through the adoption of preprint servers, exemplified by arXiv, founded in 1991 by physicist Paul Ginsparg at Los Alamos National Laboratory to facilitate rapid sharing of un-peer-reviewed manuscripts in physics and related disciplines. The platform, which by 2014 had amassed over 1 million submissions, emerged partly in response to career pressures compelling researchers to prioritize speed over exhaustive review processes, allowing ideas to circulate globally within days rather than months or years. Such mechanisms have advanced fields by enabling timely feedback loops; for instance, in high-stakes areas like COVID-19 research, preprints have documented breakthroughs that peer-reviewed journals later formalized, reducing the "perish" risk of overlooked contributions amid competitive environments. Empirical studies underscore the efficacy of this acceleration, with preprints consistently linked to elevated citation rates and broader reach compared to journal-only submissions. A 2021 analysis of COVID-19-related preprints found that those subsequently published garnered significantly higher post-publication citations than their preprint-stage counts, reflecting amplified visibility from early dissemination. Similarly, research from 2021 reported that 69.1% of arXiv preprints that reached peer-reviewed status received citations, often preceding formal publication and thus expediting knowledge integration across disciplines. A 2024 study further confirmed that articles debuting as preprints in select journals accumulated substantially more citations than directly submitted counterparts, attributing this to the pressure-driven imperative for swift release, which counters institutional inertia and fosters quicker empirical testing over prolonged internal deliberation. Without such mandates, historical patterns of delayed validation—evident in pre-1940s eras with fewer outlets and sporadic progress in stagnant subfields—suggest that dissemination would lag, impeding causal chains of scientific advancement.

Criticisms and Unintended Consequences

Prioritization of Quantity Over Quality

The "publish or perish" paradigm incentivizes academics to maximize publication volume to secure tenure, promotions, and funding, often at the expense of rigorous, substantive research. This pressure encourages practices such as "salami slicing," where researchers divide the results of a single study into multiple minimally distinct publications to inflate output counts, thereby evaluation metrics without advancing knowledge proportionally. Such fragmentation distorts the scientific record, as overlapping papers redundantly cite similar datasets while diluting scrutiny of underlying methods. Empirical evidence underscores this shift toward superficiality, with meta-analyses revealing widespread irreproducibility tied to incentives. In , a 2015 large-scale replication effort by the Collaboration tested 100 studies originally published in top journals between 2008 and 2010; only 36% yielded statistically significant results upon replication, compared to 97% in the originals, attributing much of the discrepancy to selective reporting and underpowered designs driven by the need for novel, positive findings. A 2025 survey of over 1,600 researchers found that 62% viewed pressure as a frequent contributor to irreproducibility across fields, as it favors incremental or null results packaged as breakthroughs over comprehensive, falsifiable investigations. Under these constraints, rational researchers prioritize outputs that meet journal thresholds for novelty and , often sidelining deeper inquiries that risk null outcomes or extended timelines. This systemic gaming challenges academia's meritocratic ethos, as metrics like reward trendy, high-volume topics amenable to rapid cycles, even if they yield less causal insight than rigorous, resource-intensive work. In biomedical fields, for instance, the emphasis on quantity has been linked to shallower analyses, with studies showing that counts correlate more with advancement than with per , perpetuating a cycle of diluted contributions.

Ethical Lapses and Scientific Misconduct

The pressure to publish under the "publish or perish" paradigm has been empirically linked to increased rates of research misconduct, including data fabrication and falsification. A study analyzing publication pressures found a positive correlation between perceived pressure to publish and researchers' intentions to engage in future misconduct, with surveys indicating that such incentives distort research practices. Retraction rates for scientific papers have risen dramatically, from approximately 1 in 5,000 papers in the early 2000s to 1 in 500 by 2023, a tenfold increase often attributed to career desperation amid output demands, as documented by analyses tying retractions to competitive funding and tenure incentives. Questionable research practices such as p-hacking—manipulating analyses to achieve statistical significance—and HARKing (hypothesizing after results are known) are prevalent under these pressures, with surveys of researchers revealing admission rates of around 50% for selectively reporting unexpected findings as if predicted, a form of HARKing, and similar frequencies for p-hacking behaviors in fields like ecology and evolutionary biology. These practices undermine scientific reliability by prioritizing novel, significant results over robust evidence, directly incentivized by evaluation systems that reward publications in high-impact journals. High-profile fraud cases in the 2020s illustrate the escalation, including widespread use of AI-generated fake data in papers from paper mills, which a 2025 analysis estimated are doubling in output every 1.5 years and infiltrating peer-reviewed literature through fabricated datasets and authorship. Such fraud is exacerbated by "publish or perish" dynamics, where rapid output demands outpace verification, leading to retractions for AI-synthesized images and results in biomedical fields. Publication pressures also amplify ideological biases in peer review, particularly in the social sciences, where left-leaning norms among reviewers can suppress dissenting or conservative-leaning findings, as evidenced by models showing how political bias distorts research from formulation to acceptance, reducing publication of ideologically incongruent results. Reviewers, motivated by prosocial commitments to prevailing views, may censor unaligned work, fostering underreporting of results that challenge dominant paradigms on contested topics, where empirical challenges to favored narratives face higher rejection rates. This selective gatekeeping, incentivized by career advancement tied to consensus-aligned outputs, erodes truth-seeking by privileging ideological fit over empirical rigor.
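The statistical mechanism behind p-hacking is easy to demonstrate. The sketch below is a minimal, illustrative simulation (arbitrary parameters, pure-null data, and a large-sample z-test standing in for whatever test a real study would use): if a researcher measures 20 independent outcomes with no true effect and reports any that reach p < 0.05, roughly 1 − 0.95^20 ≈ 64% of such studies will yield a "significant" finding.

```python
import math
import random

random.seed(0)


def two_sided_p(group_a: list[float], group_b: list[float]) -> float:
    """Two-sided p-value from a large-sample z-test on the difference in group means."""
    n_a, n_b = len(group_a), len(group_b)
    mean_a, mean_b = sum(group_a) / n_a, sum(group_b) / n_b
    var_a = sum((x - mean_a) ** 2 for x in group_a) / (n_a - 1)
    var_b = sum((x - mean_b) ** 2 for x in group_b) / (n_b - 1)
    z = (mean_a - mean_b) / math.sqrt(var_a / n_a + var_b / n_b)
    return math.erfc(abs(z) / math.sqrt(2))  # P(|Z| > z) for a standard normal


def run_study(n_outcomes: int = 20, n_per_group: int = 50) -> bool:
    """One simulated study with NO true effect: every outcome is drawn from the same
    distribution in both groups. Returns True if any outcome reaches p < 0.05."""
    for _ in range(n_outcomes):
        a = [random.gauss(0.0, 1.0) for _ in range(n_per_group)]
        b = [random.gauss(0.0, 1.0) for _ in range(n_per_group)]
        if two_sided_p(a, b) < 0.05:
            return True
    return False


n_studies = 2000
false_positives = sum(run_study() for _ in range(n_studies))
print(f"Share of null studies reporting a 'significant' result: {false_positives / n_studies:.0%}")
```

Preregistration and Registered Reports blunt exactly this mechanism by fixing the tested outcome before the data are seen.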

Psychological and Institutional Toll

The unrelenting pressure to publish has been linked to elevated rates of burnout and stress among academics, with surveys indicating that approximately 70% of U.S. faculty report high levels of stress attributable to factors including the "publish or perish" culture. During the early 2020s, particularly amid pandemic disruptions, 65% of academic respondents reported at least one symptom of burnout, such as exhaustion or depersonalization, often exacerbated by publication demands. Early-career researchers face disproportionately higher psychological strain, with studies showing that publication pressure correlates strongly with work-life interference and burnout risk in this group, leading to symptoms like anxiety and reduced productivity. This individual toll manifests institutionally through high attrition rates, as nearly 50% of scientists exit academia within 10 years of publishing their first paper, driven by the cumulative strain of sustained output requirements and precarious career prospects. The incentives favoring rapid output over depth contribute to systemic inefficiencies, such as the overproduction of adjunct faculty; by the early 2020s, nontenure-track positions comprised around 65-75% of U.S. faculty roles, with only 24% of faculty holding full-time tenured appointments in 2021. This shift erodes institutional continuity of expertise, as short-term contracts discourage long-term investment in specialized knowledge and mentorship, fostering a transient workforce ill-suited to sustained scholarly advancement. Perverse incentives thus prioritize quantifiable metrics like publication counts, undermining the development of enduring expertise essential for addressing complex, long-horizon problems.

Variations Across Disciplines and Regions

Differences in STEM Versus Humanities and Social Sciences

In STEM fields, the "publish or perish" imperative often manifests through high-volume output of short-cycle journal articles, where rapid iteration and incremental findings are feasible and rewarded. In fields such as experimental physics, for instance, researchers can produce dozens of publications annually through collaborative, data-driven projects with quick turnaround, typically under a year from submission to acceptance. This structure incentivizes frequent dissemination but heightens the risk of errors or corner-cutting in fast-paced environments, as evidenced by higher retraction rates in some high-output STEM subfields compared to slower disciplines. In contrast, humanities scholarship emphasizes fewer, in-depth monographs or books, which require years of research and synthesis and are often produced singly rather than in multiples. Tenure and promotion criteria in fields like history or literature prioritize a single monograph from a prestigious press over numerous articles, leading to perceptions of lower productivity under quantity-focused metrics. This mismatch undervalues humanities work in systems calibrated for STEM-style outputs, contributing to chronic underfunding: federal support covered only 9% of research and development spending in the humanities in 2019, versus dominant allocations to STEM. The social sciences occupy an intermediate position, blending article-based outputs with some emphasis on monographs, yet face amplified pressures from replication challenges and selective reporting. Large-scale replication efforts have shown replication rates as low as 39% in psychology and 61% in economics, often due to incentives favoring statistically significant or "positive" results over null findings. Publication bias exacerbates this, with strong results 40 percentage points more likely to be published than null ones, potentially prioritizing narrative-conforming studies amid documented ideological homogeneity in these fields. These disciplinary variances extend to evaluative metrics, with the natural sciences garnering citation rates up to six times higher than the humanities, reinforcing funding and prestige imbalances that disadvantage slower, interpretive scholarship. Such disparities highlight how uniform "publish or perish" pressures distort field-specific norms, sidelining rigorous but less quantifiable contributions in non-STEM areas.

Global Perspectives and Policy Responses

In the United States and Europe, the "publish or perish" culture persists strongly, sustained by institutional reliance on bibliometric rankings from systems like the Research Excellence Framework in the UK and similar evaluation frameworks across the continent, which prioritize publication volume and journal impact factors for funding and tenure decisions. This entrenchment contrasts with policy shifts in China, where authorities implemented national guidelines in February 2020 to dismantle perverse incentives, directing evaluators to assess research quality, societal impact, and innovation over sheer output quantity in hiring, promotions, and grants. The reform sought to mitigate issues like fragmented publications and authorship inflation, though implementation varies by institution, with early reports indicating reduced pressure on senior researchers but ongoing challenges in enforcement. Developing nations importing Western metrics have seen surges in predatory journal usage, as researchers face imported pressures without robust local safeguards, leading to heightened vulnerability. India accounts for the highest global share of predatory publications, with empirical analyses showing it topping lists of countries contributing over 10% of output to questionable outlets, followed by other lower-income nations in Southeast Asia and the Middle East. Publication fraud hotspots correlate with these trends, as evidenced by retraction data: China led worldwide in retracted papers through 2024, often linked to misconduct under output mandates, while cross-country studies highlight elevated misconduct rates in regions with rapid metric adoption. Recent international responses underscore the toll, including researcher burnout and ecosystem strain. In October 2025, an international report urged sector-wide reforms to counter "publish or perish" dynamics, emphasizing the need to prioritize quality, openness, and sustainability amid universal pressures eroding trust in scholarship. This call aligns with broader global critiques, revealing how uniform incentives exacerbate fraud and inefficiency irrespective of region, prompting calls for diversified evaluation in policy frameworks worldwide.

Proposed Reforms and Alternatives

Shifts in Evaluation Criteria

Proposals for shifting evaluation criteria in academia emphasize qualitative assessments of research contributions over quantitative metrics such as publication counts or journal impact factors, aiming to foster deeper scholarly impact and mitigate incentives for superficial output. The San Francisco Declaration on Research Assessment (DORA), launched in 2012 following a meeting organized by the American Society for Cell Biology, advocates evaluating researchers based on the substantive quality and influence of their work rather than proxy indicators like citation indices, which can be distorted by practices such as citation cartels—mutual citation networks among collaborating groups that artificially inflate metrics. DORA's recommendations, expanded through implementation resources in the 2020s, promote assessments that prioritize causal validity, methodological rigor, and broader societal contributions, countering biases where popularity or network effects overshadow empirical substantiation. Narrative CVs represent a practical alternative, replacing tabular lists of outputs with descriptive accounts of a researcher's diverse activities, including mentoring, public engagement, and interdisciplinary collaborations, to provide context for qualitative review. Adopted in grant evaluations by several national funders, including the Canadian Institutes of Health Research, since the late 2010s, these formats encourage evaluators to weigh achievements against career stage and institutional constraints, reducing overreliance on countable metrics prone to gaming. Similarly, contribution portfolios compile evidence of impacts—such as influence on policy or the creation of datasets and tools—beyond publication volume, as piloted in university promotion processes to highlight qualitative significance. The U.S. National Science Foundation (NSF) exemplifies such shifts through its Broader Impacts criterion, formalized in the 2010 America COMPETES Reauthorization Act and refined in 2012 guidelines to assess societal benefits alongside intellectual merit, including education, public engagement, and broadening participation, without mandating publication quotas. Post-2010s implementations have sustained researcher productivity, with NSF-funded projects demonstrating output rates comparable to earlier eras while emphasizing verifiable real-world applications over metric accumulation, as evidenced in longitudinal reviews of grant outcomes. These experiments underscore a commitment to causal realism in evaluation, favoring evidence of reproducible findings and practical utility over citation-based proxies susceptible to cartel manipulations.

Emerging Models and Institutional Experiments

In response to the pressures of the publish-or-perish paradigm, preprint servers such as arXiv and bioRxiv have gained traction as mechanisms for rapid dissemination without the delays of traditional peer review, allowing researchers to share findings early and reduce the fear of being "scooped" while awaiting journal acceptance. These platforms, operational since the 1990s but surging in use during the 2020s, enable provisional publication that counts toward career evaluations in fields like physics and biology, fostering incremental progress over withheld results. Complementing this, dedicated replication journals and formats like Registered Reports—where hypotheses and methods are peer-reviewed before data collection—prioritize verification over novelty, countering publication bias by valuing confirmatory work equally. The slow science movement, articulated in a 2010 manifesto and gaining visibility during the 2020s, advocates deliberate pacing to emphasize depth, ethical reflection, and societal impact over publication volume, with proponents arguing that rushed outputs undermine causal validity and long-term knowledge accumulation. Labs and institutions experimenting with output caps, such as proposals limiting researchers to two or three papers annually, aim to redirect effort toward rigorous work rather than metric-chasing, with informal reports from adopters indicating enhanced focus and reduced burnout. This approach treats high-volume output as an unreliable proxy for truth, instead incentivizing sustained verification of claims through extended timelines. Institutional experiments include policy shifts tested at universities, such as a 2025 university-led analysis in the Proceedings of the National Academy of Sciences proposing realignments in evaluation to favor reproducible outcomes over journal prestige, with simulations showing potential gains in scientific progress by mitigating incentive misalignments. Preliminary assessments from pilots, including data-sharing mandates and modular publishing platforms, report improved researcher morale—measured via surveys indicating 20-30% reductions in perceived pressure—and higher-quality outputs, as evidenced by lower retraction rates in participating cohorts. These reforms, grounded in empirical critiques of volume-driven systems, prioritize causal mechanisms like thorough replication to sustain credible knowledge advancement. Beyond institutional reforms, a small number of philosophical and publishing experiments have tried to respond to publish-or-perish pressures by rethinking who or what counts as an author. The Aisentica Research Group, for example, credits an AI-based Digital Author Persona named Angela Bogdanova as a public author with its own ORCID iD and a Zenodo DOI describing the persona's role in scholarly communication. In this configuration, the author is a stable non-human identity whose texts are archived, cited, and tracked through standard metadata infrastructures, but without a career trajectory, tenure clock, or salary tied to publication counts. Proponents present such experiments as a way to expose how publish-or-perish logics attach to identifiers and metrics rather than to individual biographies, and to test whether evaluation systems can accommodate non-human contributors. Although these projects remain marginal and contested, they highlight that future reforms may need to consider not only how much humans publish but also what kinds of entities are allowed to participate in metric-driven scholarly ecosystems.

Cultural and Media Representations

Depictions in Literature and Film

In James Hynes' Publish and Perish: Three Tales of Tenure and Terror (1997), the "publish or perish" imperative manifests as a catalyst for supernatural horror, with protagonists—an untenured medievalist, a pompous department chair, and a third embattled professor—confronting nightmarish fates intertwined with their frantic bids for publications and promotions. The novellas lampoon the petty rivalries and existential dread of academic life, portraying scholarly ambition as a descent into absurdity and peril. The 2023 thriller film Publish or Perish, written and directed by David Liban, dramatizes a university professor's fixation on securing tenure through relentless output, culminating in an accidental killing and desperate concealment that unravels his life. This depiction underscores the ethical erosion possible under publication quotas, framing the pressure as a driver of criminal conduct rather than mere professional failure. Earlier cinematic explorations, such as the 1973 film The Paper Chase and its 1978–1986 television series, illustrate analogous high-stakes academic environments in law schools, where faculty survival hinges on intellectual dominance and output; a 1984 episode explicitly invokes "publish or perish" to describe professors' imperative to produce amid cutthroat evaluations. These works amplify competitive rituals for dramatic tension, yet reflect observed patterns of institutional gatekeeping without endorsing the dynamics as benign. Such representations often heighten the "publish or perish" stakes for satirical or horrific effect, depicting moral compromise or psychological breakdown as plausible extensions of output-driven cultures, while occasionally nodding to the rare triumphs of prolific scholars who navigate the system unscathed—though negative outcomes predominate to critique unchecked incentives.

Public Discourse and Critiques

In recent years, journalistic coverage has reframed the "publish or perish" dynamic as "publish and perish," emphasizing its role in precipitating burnout epidemics among academics amid escalating output expectations. A September 2025 QS analysis documented how these pressures have intensified since the early 2020s, correlating with higher rates of work-related stress, mental-health disorders, and faculty attrition, as universities demand not only survival-level publications but sustained high-volume production for rankings and funding. This shift reflects broader scrutiny of how metric-driven evaluations, originally intended to spur productivity, now correlate with diminished researcher well-being, with surveys indicating over 50% of early-career academics report chronic exhaustion tied to publication quotas. Conservative-leaning critiques portray the paradigm as inducing market-like distortions in a publicly subsidized sector, where quantity incentives yield inefficient, low-impact outputs at taxpayer expense rather than genuine discovery. A 2024 City Journal examination argued that "publish or perish" perpetuates a flood of marginal scholarship, diverting resources from rigorous inquiry and eroding institutional value in environments reliant on government grants exceeding $40 billion annually in the U.S. alone for academic research. Such analyses, often from think tanks skeptical of academic expansion, highlight causal failures like salami-slicing publications to inflate counts, contrasting with unsubsidized sectors where competition weeds out subpar work more effectively. Meanwhile, left-leaning commentary frequently prioritizes equity-focused adjustments, such as diversifying evaluation criteria to boost underrepresented scholars' outputs, while downplaying how unaltered incentive structures perpetuate risks irrespective of demographics. Post-2000 scandals, including more than 35,000 retractions documented in the Retraction Watch database from 2001–2024, have amplified public discourse on trust erosion, with misconduct and errors comprising the majority of cases and coinciding with a 20–30% decline in lay confidence per Gallup polling trends since 2001. These incidents, from fabricated data in high-profile journals to irreproducible results, underscore demands for disinterested rigor over output metrics, as media outlets—prone to institutional biases—have variably amplified or contextualized them to preserve faith in expert consensus.

References

  1. [1]
    Publish or perish: Where are we heading? - PMC - NIH
    “Publish or perish” is now becoming the way of life. It is race to get more and more publications to one's credit. The current trend is forcing scientists to ...
  2. [2]
    The misalignment of incentives in academic publishing and ... - PNAS
    This has led to a “publish or perish” culture in academia as well as publication bias: Researchers face significant expectations to continuously produce and ...
  3. [3]
    Origin and evolution of the "publish or perish" phenomenon
    Jun 4, 2024 · His searches and consultations with professors and librarians led to the work of Logan Wilson (1942), which at that time was the oldest known ...
  4. [4]
    Publish or Perish: How to Survive in Academia | Scribendi
    The phrase publish or perish dates all the way back to 1942, when a sociologist named Logan Wilson used it in a book studying academia as a career. At the ...
  5. [5]
    Tracing the origins of 'publish or perish' - Impact of Social Sciences
    Jul 15, 2024 · The origin of the phrase “publish or perish” was first questioned by Eugene Garfield in 1996). He wrote that he had used the phrase in his ...
  6. [6]
    Publish or perish, information overload, and journal impact factors
    Indeed, academics are generally unaware of the factors that influence one's ability to publish: The drive to publish itself, readers' information overload, and ...
  7. [7]
    Modelling science trustworthiness under publish or perish pressure
    Jan 10, 2018 · This analysis suggests that trustworthiness of published science in a given field is influenced by false positive rate, and pressures for positive results.
  8. [8]
    Modelling science trustworthiness under publish or perish pressure
    There is, however, concern that rewarding scientists chiefly on publication creates a perverse incentive, allowing careless and fraudulent conduct to thrive, ...
  9. [9]
    The 'publish or perish' mentality is fuelling research paper retractions
    Oct 3, 2024 · Recent evidence indicates the constant pressure to generate data and publish papers may be affecting the quality of research and fuelling ...
  10. [10]
    The Source / The Dangers of Medical Academia's 'Publish or Perish ...
    Feb 21, 2024 · The phrase 'publish or perish' has become increasingly common in academia as a way to describe the (often unachievable and unsustainable) expectations placed ...
  11. [11]
    How the publish-or-perish principle divides a science: the case of ...
    Dec 17, 2020 · The publish-or-perish principle can have benefits, such as the possibility to make the meritocratic principles do their work and be less ...
  12. [12]
    Scientific Utopia: II. Restructuring Incentives and Practices to ...
    Prior reports demonstrate how these incentives inflate the rate of false effects in published science. When incentives favor novelty over replication, false ...
  13. [13]
  14. [14]
    The “Publish or Perish” Phenomenon: Origins and Evolution
    Jan 19, 2025 · In a positive sense, the term "publish or perish" was used by Logan Wilson (1941) of the University of Maryland. On page 353 of his article you ...
  15. [15]
    What Is The Primordial Reference For The Phrase 'Publish Or Perish'?
    Jun 9, 1996 · 10, 1996 COMMENTARY What Is The Primordial Reference For The Phrase 'Publish Or Perish'? By Eugene Garfield ... Eugene Garfield. This person ...
  16. [16]
  17. [17]
    [PDF] William T. Golden
    Waterman was appointed chief scientist at ONR shortly after that agency was created in 1946, and in that position is credited with establishing the peer review ...
  18. [18]
    [PDF] Scientific Autonomy, Public Accountability, and the Rise of “Peer ...
    Sep 20, 2018 · In this essay, I argue that the vision of peer review as a process central to science can be traced to the Cold War United States, where various ...
  19. [19]
    The rate of growth in scientific publication and the decline in ... - NIH
    The data indicated a growth rate of about 5.6% per year and a doubling time of 13 years. The number of journals recorded for 1950 was about 60,000 and the ...
  20. [20]
    Preparing for tenure at a research-intensive university - PMC - NIH
    Peer-reviewed publications are the coin of the realm and the primary metric used to judge research output during tenure evaluation and beyond. Be sure to know ...
  21. [21]
    [PDF] The Book as the Gold Standard for Tenure and Promotion in the ...
    The publication record of faculty achieving tenure has increased since the 1970s, suggesting that requirements for promotion and tenure in CIC schools have ...
  22. [22]
    How Many Peer-Reviewed Articles Do You Need to Earn Tenure?
    Feb 14, 2024 · While older cohorts published about 1.58 articles/year, recent cohorts publish about 2.43, but the standard of two articles/year still applies.
  23. [23]
    Why we publish where we do: Faculty publishing values and their ...
    We explore priorities for publishing decisions and their perceived importance within review, promotion, and tenure (RPT).
  24. [24]
    Brief Overview of U.S. Faculty Hiring Trends - Higher Education Today
    May 1, 2023 · In contrast, full-time tenured or tenure-track faculty declined from 53 percent in 1987 to 32 percent in 2021. Figure 2 looks at how 67 percent ...
  25. [25]
    Overview of the NSF Proposal and Award Process - Funding at NSF
    All proposals submitted to NSF are reviewed using two merit review criteria: intellectual merit and broader impacts. Proposals involving high-risk, high-payoff ...
  26. [26]
    Prior Publication Productivity, Grant Percentile Ranking, and ... - NIH
    Sep 12, 2014 · Determine whether measures of investigator prior productivity predict a grant's subsequent scientific impact as measured by normalized citation metrics.
  27. [27]
    An index to quantify an individual's scientific research output - PNAS
    I propose the index h, defined as the number of papers with citation number ≥h, as a useful index to characterize the scientific output of a researcher.
  28. [28]
    [PDF] The History and Meaning of the Journal Impact Factor
    Sep 16, 2005 · I first mentioned the idea of an impact factor in Science magazine in 1955.1. That paper is considered the primordial reference for the ...
  29. [29]
    QS World University Rankings: Methodology - TopUniversities
    Jun 12, 2025 · In each of our rankings we use a range of measurements as part of our methodology. These can be split into three broad groups.
  30. [30]
    World University Rankings 2025: methodology
    Sep 23, 2024 · Research quality: 30% · Citation impact: 15% · Research strength: 5% · Research excellence: 5% · Research influence: 5%.
  31. [31]
    A proposed framework to address metric inflation in research ...
    We propose a method to renormalize research metrics. Our renormalized metrics aim to remove the incentive for researchers to prioritize quantity.
  32. [32]
    THE CITATION TRAP: HOW PREDATORY JOURNALS DISTORT ...
    Apr 30, 2025 · This paper investigates how these journals manipulate citation counts, often through self-citation rings, citation cartels, and fraudulent ...
  33. [33]
    Methods to account for citation inflation in research evaluation
    Our results show that measurement errors upwards of 100% of the traditional nominal citation value can arise when citations are not deflated properly, which is ...
  34. [34]
    Publication Output by Region, Country, or Economy and by Scientific ...
    Dec 11, 2023 · In absolute numbers, the growth in worldwide annual publication output (from 2.0 million in 2010 to 3.3 million in 2022) was driven in ...
  35. [35]
    Publications Output: U.S. Trends and International Comparisons | NSF
    Dec 11, 2023 · Global publication output reached 3.3 million articles in 2022, based on data from the Scopus database of S&E publications.
  36. [36]
    The Peer Review Process: Past, Present, and Future
    Jun 16, 2024 · Researchers are under increasing pressure to “publish or perish,” an expression that implies that the number and quality of publications is a ...
  37. [37]
    Temporal patterns of genes in scientific publications - PMC
    Jul 17, 2007 · After initial linear growth at rate k3, the average number of publications per gene grows approximately exponentially at rate k1 + k2.
  38. [38]
    The evolving role of preprints in the dissemination of COVID-19 ...
    Our results highlight the unprecedented role of preprints and preprint servers in the dissemination of COVID-19 science and the impact of the pandemic on the ...
  39. [39]
    The experiences of COVID-19 preprint authors - PubMed Central - NIH
    The COVID-19 pandemic caused a rise in preprinting, triggered by the need for open and rapid dissemination of research outputs.
  40. [40]
    The Impact of Research Grant Funding on Scientific Productivity - PMC
    In this paper, we estimate the impact of receiving an NIH grant on subsequent publications and citations. Our sample consists of all applications ...
  41. [41]
    Publish or Perish: Selective Attrition as a Unifying Explanation for ...
    Previous work has rarely used measures of scientific productivity that extend beyond publication counts (perhaps weighted by some measure of journal quality) or ...
  42. [42]
    How Academic Science Gave Its Soul to the Publishing Industry
    This international boost in research support in turn fed explosive growth in scientific publication. Many journals at the time were financially stressed and ...
  43. [43]
    The Past, Present, and Future of Academic Publishing - Ben Heil
    Oct 24, 2022 · However, the publication system we have today is largely a result of the expansion and codification of academia in the decades after WWII.
  44. [44]
    350 years of scientific periodicals | Notes and Records - Journals
    Jul 15, 2015 · Despite the rise in specialist journals—especially after World War II—neither Transactions nor Proceedings split beyond A and B, retaining a ...
  45. [45]
    The arXiv preprint server hits 1 million articles | Nature
    Dec 30, 2014 · Paul Ginsparg founded arXiv in 1991. Credit: Courtesey of John D. and Catherine T. MacArthur Foundation. The popular preprint server arXiv ...
  46. [46]
    arXiv announces its first executive director
    arXiv was founded in 1991 by physicist Paul Ginsparg and is a groundbreaking invention that popularized free, open-access scholarly research.
  47. [47]
    Publication rate and citation counts for preprints released during the ...
    Mar 3, 2021 · As well, we found that published preprints had a significantly higher citation count after publication in a scholarly journal compared to as a ...
  48. [48]
    Reaping the benefits of open science in scholarly communication
    Dec 9, 2021 · In addition, preprints are readily and frequently cited. For instance, 69.1% of all preprints posted in arXiv and subsequently published as peer ...
  49. [49]
    Preprints vs. Journal Articles: Citation Impact in COVID-19 Research
    Nov 11, 2024 · This study demonstrates that articles initially distributed as preprints in these journals tend to receive substantially more citations than directly submitted ...
  50. [50]
    Salami publication: definitions and examples - PMC - NIH
    Oct 15, 2013 · ... salami slicing, scientific misconduct. Introduction. In the last issue of Biochemia Medica, Research Integrity Corner presented the ethical ...
  51. [51]
    Salami publication - The Embassy of Good Science
    Mar 27, 2021 · Salami publication (also known as "salami slicing") is characterized by the spreading of study results over more papers than necessary.
  52. [52]
    Salami Slicing: clarifying common misconceptions for social science ...
    Jun 16, 2022 · The term Salami Slicing is used often within academia to refer to the needless separation of a single research study, attached dataset, and argument.
  53. [53]
    Estimating the reproducibility of psychological science
    Aug 28, 2015 · Overall, this analysis suggests a 47.4% replication success rate. This method addresses the weakness of the first test that a replication in ...
  54. [54]
    Over half of psychology studies fail reproducibility test - Nature
    Aug 27, 2015 · Whereas 97% of the original studies found a significant effect, only 36% of replication studies found significant results. The team also found ...
  55. [55]
    'Publish or perish' culture blamed for reproducibility crisis - Nature
    Jan 20, 2025 · Sixty-two per cent of respondents said that pressure to publish “always” or “very often” contributes to irreproducibility, the survey found.
  56. [56]
    "Publish or Perish" Promotes Medical Literature Quantity Over Quality
    When it comes to medical research, incentives align to promote "publish or perish." This results in quantity over quality. A solution is to change the goal ...
  57. [57]
    Publish or be ethical? Publishing pressure and scientific misconduct ...
    Dec 18, 2020 · A small positive correlation between perceived publication pressure and intention to engage in scientific misconduct in the future was found. In ...
  58. [58]
  59. [59]
    Our survey found 'questionable research practices' by ecologists ...
    Apr 9, 2018 · We talk though some of these below. Read more: How we edit science part 2: significance testing, p-hacking and peer review. It's fraud! It's not ...
  60. [60]
    The Extent and Consequences of P-Hacking in Science - PMC - NIH
    Mar 13, 2015 · The p-curve can, however, be used to identify p-hacking, by only considering significant findings [14]. If researchers p-hack and turn a truly ...
  61. [61]
    Fraudulent Scientific Papers Are Rapidly Increasing, Study Finds
    Aug 4, 2025 · A statistical analysis found that the number of fake journal articles being churned out by “paper mills” is doubling every year and a half.
  62. [62]
    Scientific Study Exposes Publication Fraud Involving Widespread ...
    Jun 23, 2025 · The study warns of the dangers posed by the spread of AI-generated and falsely attributed academic articles, which threaten the integrity of ...
  63. [63]
    [PDF] A Model of Political Bias in Social Science Research - Sites@Rutgers
    Mar 9, 2020 · Political bias can slip in and distort the research process and scientific pursuit of truth at many stages, influencing who becomes an academic ...
  64. [64]
    Prosocial motives underlie scientific censorship by scientists - PNAS
    Nov 20, 2023 · More generally, scholars inadvertently suppress ideas they personally deem uninteresting or unimportant and thus unworthy of publication.
  65. [65]
    [PDF] Implications of ideological bias in social psychology on clinical ...
    Nov 12, 2019 · Ideologically driven scientific agendas exclude competing views and would therefore impair scientific progress. Aside from the issues ...
  66. [66]
    Too stressed for success: The academic epidemic
    Apr 18, 2023 · Stress in academia stems from "publish or perish" culture, long hours, and the pandemic, with 70% of US faculty reporting stress.
  67. [67]
    Pandemic burnout is rampant in academia - ResearchGate
    Aug 9, 2025 · 3 Focusing on the latter, 65% of respondents reported at least one symptom of burnout, 17% reported persistent burnout, and 17% reported ...
  68. [68]
    Burnout Profiles Among Young Researchers: A Latent Profile Analysis
    Additionally, work-life interference and perceived publication pressure seemed the most significant predictors of burnout risk, while meaningfulness, social ...
  69. [69]
    high workload and other pressures faced by early-career researchers
    Jun 17, 2019 · Stress and long working hours are regrettably common among early-career researchers, reveals a survey by the Young Academy of Europe.
  70. [70]
    Almost 50% of Scientists Are Leaving Academia Within 10 Years
    Oct 11, 2024 · A new study published in Higher Education has found that nearly 50% of scientists leave academia within 10 years of publishing their first paper.
  71. [71]
    [PDF] The changing academic workforce - TIAA
    Nontenure-track faculty are nearly twice as likely to be teaching part-time as full-time. In 2016, nontenure-track positions comprised 65% of all faculty ...
  72. [72]
    Data Snapshot: Tenure and Contingency in US Higher Education
    About 24 percent of faculty members in US colleges and universities held full-time tenured appointments in fall 2021, compared with about 39 percent in fall ...
  73. [73]
    Contingent Appointments and the Academic Profession | AAUP
    Updated report examines the costs to academic freedom incurred by the current trend toward overreliance on part and full-time non-tenure-track faculty.
  74. [74]
    [PDF] Publication Rates in 192 Research Fields of the Hard Sciences
    Finally, Physics FIS/01 (Experimental physics) includes a professor with an average of over. 100 publications per year. In effect, this SDS consists of a ...
  75. [75]
    Academic writing & publishing is vastly different in STEM vs ...
    Jun 1, 2022 · The publication process is typically shorter than in the humanities. The time from submission to acceptance can last less than a year and it's ...
  76. [76]
    Large disparity in paper retraction rates between STEM fields
    Oct 8, 2018 · In my area of engineering, ~40 solid journal papers, a stream of conference papers, and ~1000-1500 citations is a very good total for a whole ...
  77. [77]
    Peer reviewed journal articles and monographs in the academic ...
    Feb 24, 2011 · In humanities disciplines, the common sense is that you must publish a monograph with a good press. This may be a university press. Or there may ...
  78. [78]
    Want to be taken seriously as scholar in the humanities? Publish a ...
    Sep 30, 2014 · Longer than an article and mostly intended to be read by fellow academics, the monograph presents primary research and original scholarship.
  79. [79]
    Editorial: Humanities and social sciences are undervalued
    Mar 26, 2025 · In 2019, federal funding covered only 9% of all research and development spending in the humanities, while education and STEM fields received ...
  80. [80]
    A New Replication Crisis: Research that is Less Likely to be True is ...
    May 21, 2021 · In psychology, only 39 percent of the 100 experiments successfully replicated. In economics, 61 percent of the 18 studies replicated as did 62 ...
  81. [81]
    Edward Miguel on the “Replication Crisis” in Economics and How to ...
    Sep 28, 2021 · So, in some of the psychology studies, they find that “only a third of studies replicate,” meaning they get something even close to the ...
  82. [82]
    Publication Bias in the Social Sciences: Unlocking the File Drawer
    Aug 10, 2025 · Strong results are 40 percentage points more likely to be published than are null results and 60 percentage points more likely to be written up.
  83. [83]
    Publication bias in the social sciences since 1959 - PubMed Central
    Feb 14, 2025 · Publication bias can be described as a bias in favor of predominantly publishing significant, hypothesis-conforming results. It has been shown ...
  84. [84]
    Poor citation practices are continuing to harm the humanities and ...
    Dec 9, 2014 · In the 'natural' sciences (i.e the STEM disciplines minus life sciences) the citation rate is six times greater than in the humanities.
  85. [85]
    The impact of the 'publish or perish' culture on research practices ...
    This case study explores how the pervasive 'publish or perish' culture shapes research practices and academic life in Kazakhstan. It draws on semi-structured ...
  86. [86]
    Global movement to reform researcher assessment gains traction
    Oct 1, 2023 · The motivation to reform research assessment stems largely from frustration with the publish-or-perish culture that has developed in recent ...
  87. [87]
    The End of Publish or Perish? China's New Policy on Research ...
    Nov 19, 2020 · But until February of this year, no standardized guidelines or instructions for practicality existed, and the publish-or-perish culture is now ...
  88. [88]
    Guest Post - How China's New Policy May Change Researchers ...
    Mar 3, 2020 · For senior and tenured professors, the implementation of new policies will free them from the publish-or-perish dilemma. Since quality ...
  89. [89]
    Responding to the new research assessment reform in China
    Li, S.Q. (2020). The end of publish or perish? China's new policy on research evaluation. Observations, 1–4. https://doi.org/10.17617/2.3263127.
  90. [90]
    (PDF) Increased Publication in Predatory Journals by Developing ...
    Feb 6, 2017 · An increase in publishing with such journals, which is common in developing counties, will affect the quality of science, excellence, ...
  91. [91]
    Top 20 countries with the highest contribution in predatory journals
    Figure 2 shows the top 20 countries in terms of publication in predatory journals. India tops the list for papers published in predatory journals.
  92. [92]
    China Leads the World in Retracted Science Papers - Evolution News
    Feb 27, 2025 · The hospital announced that it had disciplined some 35 researchers who had been linked to fraud in publications, such as fabricating data.
  93. [93]
    Analyzing Retraction Patterns by Country - PubMed
    Jan 14, 2025 · The findings emphasize the need for improved research integrity measures. Keywords: affiliation; country; ethical standards; ethics; fraud; ...
  94. [94]
  95. [95]
    Radical reform and collective action needed to secure future of ...
    Oct 16, 2025 · Radical reform and collective action needed to secure future of academic publishing. 16 October 2025.
  96. [96]
    Geographical Disparities in Research Misconduct: Analyzing ...
    Jan 14, 2025 · fraud; integrity; misconduct; plagiarism; publication; research ... Additionally, differences between SJR (Scopus-based) publication data ...
  97. [97]
    Read the Declaration | DORA
    The Declaration on Research Assessment (DORA) recognizes the need to improve the ways in which the outputs of scholarly research are evaluated.
  98. [98]
    San Francisco Declaration on Research Assessment (DORA)
    The Declaration on Research Assessment (DORA) recognizes the need to improve the ways in which researchers and the outputs of scholarly research are evaluated.
  99. [99]
    The San Francisco Declaration on Research Assessment - PMC - NIH
    It is The COB's hope that this initiative will help to ensure that research assessment remains informed and fair. San Francisco DORA recommendations. General ...
  100. [100]
    The Dark World of 'Citation Cartels'
    Mar 6, 2024 · That tight collaboration suggests a closely knit network of authors with potential biases in the citation patterns, casting doubt on the actual ...
  101. [101]
    Narrative CVs: How do they change evaluation practices in peer ...
    Nov 5, 2024 · Narrative CVs are designed to encourage them to consider the achievements and competence of a candidate in suitable detail and in the context of their proposed ...
  102. [102]
    Are Narrative CVs contributing towards shifting research culture ...
    Apr 23, 2024 · Narrative CV formats have emerged as a good practice example for enabling qualitative assessments of research projects and researchers, and are ...
  103. [103]
    Responsible Metrics in the Assessment of Research
    Assessment of individual researchers will be based on a qualitative judgement of their portfolio, including: their outputs, impact and wider contribution to ...
  104. [104]
    [PDF] Perspective on Broader Impacts - National Science Foundation
    He noted that the America COMPETES Reauthorization Act of 2010 mandates NSF to have a broader impacts criterion. However, the understanding of and guidance on ...
  105. [105]
    NSF Clarifies Its Broader Impacts Grant Requirement - C&EN
    Dec 10, 2012 · The National Science Foundation's controversial broader impacts criterion is getting an overhaul. The action comes after years of confusion ...
  106. [106]
    From intent to impact—The decline of broader impacts throughout ...
    Jan 5, 2023 · One of the law's provisions required that the NSF continue applying the broader impact criterion to increase economic competitiveness, create a globally ...
  107. [107]
    Unveiling the ethical void: Bias in reference citations and its ... - NIH
    Jul 26, 2024 · Citation bias receives scant attention in discussions of ethics. However, inaccurate citation may lead to significant distortions in scientific understanding.
  108. [108]
    Challenging 'publish or perish' culture—researchers call ... - Phys.org
    Apr 16, 2025 · A new study published in Proceedings of the National Academy of Sciences argues that current incentives in academic publishing can hinder scientific progress ...
  109. [109]
    Preprints become papers less often when the authors are from lower ...
    Jul 3, 2023 · Data suggest that a lack of resources is making it difficult for researchers in low-income countries to turn preprints into peer-reviewed papers.
  110. [110]
    Publish without bias or perish without replications - ScienceDirect
    Here, we propose a novel mechanism by dint of which reducing publication bias can benefit science regardless of the effect that publication bias has on the ...
  111. [111]
    The “You're only allowed to publish 2 or 3 journal articles per year ...
    Sep 20, 2022 · He wants researchers to be banned from producing more than two or three papers per year, to ensure the focus remained on quality rather than quantity.
  112. [112]
    From Broken Science to Slow Science - SCI•FOUNDRY
    Nov 27, 2024 · Slow Science is a movement to take time, focusing on quality and impact, shifting the culture to reward such work, and embracing a commitment ...
  113. [113]
    IU researchers co-author study challenging 'publish or perish ...
    Apr 15, 2025 · A new study published in Proceedings of the National Academy of Sciences argues that current incentives in academic publishing can hinder scientific progress ...
  114. [114]
    Watch Publish or Perish (2023) - Free Movies - Tubi
    Jan 31, 2024 · Publish or Perish ... A professor obsessed with getting tenure accidentally kills a student. He covers it up in a panic, causing his life to ...
  115. [115]
    "The Paper Chase" Judgement Day (TV Episode 1984) - IMDb
    Rating 7.8/10 (16) The Paper Chase. S2.E15. As the expression goes, it's 'publish or perish'. In other words, unless a ...
  116. [116]
    The Paper Chase (1973) - IMDb
    Rating 7.2/10 (9,207) The Paper Chase shows the difficulty of a first year law student. The endless studying sessions are followed by frustrating classroom encounters. The point of ...
  117. [117]
    Film Festival Success PUBLISH OR PERISH Releases Nationwide ...
    Jul 25, 2023 · Film Festival Success PUBLISH OR PERISH Releases Nationwide: A Discussion with Writer-Director David Liban, and Producer Jonathan Miller.
  118. [118]
    How has “publish or perish” become “publish and ... - QS Newsletters
    Sep 10, 2025 · Academia's "publish or perish" culture is now "publish AND perish," leading to widespread work stress, mental health problems, and burnout among ...
  119. [119]
    Pressured to perform: The negative consequences of the 'publish or ...
    Aug 29, 2023 · ... publish and perish, publish then perish, and now retract and perish cultures in academia.
  120. [120]
    Publish-or-Perish Must Perish - City Journal
    Jan 30, 2024 · It's just as important to end the academic imperative of “publish or perish.” This isn't a new suggestion; people have been criticizing the ...
  121. [121]
    'Wasted' research and lost citations: A scientometric assessment of ...
    Aug 23, 2025 · This study presents a large-scale scientometric analysis of 35,514 retracted publications indexed in Scopus between 2001 and 2024, ...
  122. [122]
    A systematic review of retractions in biomedical research publications
    Sep 21, 2023 · Retractions in the field of biomedical research have become a growing concern, eroding the trust placed in the scientific integrity of past and ...
  123. [123]
    Retracted Science and the Retraction Index - PMC - NIH
    A COPE survey of Medline retractions from 1988 to 2004 found 40% of retracted articles to be attributed to honest error or nonreplicable findings, 28% to ...
  124. [124]
    ORCID Profile: Angela Bogdanova
    Profile of the AI-based Digital Author Persona created by the Aisentica Research Group, detailing its role as a non-human author in scholarly communication.